• uzi · ↑11 ↓4 · 2 months ago

    Game developers should never rely on upscaling. Develop everything at native 4K with 4K ultra textures, and for playing at lower resolutions the game can simply downsample the rendered image. Similar to how 4K recorded video looks better at 1080p than video recorded at 1080p.
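    To make the "remove pixels" idea concrete, here is a minimal sketch (not engine code) of downscaling a 4K frame to 1080p by averaging 2×2 pixel blocks, i.e. plain box-filter supersampling; the NumPy usage, array name, and random stand-in frame are assumptions for illustration only.

```python
# Minimal sketch (assumed NumPy, not real engine code): downscale a 4K frame
# to 1080p by averaging each 2x2 block of pixels (box-filter supersampling).
import numpy as np

def downscale_2x(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 pixel block into one output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)   # stand-in for a rendered 4K frame
frame_1080p = downscale_2x(frame_4k)
print(frame_1080p.shape)                   # (1080, 1920, 3)
```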

    For framerate and GPU performance, we are now in an era of 4K-capable GPUs, and they only get better every 2 years. By 2026/2027 people will NOT need the top GPU for native 4K 120fps, which means 4K gaming effectively keeps getting cheaper: not because prices drop, but because you can start pairing a lower-tier GPU with a 4K screen.

    Upscaling is not available in older games, some of which remain more popular favourites than brand-new releases, and it is inconsistent across the games that do have it.

    The smoothest, most consistent performance across a random 20 to 30 games will always come from playing at native resolution with no frame generation technology.

    • RightHandOfIkaros@lemmy.world · ↑11 ↓1 · edited · 2 months ago

      But if I use upscaling, I don’t have to optimize my game and can do less work and charge more money!

      ~ Average 2024 Game Developer

      • uzi · ↑5 · 2 months ago

        That’s exactly the reason for my post.

    • frankyboi · ↑2 · edited · 2 months ago

      AFAIK we already develop games with maximum-quality assets and scale them down to meet the average gaming machine. Most source textures are at least 16K, with billions of polygons in a single character. The upscaling tech runs on the gamer’s machine, not on the developer’s machine.

      The game engine already dynamically optimizes meshes and textures by scaling down texture resolution and loading lower-polygon versions of assets. But even that is not enough of a scale-down for most gamer machines.
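      As a rough illustration of that automatic scale-down, the sketch below picks a texture mip level and a mesh LOD from on-screen size and distance; the function names, thresholds, and LOD cutoffs are hypothetical, not any particular engine’s API.

```python
# Hypothetical sketch of the dynamic scale-down described above: pick a
# texture mip level and a mesh LOD from on-screen coverage and distance.
# Real engines use screen-space derivatives and per-asset settings; the
# names and numbers here are made up for illustration.
import math

def pick_mip_level(texture_size_px: int, screen_coverage_px: int) -> int:
    """Each mip halves the texture; stop once it roughly matches on-screen size."""
    if screen_coverage_px <= 0:
        return int(math.log2(texture_size_px))   # off-screen: use the smallest mip
    return max(0, int(math.log2(texture_size_px / screen_coverage_px)))

def pick_mesh_lod(distance_m: float, cutoffs=(10.0, 30.0, 80.0)) -> int:
    """LOD 0 is the full-detail mesh; higher LODs are progressively decimated."""
    for lod, cutoff in enumerate(cutoffs):
        if distance_m < cutoff:
            return lod
    return len(cutoffs)

print(pick_mip_level(16384, 512))   # a "16K" texture covering ~512 px -> mip 5
print(pick_mesh_lod(45.0))          # mid-distance object -> LOD 2
```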

      Frame generation is not a problem on the gamer’s side. It’s all perception.

  • givesomefucks@lemmy.world · ↑5 ↓2 · 2 months ago

    Frame generation is fucking huge.

    Especially since it works best at high frame rates. Like, if you were playing at 30fps, doubling to 60 might be a perceptible difference because of how long the gap between frames is.

    But going from 60 to 120, it’s still 50% “fake” frames, but the time between “real” frames is much smaller, allowing for more frequent corrections to what the “fake” frames are predicting.
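    The back-of-envelope numbers below show the timing behind that point: with 2x generation, each generated frame has to bridge the full gap between two rendered frames, and that gap is about 33.3 ms at a 30fps base but only 16.7 ms at 60fps. The helper function is just arithmetic for illustration, not any real frame-generation API.

```python
# Back-of-envelope timing for 2x frame generation (plain arithmetic, no real
# frame-gen API involved): the gap a generated frame must bridge shrinks as
# the base frame rate rises, so prediction errors get corrected more often.
def frame_gen_timing(base_fps: float, multiplier: int = 2):
    real_gap_ms = 1000.0 / base_fps              # time between rendered frames
    shown_gap_ms = real_gap_ms / multiplier      # time between displayed frames
    return real_gap_ms, shown_gap_ms

for base in (30, 60):
    real_gap, shown_gap = frame_gen_timing(base)
    print(f"{base}fps -> {base * 2}fps: generated frames bridge a "
          f"{real_gap:.1f} ms gap, one frame shown every {shown_gap:.1f} ms")
# 30fps base: each generated frame bridges 33.3 ms between real frames.
# 60fps base: only 16.7 ms, so the "fake" frames get corrected twice as often.
```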

    So while it won’t help a bad computer run anything, it can help a “mid” computer make what it can run look a lot better, because you can crank up a bunch of options and still maintain the fps you were getting without it.

    • frankyboi · ↑1 · 2 months ago

      I play Cyberpunk 2077 at 130 fps on Overdrive at 1440p and it’s awesome and smooth with an RTX 4070. I play on a Hisense 55" TV at 11 feet on my sofa and I don’t notice any artifacts. It’s just way better than classic SSAO and the other global illumination “hacks” that raster uses.