• drosophila@lemmy.blahaj.zone

    I’m going to sound a little pissy here but I think most of what’s happening is that console hardware was so limited for such a long time that PC gamers got used to being able to max out their settings and still get 300 FPS.

    Now that consoles have caught up and cranking the settings actually lowers your FPS like it used to, people are shitting themselves.

    If you don’t believe me then look at these benchmarks from 2013:

    https://pcper.com/2013/02/nvidia-geforce-gtx-titan-performance-review-and-frame-rating-update/3/

    https://www.pugetsystems.com/labs/articles/review-nvidia-geforce-gtx-titan-6gb-185/

    Look at how spiky the frame time graph was for Battlefield 3. Look at how, even with triple SLI Titans, you couldn’t hit a consistent 60 FPS in maxed-out Hitman: Absolution.

    And yeah, I know high-end graphics cards are even more expensive now than the Titan was in 2013 (due to the ongoing parade of BS that’s been keeping GPU prices high), but the systems in those reviews are close to the highest-end hardware you could get back then. Even if you were a billionaire you weren’t going to be running Hitman much faster (you could put one more Titan in SLI, which had massively diminishing returns, and maybe overclock everything).

    If you want to prioritize high and consistent framerate over visual fidelity / the latest rendering tech / giant map sizes then that’s fine, but don’t act like everything was great until a bunch of idiots got together and built UE5.

    EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.

    • WereCat@lemmy.world

      The issue is not that the game’s performance requirements at reasonable graphics settings are absolutely destroying modern hardware. The issue is that once you set the game to low settings it still performs like shit while looking worse than a 10-year-old game.

    • ILikeBoobies

      EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.

      You can preload them if you want, but that leads to load screens. It’s a developer issue, not an Unreal one.

      • The_Decryptor@aussie.zone

        No matter what, you’ve got to compile the shaders, either on launch or when they’re needed. The game should be caching the results of that step though, so the next time they’re needed it can be skipped entirely.
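
        A minimal sketch of that caching step at the Vulkan level, in case it helps (the cache file path and the CreateSeededCache/WarmPipelines helpers are made up for illustration): compile the pipelines you know about up front, then serialize the VkPipelineCache to disk so the next launch can seed it and skip most of the work.

        ```cpp
        #include <vulkan/vulkan.h>
        #include <fstream>
        #include <iterator>
        #include <vector>

        // Hypothetical location; a real game would use a per-user data directory.
        static const char* kCachePath = "pipeline.cache";

        // Read last run's cache blob from disk (empty on a first launch).
        std::vector<char> LoadCacheBlob() {
            std::ifstream in(kCachePath, std::ios::binary);
            return {std::istreambuf_iterator<char>(in), std::istreambuf_iterator<char>()};
        }

        // Create a pipeline cache seeded with whatever we saved last time.
        VkPipelineCache CreateSeededCache(VkDevice device) {
            std::vector<char> blob = LoadCacheBlob();
            VkPipelineCacheCreateInfo info{VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO};
            info.initialDataSize = blob.size();                  // 0 on first launch
            info.pInitialData    = blob.empty() ? nullptr : blob.data();
            VkPipelineCache cache = VK_NULL_HANDLE;
            vkCreatePipelineCache(device, &info, nullptr, &cache);
            return cache;
        }

        // During a loading screen: build every pipeline the level will use, so the
        // driver's compile work happens here instead of mid-gameplay.
        void WarmPipelines(VkDevice device, VkPipelineCache cache,
                           const std::vector<VkGraphicsPipelineCreateInfo>& infos,
                           std::vector<VkPipeline>& out) {
            if (infos.empty()) return;
            out.resize(infos.size());
            vkCreateGraphicsPipelines(device, cache, (uint32_t)infos.size(),
                                      infos.data(), nullptr, out.data());
        }

        // After warming (or on shutdown): persist the cache so the next launch
        // skips compilation for anything the driver has already seen.
        void SaveCacheBlob(VkDevice device, VkPipelineCache cache) {
            size_t size = 0;
            vkGetPipelineCacheData(device, cache, &size, nullptr);
            std::vector<char> blob(size);
            vkGetPipelineCacheData(device, cache, &size, blob.data());
            std::ofstream(kCachePath, std::ios::binary).write(blob.data(), (std::streamsize)size);
        }
        ```

        Drivers also keep their own on-disk caches behind the scenes, but shipping and seeding one like this is how a game makes “compile once, skip it next time” something it controls rather than hopes for.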

        • ILikeBoobies

          GPUs do cache them.

          That’s why compiling on launch / in loading screens works.

  • verdigris@lemmy.ml

    I don’t agree with this at all. I’m sure there are projects where it wasn’t a great choice, but I’ve had no consistent problems with UE5 games, and in several cases the games look and feel better after switching – Satisfactory is a great example.

    • RightHandOfIkaros@lemmy.world

      Dead by Daylight switched to UE5 and immediately had noticeably bad performance.

      Silent Hill 2 Remake is made in UE5 and also has bad performance and stuttering. Though Bloober is famously bad at optimization, so it’s possible it’s just Bloober being Bloober.

      STALKER 2 is showing some questionable performance issues even for high-end PCs, and that is also made in UE5.

      Now, just because the common denominator in all these examples is UE5 doesn’t mean that UE5 is the cause, but it is certainly quite the coincidence.

      • a1studmuffin@aussie.zone

        It’s the responsibility of the game developer to ensure their game performs well, regardless of engine choice. If they release a UE5 game that suffers from poor performance, that just means they needed to spend more time profiling and optimising their game. UE5 provides a mountain of tooling for this, and developers are free to make engine-side changes since the full engine source is available to them.

        Of course Epic should be doing what they can to ensure their engine is performant out of the box, but they also need to keep pushing technology forward, which means things may run slower on older hardware. They don’t define a game’s minspec hardware, the developer does.

  • simple@lemm.ee

    I’ve seen a lot of talented devs explain that UE5 does give devs the tools to pre-cache shaders, but since AAA studios rush everything, it ends up being low priority compared to maximizing the graphics. It’s not hard to believe, considering games are pushed out the door with game-breaking bugs nowadays.

    But it does raise the question of why the engine doesn’t do that itself. UE4 games ran like a dream, but this generation has felt like nothing but stuttering and 20 minutes of compiling shaders every time you open a game for the first time…
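
    For what it’s worth, the engine-side version of “just do it yourself” isn’t conceptually complicated. Here’s a rough sketch of the idea (all the names are hypothetical stand-ins, not UE’s actual API): record the shader/pipeline permutations the shipped content actually uses during playtests, ship that list, and burn through it on every core behind the first loading screen with a visible progress readout, instead of compiling on first use mid-game.

    ```cpp
    #include <algorithm>
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Hypothetical stand-ins: PsoDesc is one recorded shader/render-state combo,
    // CompilePso() is whatever engine/driver call actually builds it, and
    // LoadRecordedPsoList() returns the list captured during playtests.
    struct PsoDesc { int permutationId = 0; };
    static void CompilePso(const PsoDesc&) { std::this_thread::sleep_for(std::chrono::milliseconds(2)); }
    static std::vector<PsoDesc> LoadRecordedPsoList() { return std::vector<PsoDesc>(512); }

    // Precompile the whole recorded list behind the first loading screen,
    // using every core, and report progress instead of hitching in-game.
    static void PrecompileRecordedPsos() {
        const std::vector<PsoDesc> psos = LoadRecordedPsoList();
        std::atomic<size_t> next{0}, done{0};

        auto worker = [&] {
            for (size_t i = next++; i < psos.size(); i = next++) {
                CompilePso(psos[i]);
                ++done;
            }
        };

        std::vector<std::thread> pool;
        const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
        for (unsigned t = 0; t < threads; ++t)
            pool.emplace_back(worker);

        while (done < psos.size()) {  // drive a "Compiling shaders..." bar
            std::printf("\rCompiling shaders %zu / %zu", done.load(), psos.size());
            std::fflush(stdout);
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
        for (auto& t : pool) t.join();
        std::printf("\rCompiling shaders %zu / %zu\n", psos.size(), psos.size());
    }

    int main() { PrecompileRecordedPsos(); }
    ```

    The trade-off is exactly the one people complain about: you pay for it as a long “compiling shaders” screen on first launch instead of as hitches in-game.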

  • S_H_K@lemmy.dbzer0.com

    I think the main problem is how the industry became a crunch machine. Unreal has been sold as a one-size-fits-all solution, whereas obviously there are things it does well and others it doesn’t.

  • MangoPenguin@lemmy.blahaj.zone

    I rarely have a good time with UE4/UE5 games; performance is often rough, and while on a technical level the graphics are ‘better’, I often don’t think they look as pleasant or feel as immersive as older games.

  • givesomefucks@lemmy.world

    Hugely disappointed in Stalker 2…

    But after that article I’ll give it another shot sooner than I was going to. It never occurred to me that the horrible performance could have been shaders loading in the background.

    If that’s what was going on, then they really need to make that more obvious, or lock people in a sort of training area until it’s done and then start the actual game.

    A couple weeks and it’ll probably be a lot better.

    But my initial thought, before the article: I think the mistake was watching huge-budget games designed from the ground up to be a showcase for the engine and assuming that’s what any third-party studio could crank out.

    UE5 has amazing potential, but it still needs good code running on good hardware to get results like Senua’s Saga: Hellblade II.

    • Comment105@lemm.ee

      Wasn’t this the game developed under siege for a while, then the studio fled to set up in another country?

      • givesomefucks@lemmy.world

        That’s what I mean.

        Everybody had unrealistic expectations, myself included.

        My PC isn’t a slouch, but everybody who got early play has top-of-the-line shit, and there’s a large discrepancy in PC hardware these days.

        Apparently it’s not shaders, but I had to check what resolution it was running at, thinking it had defaulted to 720p or something. With everything cranked at 4K and only the usual performance hogs turned down from the highest settings, it looked bad. 1080p with everything turned down still had stuttering though.

        I didn’t put much effort in, and my experience was from launch day.

        So people should definitely try it for themselves if they have it for free through Xbox for PC…

        I just expected it to be amazing on boot when I shouldn’t have.

        • deranger@sh.itjust.works

          I ran it on default recommended settings (High, 3440x1440) and it’s smoother than any of the originals were, even after they had years of patches. I experience some mild stuttering when I approach a hub area with lots of NPCs but it’s not terrible. I can’t really complain. 3090 and 5800X3D.

          Pretty fun for me so far. There’s some weirdness with dudes spawning too close and A-Life AI seems to be missing but I’m enjoying the zone so far after 15 hours or so.

          • givesomefucks@lemmy.world

            3090 and 5800X3D.

            Yeah. I’m on a 4070 Super and 7800X3D.

            Like I said, I went in expecting it to look like Senua’s Saga: Hellblade II on boot. And there was just no reason for me to have done that.

            I’ll give it a month or so and then mess with settings/drivers/etc., and it’ll probably be fine. It’s just that even when I tried turning stuff down I was having issues, but I haven’t put a lot of effort into getting it right.

            Just because the engine is capable of crazy stuff, doesn’t mean every game will push it to its full potential, and that’s fine. That’s how engines last for a long time and that’s good for all of us in the long run.

            • Coelacanth@feddit.nu

              They’re definitely not pushing the engine to its limits, and it’s a shame. No Ray Reconstruction, for example, and no hardware ray tracing. I was wondering why shadows and reflections lacked clarity at first; this is apparently why.

              It’s a weird one though because despite all the flaws I can’t stop playing the game. Maybe I just love STALKER that much. I also have a bunch of mods installed, granted.

    • Murvel@lemm.ee

      The game does not suffer from shader compilation stutters. Rather, it’s heavily CPU limited for whatever reason.

    • SquishyPandaDev@yiffit.net

      I’ve seen a lot of complaints online about the game’s performance. But my ‘okay’ computer is handling the game at max settings just fine. I’m kind of confused. Is it because I’m using Linux?

      • givesomefucks@lemmy.world

        But my ‘okay’ computer is handling the game at max settings just fine.

        Yeah, that’s the issue.

        Your comp is running maxed settings at what you consider a serviceable framerate, while you admit your PC is just “okay”.

        Everyone with a better comp than you is also running at max settings and seeing the graphics you are, at probably close to the same average framerate and dips. But we’re used to better graphics at higher framerates with zero stutter/dips.

        I’ve talked about this issue in the past, and it’s hard to explain. But a properly optimized game shouldn’t really run with everything maxed out on release on anything but the very top hardware setup.

        What’s currently the max setting should be the “medium” setting, because lots of people can handle it.

        Your experience wouldn’t change at all, there’d just be the higher graphical settings available for people who could run them.

        Think of it like buying the game on a PS5 Pro and then finding out that it plays exactly the same on a PS4. It’s not that you’d be mad that the PS4 people get a playable version; it’s that you don’t understand why it’s comparable to the newest-gen console version. And compared to games that use your PS5 Pro’s full power, it’s going to seem bad.

        People (myself included) just assumed that since it was UE5, they’d at least be giving us the options that UE5 was updated to support.

        It seems they did it to future-proof the game, which 100% makes sense. Hopefully they add that stuff in with updates later.

        Like, it doesn’t support hardware ray tracing…

        And it doesn’t have non-ray-traced lighting either. It forces everything to software ray tracing, which is a huge performance hit for people with hardware that can do hardware ray tracing, but is completely unnoticeable to people with hardware that can’t. They may even see better graphics than in a game that uses traditional lighting.

        Like, I’m just a hobbyist nerd, I don’t really know all the ins and outs of what’s going on with Stalker 2. But it seems like this is just a game that caters to the average PC gamer, to the point that everyone with an above-average PC wasn’t even an afterthought.

        I’m sure there are going to be a lot of people who know more than me looking a lot closer at why the reaction to this game has been so varied.

      • aiden@lemm.ee

        I’m using Linux with a Radeon 7900 XTX and I can’t get over 120 fps

    • Nihilistra@lemmy.world

      5800X3D / 6800 XT / 32 GB, all overclocked/undervolted, playing at 1440p with DLSS and FSR enabled. My settings are a combo of Epic and High.

      I didn’t encounter game-breaking bugs and hover around 130 FPS with dips to 80 in heavy weather. No macro or micro stutters.

      The only complaints I have are the spawning system combined with weak enemy AI, and my expectation that the factions would be fighting each other more, because that’s what I’ve seen and enjoyed in the modded STALKER games before.

      I really like the game as it is; it looks amazing and the atmosphere is top notch.

  • Lucy :3@feddit.org

    Most games made in UE are AAA games, where every A stands for more scam, jankiness and less value overall. Very rushed, no love, made to barely work on “my machine” (a 4090). Many Unity games are smaller cash grabs.

    Most of the devs that fulfill at least one criterion well (e.g. gameplay, performance, stability) are either small studios with their own engine (4A Games, Croteam, Minecraft) or publishers with one banger per 5 years or so: Valve (lost it with CS2 though), Rockstar. Because those devs put love, time or both into their games.

  • MonkderVierte@lemmy.ml

    means you won’t do great, as shaders are being fully reloaded all over again, and all that heavy lifting is being done on the go.

    Luckily, using DXVK, Vulkan caches them beforehand.