I’m going to sound a little pissy here but I think most of what’s happening is that console hardware was so limited for such a long time that PC gamers got used to being able to max out their settings and still get 300 FPS.
Now that consoles have caught up and cranking the settings actually lowers your FPS like it used to, people are shitting themselves.
If you don’t believe me then look at these benchmarks from 2013:
https://pcper.com/2013/02/nvidia-geforce-gtx-titan-performance-review-and-frame-rating-update/3/
https://www.pugetsystems.com/labs/articles/review-nvidia-geforce-gtx-titan-6gb-185/
Look at how spiky the frame time graph was for Battlefield 3. Look at how, even with triple SLI Titans, you couldn’t hit a consistent 60 FPS in maxed-out Hitman: Absolution.
And yeah, I know high end graphics cards are even more expensive now than the Titan was in 2013 (due to the ongoing parade of BS that’s been keeping GPU prices high), but the systems in those reviews are close to the highest end hardware you could get back then. Even if you were a billionaire you weren’t going to be running Hitman much faster (you could put one more Titan in SLI, which had massively diminishing returns, and you could overclock everything maybe).
If you want to prioritize high and consistent framerate over visual fidelity / the latest rendering tech / giant map sizes then that’s fine, but don’t act like everything was great until a bunch of idiots got together and built UE5.
EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.
I’ve seen a lot of talented devs explain that UE5 does give developers the tools to pre-cache shaders, but since AAA studios rush everything, it ends up being low priority compared to maximizing the graphics. It’s not hard to believe, considering games are pushed out the door with game-breaking bugs nowadays.
But it does raise the question of why the engine doesn’t do that itself. UE4 games ran like a dream, but this generation has felt like nothing but stuttering and 20 minutes of compiling shaders every time you open a game for the first time…
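For what it’s worth, UE5’s shader pre-caching (the PSO cache) is largely driven by project config. A minimal sketch of the relevant settings, based on Unreal’s PSO-caching docs — the exact values here are illustrative, not a recommendation:

```ini
; DefaultEngine.ini -- enable the shader pipeline cache so recorded PSOs
; can be precompiled at startup/loading instead of mid-gameplay
[SystemSettings]
r.ShaderPipelineCache.Enabled=1
; how many PSOs to compile per batch while precompiling (tune per title)
r.ShaderPipelineCache.BatchSize=50

; needed so cooked builds emit stable shader keys for the bundled cache
[DevOptions.Shaders]
NeedsShaderStableKeys=true
```

The catch is that shipping a useful cache still requires someone to actually play through content and record PSOs before release — which is exactly the kind of step that gets cut when a project is rushed.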
20 minutes of compiling shaders every time you open a game for the first time…
Shiiit, Stalker 2 be compiling shaders every time I launch it!
I don’t agree with this at all. I’m sure there are projects where it wasn’t a great choice, but I’ve had no consistent problems with UE5 games, and in several cases the games look and feel better after switching – Satisfactory is a great example.
Dead by Daylight switched to UE5 and immediately had noticeably worse performance.
Silent Hill 2 Remake was made in UE5 and also suffers from performance stutters. Though Bloober is famously bad at optimization, so it’s possible it’s just Bloober being Bloober.
STALKER 2 is showing some questionable performance issues for even high end PCs, and that is also made in UE5.
Now, just because the common denominator in all these examples is UE5 doesn’t mean that UE5 is the cause, but it is certainly quite the coincidence.
It’s the responsibility of the game developer to ensure their game performs well, regardless of engine choice. If they release a UE5 game that suffers from poor performance, that just means they needed to spend more time profiling and optimising their game. UE5 provides a mountain of tooling for this, and developers are free to make engine-side changes, since the full engine source is available.
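As a concrete example of that tooling: UE ships with in-game stat commands that developers (and curious players, in builds that leave the console enabled) can use to see where frame time actually goes. These are standard engine commands, not game-specific:

```
stat fps    ; framerate and frame-time overlay
stat unit   ; time spent on the game thread, render thread, and GPU
stat gpu    ; per-pass GPU timings (shadows, post-processing, ...)
```

If `stat unit` shows the game thread dominating the frame, the game is CPU-bound and turning graphics settings down won’t help much.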
Of course Epic should be doing what they can to ensure their engine is performant out of the box, but they also need to keep pushing technology forward, which means things may run slower on older hardware. They don’t define a game’s minspec hardware, the developer does.
Subnautica 2 is going to be UE5 also, I’m already worried about it.
I think the main problem is how the industry became a crunching machine. Unreal has been sold as a one-size-fits-all solution, whereas obviously there are things it does well and others it doesn’t.
Hugely disappointed in Stalker 2…
But after that article I’ll give it another shot sooner than I was going to. I never thought that horrible performance could have been shaders loading in the background.
If that’s what was going on, then they really need to make that more obvious, or lock people in a sort of training area until it’s done and then start the actual game.
A couple weeks and it’ll probably be a lot better.
But initial thoughts before the article, I think the mistake was watching huge budget games designed from the ground up to be a showcase for the engine, and assuming that would be what any third party studio could crank out.
UE5 has amazing potential, but it still needs good code running on good hardware to get Senua-level results.
Wasn’t this the game developed under siege for a while, then the studio fled to set up in another country?
That’s what I mean.
Everybody had unrealistic expectations, myself included.
My PC isn’t a slouch, but everybody who got early play has top of the line shit and there’s a large discrepancy in PC hardware these days.
Apparently it’s not shaders, but I had to check what resolution it was at, thinking it was defaulting to 720p or something. With everything cranked at 4K and only the usual performance hogs turned down from the highest settings, it looked bad. 1080p with everything down still had stuttering tho.
I didn’t put much effort in and my experience was launch day.
So people should definitely try it for themselves if they have it for free through Xbox for PC…
I just expected it to be amazing on boot when I shouldn’t have.
I ran it on default recommended settings (High, 3440x1440) and it’s smoother than any of the originals were, even after they had years of patches. I experience some mild stuttering when I approach a hub area with lots of NPCs but it’s not terrible. I can’t really complain. 3090 and 5800X3D.
Pretty fun for me so far. There’s some weirdness with dudes spawning too close and A-Life AI seems to be missing but I’m enjoying the zone so far after 15 hours or so.
3090 and 5800X3D.
Yeah. I’m 4070 super and 7800x3d
Like I said, I went in expecting it to look like Senua’s Saga 2 on boot. And there was just no reason for me to have done that.
I’ll give it a month or so and then mess with settings/drivers/etc and it’ll probably be fine. It’s just even when I tried turning stuff down I was having issues, but I haven’t put a lot of effort into getting it right.
Just because the engine is capable of crazy stuff, doesn’t mean every game will push it to its full potential, and that’s fine. That’s how engines last for a long time and that’s good for all of us in the long run.
They’re definitely not pushing the engine to its limits and it’s a shame. No Ray Reconstruction for example and no hardware ray tracing. I was wondering why shadows and reflections lacked clarity at first. This is apparently why.
It’s a weird one though because despite all the flaws I can’t stop playing the game. Maybe I just love STALKER that much. I also have a bunch of mods installed, granted.
The game does not suffer shader compilation stutters. Rather, it’s heavily CPU limited for whatever reason.
I’ve seen a lot of complaints online about the game’s performance. But my ‘okay’ computer is handling the game at max settings just fine. I’m kinda confused. Is it because I’m using Linux?
But my ‘okay’ computer is handling the game at max settings just fine.
Yeah, that’s the issue.
Your comp is running maxed settings at what you consider a serviceable framerate, while you admit your PC is just “okay”.
Everyone with a better comp than you is also running at max settings, seeing the same graphics you are, at probably close to the same average frames and dips. But we’re used to better graphics at higher frame rates with zero stutter/dips.
I’ve talked about this issue in the past, and it’s hard to explain. But a properly optimized game shouldn’t really run with everything maxed out at release on anything but the very top hardware.
What’s currently the max setting should be “medium”, because lots of people can handle it.
Your experience wouldn’t change at all, there’d just be the higher graphical settings available for people who could run them.
Think of it like buying the game on PS5 Pro, and then finding out that it plays exactly the same on the PS4. It’s not that you’d be mad that the PS4 people get a playable version, it’s that you don’t understand why that’s comparable to the newest gen console version. And compared to games that use your PS5 pro’s full power, it’s going to seem bad.
People (myself included) just assumed since it was UE5, they’d be at least giving us the options that UE5 was updated to support.
It seems they did it for future proofing the game, which 100% makes sense. Hopefully they add that stuff in with updates later.
Like, it doesn’t support hardware ray tracing…
And it doesn’t have non-ray-based lighting either. It forces everything to software ray tracing, which is a huge performance hit for people whose hardware can do ray tracing, but is completely unnoticeable to people whose hardware can’t. They may even see better graphics than in a game that uses traditional lighting.
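To make the software/hardware distinction concrete: Lumen’s ray tracing backend is switchable through renderer settings, so offering hardware RT is, at least nominally, a config and content decision on the developer’s side. A sketch of the relevant settings, assuming a standard UE5 setup — a shipped game can still have these locked off:

```ini
; DefaultEngine.ini -- illustrative renderer settings, not Stalker 2's actual config
[/Script/Engine.RendererSettings]
; ray tracing support must be enabled at startup to be usable at all
r.RayTracing=True
; 0 = software Lumen (distance fields), 1 = use RT hardware where available
r.Lumen.HardwareRayTracing=1
```

Software Lumen traces against simplified distance-field proxies, which is why shadows and reflections can look soft compared to the hardware path.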
Like, I’m just a hobbyist nerd; I don’t really know all the ins and outs of what’s going on with Stalker 2. But it seems like this is just a game that caters to the average PC gamer, to the point that everyone with an above-average PC wasn’t even an afterthought.
I’m sure there are going to be a lot of people who know more than me looking a lot closer at why the reaction to this game has been so varied.
I’m using Linux with a Radeon 7900 XTX and I can’t get over 120 fps
People are finally catching on.