A user on Reddit (ed: u/Yoraxx) posted on the Starfield subreddit about a problem when running the game on an AMD Radeon GPU. The issue is simple: the game just won't render the star in any solar system when you are on the dayside of a moon or any other planetary body. The issue only occurs on AMD Radeon GPUs, with owners of both Radeon RX 7000 and RX 6000 series cards reporting the same thing.
To be clear: the dayside of any planetary body or moon needs a light source to be lit at all. That source is the system's star, and on any non-AMD GPU you will see the star/sun in the skybox illuminating the surface. With AMD cards, the star/sun simply isn't there, yet the planet/moon remains lit with no visible light source in the sky.
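If you're wondering how a surface can stay lit while the star itself vanishes: in most engines the directional light that shades the terrain and the visible star sprite drawn into the skybox are two separate things. The sketch below is purely illustrative, not Starfield's or the Creation Engine's actual code, and names like `DirectionalLight` and `draw_star_sprite` are made up; it just shows how the lighting pass can keep working while the star draw silently fails on one vendor's hardware.

```cpp
// Hypothetical, simplified frame logic -- NOT actual Starfield/Creation Engine code.
// It only illustrates that surface lighting and the visible star are separate draws,
// so one can fail (e.g. via a vendor-specific rendering bug) while the other still works.
#include <cstdio>

struct Vec3 { float x, y, z; };

struct DirectionalLight {        // drives the shading of terrain and objects
    Vec3 direction;
    Vec3 color;
};

// Assumed helper: shades the planet surface using only the light's direction/color.
void shade_surface(const DirectionalLight& sun) {
    std::printf("Surface lit from direction (%.1f, %.1f, %.1f)\n",
                sun.direction.x, sun.direction.y, sun.direction.z);
}

// Assumed helper: draws the visible star billboard into the skybox.
// Returns false if the draw is skipped or misbehaves on a given GPU/driver.
bool draw_star_sprite(bool vendor_bug_triggered) {
    if (vendor_bug_triggered) {
        return false;            // star never appears in the sky
    }
    std::printf("Star sprite drawn in skybox\n");
    return true;
}

int main() {
    DirectionalLight sun{{0.0f, -1.0f, 0.3f}, {1.0f, 0.95f, 0.9f}};

    shade_surface(sun);                                // lighting pass still runs fine
    if (!draw_star_sprite(/*vendor_bug_triggered=*/true)) {
        std::printf("Dayside stays bright, but no sun is visible\n");
    }
    return 0;
}
```

In other words, the bug users are describing is consistent with the star's visible draw being broken on Radeon cards while the directional lighting it represents is unaffected, which is why the ground stays bright under an apparently empty sky.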
I have a suspicion that developers do less testing, optimization, and bug-fixing for AMD cards due to their smaller market share, and that's why more of these brand-specific coding errors slip through. It's unfortunate, but I can't deny I've seen some weird bugs in my time.
How can an AMD-sponsored game that literally runs better on AMD GPUs than on their NVIDIA counterparts, and that doesn't include any tech that might disadvantage AMD GPUs, be less QA-tested on AMD hardware because of market share?
This game IS better optimized for AMD. It has FSR2 enabled by default on all graphics presets. That particular take especially doesn't work for this game.
Some games are built specifically for AMD from the ground up and have been optimized like crazy. It mostly depends on the game and the devs. And let's not forget that if devs want it to run well on PS5 and Xbox Series X/S, they'd better have good AMD optimization.
Oh, of course. I don't actually blame AMD for those kinds of bugs. But that's the reality as a user, at least in my experience… though it's been a stupidly long time since I've used a machine with an AMD card.
And, being Bethesda, it’s not like bugs are unexpected.