Just waiting for it to arrive on linux…
I couldn’t care less about gaming FPS performance; I’d much rather AMD cards had even a fraction of the support for open source AI tools that Nvidia currently has.
And that’s fine but this is
c/pcgaming
🙂 True, but it’s also a thread about Linux users. Seldom does a Linux user solely use their computer for gaming, though gaming may be the primary driver for a video card upgrade. Not being able to fully utilize emergent technologies, especially with the prices of today’s GPUs, is a huge downside to purchasing an AMD card.
Fair enough!
That said, being able to use it without any driver shenanigans if you want it for gaming is a huge upside to purchasing AMD.
being able to use it without any driver shenanigans
That’s simply not true, though I keep seeing it repeated.
I ran a 3090 on arch, and I never had any issues with drivers. Admittedly, I didn’t mess around with Wayland, but that seems like more of a Wayland problem. I have a 7900XTX on arch now, and I swear it’s been far more of a hassle than the 3090 ever was. Both cards seem to work fine with Steam/Proton, though the AMD needed a lot of tweaking to fix various Xwindows issues, not to mention 80% of the AI tools simply won’t run on it.
That’s simply not true, though I keep seeing it repeated.
That is factually true, unless you’re happy with the performance of Nouveau or whatever the OSS Nvidia driver is called.
To get an Nvidia card working you need a proprietary driver. Many distros can fetch it for you automatically during install, but that still doesn’t mean it won’t cause issues. Every time there’s a kernel update, DKMS needs to kick in and recompile the driver for the new kernel. If that fails for any reason, you might end up without a GUI on your next boot. To fix that, assuming it’s even possible, you need some level of savvy. Then there’s the spotty Wayland support, which is a problem for many, especially with Plasma 6 on the horizon and defaulting to Wayland.
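For the record, “some level of savvy” means something like the following. This is only a sketch; the module name and driver version here are examples, and the exact ones depend on your setup:

    dkms status                                        # which modules are built for which kernels
    sudo dkms autoinstall                              # retry building everything for the running kernel
    sudo dkms install nvidia/535.98 -k "$(uname -r)"   # or rebuild one specific module/version by hand

And you’re likely typing that from a TTY, because the failure mode is “no GUI”.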
I ran a 3090 on arch, and I never had any issues with drivers
Great! I’m glad your setup works flawlessly for you. Saying it always will for everybody is disingenuous at best though.
Admittedly, I didn’t mess around with Wayland, but that seems like more of a Wayland problem.
There you go. And no, it’s not a Wayland problem. It’s a lack of Wayland compatibility in the drivers. Fixing that is completely outside of the scope of Wayland.
I have a 7900XTX on arch now, and I swear it’s been far more of a hassle than the 3090 ever was.
I have a 7900XTX on Debian now, and zero hassle. I don’t know what Arch has done to make your life difficult but it shouldn’t have been the case. I have also run Nobara, Fedora and Bazzite on this hardware without a single issue. Well, I did have issues with Bazzite but none related to the GPU.
For my current Debian setup, the only GPU-related “tweaking” I had to do was to copy firmware blobs from the upstream kernel tree because some of them had not yet been packaged for the distro. This is a one-time operation though, and well documented.
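From memory it went roughly like this; the exact files depend on the card, so treat it as a sketch rather than a recipe (the blobs live in the upstream linux-firmware tree):

    git clone https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git
    sudo cp linux-firmware/amdgpu/*.bin /lib/firmware/amdgpu/
    sudo update-initramfs -u    # Debian: regenerate the initramfs so the new blobs get picked up

One reboot later and the new firmware is in use.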
the AMD needed a lot of tweaking to fix various Xwindows issues
Granted, I’ve been mostly on Wayland and KDE/Plasma but I have dipped into X every now and then for performance comparison. Never had a single driver issue.
Not saying your experience is invalid in any way but it shouldn’t really be the case. It might be some quirk of Arch.
not to mention 80% of the AI tools simply won’t run on it
Yeah, we established AI is a problem. It’s irrelevant to gaming though, which is what we’re discussing. I wonder if whatever tweaks and hacks you had to do to your AMD Radeon system to use AI on it might be part of the cause of your problems. No idea, honestly - but again, irrelevant.
Every time there’s a kernel update, DKMS needs to kick in and recompile the driver for the new kernel. If that fails for any reason, you might end up without a GUI on your next boot.
Ah yes, the FUD bogeyman.
DKMS has been around for over 20 years. “But it might fail! I have anecdotal evidence!” I mean, if that’s the case, we should probably stop using cell phones and laptops because batteries are known to swell and explode, right?
[2023-08-09T17:44:06-0600] [ALPM] upgraded nvidia (535.86.05-8 -> 535.98-1)
[2023-08-12T11:34:32-0600] [ALPM] upgraded nvidia (535.98-1 -> 535.98-2)
[2023-08-19T04:37:07-0600] [ALPM] upgraded nvidia (535.98-2 -> 535.98-3)
[2023-08-19T17:05:46-0600] [ALPM] upgraded nvidia (535.98-3 -> 535.98-4)
[2023-08-23T23:43:09-0600] [ALPM] upgraded nvidia (535.98-4 -> 535.104.05-1)
[2023-08-25T03:23:33-0600] [ALPM] upgraded nvidia (535.104.05-1 -> 535.104.05-2)
[2023-09-10T02:51:42-0600] [ALPM] upgraded nvidia (535.104.05-2 -> 535.104.05-5)
[2023-09-13T07:29:48-0600] [ALPM] upgraded nvidia (535.104.05-5 -> 535.104.05-6)
[2023-09-22T10:21:43-0600] [ALPM] upgraded nvidia (535.104.05-6 -> 535.104.05-7)
[2023-09-25T02:37:05-0600] [ALPM] upgraded nvidia (535.104.05-7 -> 535.113.01-1)
[2023-09-26T19:27:38-0600] [ALPM] upgraded nvidia (535.113.01-1 -> 535.113.01-2)
[2023-10-07T21:30:02-0600] [ALPM] upgraded nvidia (535.113.01-2 -> 535.113.01-4)
[2023-10-12T10:02:53-0600] [ALPM] upgraded nvidia (535.113.01-4 -> 535.113.01-5)
[2023-10-22T23:45:35-0600] [ALPM] upgraded nvidia (535.113.01-5 -> 535.113.01-6)
[2023-10-26T09:21:52-0600] [ALPM] upgraded nvidia (535.113.01-6 -> 535.113.01-8)
[2023-11-05T16:29:19-0700] [ALPM] upgraded nvidia (535.113.01-8 -> 545.29.02-2)
[2023-11-14T00:42:25-0700] [ALPM] upgraded nvidia (545.29.02-2 -> 545.29.02-4)
[2023-11-23T14:14:47-0700] [ALPM] upgraded nvidia (545.29.02-4 -> 545.29.02-5)
[2023-11-24T09:02:20-0700] [ALPM] upgraded nvidia (545.29.02-5 -> 545.29.06-1)
[2023-12-01T07:18:54-0700] [ALPM] upgraded nvidia (545.29.06-1 -> 545.29.06-2)
[2023-12-11T19:54:24-0700] [ALPM] upgraded nvidia (545.29.06-2 -> 545.29.06-5)
[2023-12-20T22:16:01-0700] [ALPM] upgraded nvidia (545.29.06-5 -> 545.29.06-6)
[2023-12-25T07:04:48-0700] [ALPM] upgraded nvidia (545.29.06-6 -> 545.29.06-7)
[2024-01-03T15:14:06-0700] [ALPM] upgraded nvidia (545.29.06-7 -> 545.29.06-8)
“But it’s proprietary!” So you have an issue with using a proprietary driver, but you’re perfectly fine with running Steam and the thousands of proprietary games on Linux?
AMD just works? Well sure, but then you reference some obscure blog post as “well documented”, while at the same time saying a DKMS driver install might fail and might not be possible to resolve, as if booting off a rescue USB and downgrading packages isn’t a well-documented procedure.
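For reference, the whole “recovery” boils down to something like this. Partition layout and names are placeholders, and the package version is just the previous entry from my log above:

    # boot any Arch live USB, then:
    mount /dev/nvme0n1p2 /mnt
    mount /dev/nvme0n1p1 /mnt/boot
    arch-chroot /mnt
    pacman -U /var/cache/pacman/pkg/nvidia-545.29.06-7-x86_64.pkg.tar.zst
    exit
    reboot

Hardly arcane knowledge.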
same!
Copy… like you mean how they followed Nvidia’s already existing tech, as per the article…
Nvidia has had its own frame generation tech in DLSS 3 for over a year now
Also in the article (and in fact, core to the point being made):
There are a couple of catches, though. For one, it only works on Nvidia GeForce RTX 4000-series GPUs, such as the GeForce RTX 4070. Secondly, using it requires a game to support it, or for a modder to unofficially add support to a game.
What’s more, while Nvidia’s DLSS tech is a closed ecosystem that only runs on Nvidia RTX GPUs, AMD has made FSR 3 open source, a strategy that AMD thinks Nvidia will have to emulate at some point.
FSR running on hardware from any manufacturer is a huge boon. DLSS is impressive, but I’m not about to lock myself into a manufacturer ecosystem, and especially not one like nvidia.
FSR has always been open source
Nvidia just refuses to use it
It’s not like they’re blocking their cards from running it.
They’re not replacing DLSS with it because it’s a huge downgrade.
FSR 1: Nvidia would have to make it work.
FSR 2: AMD made it work, but Nvidia would have to allow it.
FSR 3: Nvidia no longer has a choice.
Nvidia doesn’t care.
They’re just not wasting resources working on it because it’s not remotely comparable to DLSS. The quality isn’t comparable, and neither is the resource use (the entire point of DLSS is that Nvidia added dedicated hardware to do a far better, far more efficient job of it). Why would they go back and add something that’s just doing a worse job copying their tech?
Since it is open source, they could push their optimizations upstream, and having both companies work on a uniform solution would be better for everyone
(I guess not better for Nvidia’s monopoly since they have worse cards)
No, they could not. AMD cards don’t have any of the hardware to execute the same or similar operations. Executing the same code without tensor cores to accelerate them would tank performance, which is the entire reason you get less performance gain for far worse image quality with FSR in the first place.
Literally the only thing AMD’s hardware is competitive at is raw traditional raster performance, because Nvidia has better designs that leverage hardware features to accelerate portions of the ray tracing and upscaling workloads much more efficiently.
AMD is trying to copy hardware features with software, and it’s not comparable.