A little googling shows the S80 has (had?) really bad drivers that caused it to be outmatched by much weaker GPUs, so the 120% number isn’t as crazy as you might first expect.
Hopefully we get more competitive GPUs in the future. On paper that’s already a pretty decent card (16 GB too!).
Now if only they could compete with Nvidia on a performance level, it might give Nvidia the kick in the ass it needs to not be so anti-consumer.
I think NVidia is already getting a kick in the ass.
The first GPU I bought was a GTX 1060 with 6GB. A legendary card I kept using until November of last year.
What did I upgrade to?
Why, Intel of course. The A770 is cheaper than an AMD card in the same performance range, and has a weird quirk where it actually does better at 1440p than similar cards. Very likely thanks to the spacious VRAM, which is also nice to have for the 3D work I do.
I didn’t upgrade past the 1060 earlier because the 20 series wasn’t a big enough leap, and the 30 series is where a lot of Nvidia’s bullshit started.
And for the industrial market, price per performance is all that matters, because in large deployments there’s no issue with parallelizing as many GPUs as you want. Even if an Intel GPU at a tenth of the price has only a fifth of the performance, you can just slap together five of them and get the same processing power for half the price.
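To make that arithmetic concrete, here’s a rough sketch. The price and performance ratios are just the illustrative numbers from the comment above (a tenth of the price, a fifth of the performance), not real benchmark data:

```python
# Rough price-per-performance scaling sketch using the illustrative
# ratios from the comment (1/10 the price, 1/5 the performance).
nvidia_price, nvidia_perf = 1.0, 1.0   # normalized baseline card
cheap_price, cheap_perf = 0.1, 0.2     # 1/10 the price, 1/5 the performance

cards_needed = nvidia_perf / cheap_perf      # 5 cards to match the baseline
total_cost = cards_needed * cheap_price      # 0.5x the baseline price

print(f"{cards_needed:.0f} cards match the baseline at {total_cost:.0%} of the cost")
# -> 5 cards match the baseline at 50% of the cost
```

This ignores the extra power, rack space, and interconnect overhead of running five cards instead of one, but it shows why raw dollars per unit of throughput is the number that matters at scale.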
Both Intel and AMD are trying to eat into Nvidia’s market share, and are arguably failing at that currently. Even though both AMD and Intel have cards that beat Nvidia’s in specific cases, Nvidia keeps its market share, most likely due in part to CUDA and DLSS being locked to its hardware.
Can someone tell me about these cards? I’ve literally never heard of them before now. Obviously they’re not big performers, but what are they like?
I don’t normally condone watching Linus Tech Tips, but they have a good video on the Moore Threads GPUs: https://www.youtube.com/watch?v=YGhfy3om9Ok
Also, GN is a better source: https://www.youtube.com/watch?v=qPMptuEokPQ
They’re basically home-grown Chinese silicon with pretty good-looking raw performance specs at the cost of high power consumption, but the most atrocious hacked-together drivers, which made them effectively useless. They’re definitely getting better as the company learns how to write driver code for most games.
More competition is good news, and from what I’ve seen, these GPUs offer good value for the price. NVidia has been too stingy with VRAM, and I hope other companies finally begin to step up.