- cross-posted to:
- [email protected]
- pcgaming
It sucks because, as consumers, we can collectively only vote with our wallets so much. China spent $1b on Nvidia GPUs for AI development.
That’s roughly 625,000 RTX 4090s.
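(Back-of-the-envelope, assuming the 4090’s roughly $1,600 launch MSRP: $1,000,000,000 / $1,600 ≈ 625,000 cards.)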
I’m skeptical. There’s a lot of potential for market segmentation, and GPU vendors have an interest in doing so.
Yes, some AI companies will use consumer GPUs if they’re cheap enough. But GPU vendors can restrict the amount of memory they put on consumer cards, and a lot of AI applications have very heavy VRAM requirements. I’m also pretty confident that the GPU companies can intentionally find and leave out functionality on consumer cards that would be useful for AI. They have a long history of finding features to segment the pro CAD market (less price-sensitive) from the 3D gamer market. I remember discovering years back, with some disappointment, that antialiased lines were one of those features: viewing wireframe models is important for CAD work, so the vendors treated line antialiasing as a pro feature and intentionally supported it poorly on the gaming side. (I liked the aesthetic of the few antialiased-line games out there and was a bit disappointed that this was where the divider went.)
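For the curious, the feature in question is legacy OpenGL line smoothing. Here’s a minimal sketch, assuming PyOpenGL and freeglut are installed (the window title and line coordinates are just placeholders), of the calls whose quality and speed were historically segmented between pro and consumer drivers:

```python
# Minimal sketch: draw one antialiased line with legacy OpenGL "line smoothing",
# the feature that was historically fast on pro cards and poorly supported on
# consumer ones. Requires PyOpenGL and freeglut.
from OpenGL.GL import (
    glEnable, glHint, glBlendFunc, glLineWidth, glBegin, glEnd, glVertex2f,
    glClear, GL_LINE_SMOOTH, GL_LINE_SMOOTH_HINT, GL_NICEST, GL_BLEND,
    GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_LINES, GL_COLOR_BUFFER_BIT,
)
from OpenGL.GLUT import (
    glutInit, glutInitDisplayMode, glutCreateWindow, glutDisplayFunc,
    glutMainLoop, glutSwapBuffers, GLUT_DOUBLE, GLUT_RGBA,
)

def display():
    glClear(GL_COLOR_BUFFER_BIT)
    glEnable(GL_LINE_SMOOTH)                # ask for antialiased lines
    glHint(GL_LINE_SMOOTH_HINT, GL_NICEST)  # a hint the driver may honor or ignore
    glEnable(GL_BLEND)                      # line smoothing needs alpha blending
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
    glLineWidth(1.5)
    glBegin(GL_LINES)
    glVertex2f(-0.8, -0.5)
    glVertex2f(0.8, 0.6)
    glEnd()
    glutSwapBuffers()

glutInit()
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA)
glutCreateWindow(b"antialiased line demo")
glutDisplayFunc(display)
glutMainLoop()
```

The glHint call is where the segmentation could live: a driver is free to satisfy the hint well on a pro card and cheaply, or barely at all, on a consumer one.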
AI in general has begun creeping into the games industry via AI-generated art and voice acting, much to the dismay of creators and average gamers.
Maybe to Metro’s dismay, but not to mine. I’d rather people have as many tools as possible to produce content. One area in particular: I’m going to be interested to see re-releases of older titles with AI-upscaled high-resolution textures. That’s an incredibly arduous process to do by hand, but AI upscalers are pretty impressive. It might take human tweaking too, but I was boggled at what can be done with very little work and said upscalers.
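As an illustration of how little work that can be, here’s a minimal sketch using OpenCV’s dnn_superres module; it assumes opencv-contrib-python is installed and a pretrained ESPCN x4 model file has been downloaded separately, and the texture filenames are placeholders:

```python
# Minimal sketch: 4x-upscale a single texture with a pretrained super-resolution
# model via OpenCV's dnn_superres module (ships in opencv-contrib-python).
# "ESPCN_x4.pb" is a pretrained model downloaded separately; the file names
# here are placeholders.
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x4.pb")      # path to the downloaded model
sr.setModel("espcn", 4)          # model name and scale factor
texture = cv2.imread("old_texture.png")
upscaled = sr.upsample(texture)  # e.g. 256x256 -> 1024x1024
cv2.imwrite("old_texture_4x.png", upscaled)
```

Loop that over a texture directory and you have a first pass; the human tweaking comes in catching the spots where the model invents detail.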