Nvidia is the one company making a fortune in the AI bubble and will be just fine when it pops. They make top-quality graphics processors that physically exist, not money-burning AI models. But CES…
I’m sure the Gamers (capital G intentional, derogatory) will love it. And then act really confused when games feel unresponsive, weird, and floaty, and blame it on the devs and not their graphics card, which is hallucinating most of the frames.
oh they know what’s causing it lol
This is a fun follow-up, as the gamers ask Jensen wtf: https://www.tomshardware.com/pc-components/gpus/jensen-says-dlss-4-predicts-the-future-to-increase-framerates-without-introducing-latency
Gamers, for all their faults, have been pretty consistently okay on generative AI, at least in the cases I’ve seen. It doesn’t hurt that nVidia keeps stapling features like this into hardware that supposedly improves performance but at the cost of breaking things and/or requiring more work from devs that are already being run ragged.
Also, I can almost guarantee that the neural texture stuff they’re talking about won’t see enough adoption from developers to actually deliver improvements. Let’s do a bunch more work to maybe get some memory savings on some of the highest-end hardware!
That sadly has not been my experience. I recall people on some gaming subreddits being all ‘you cannot stop the coming tide’ re AI. I also recently got an indie game where nobody in the thread about it mentioned that the dev was using AI art (especially in the recent updates; the art also really sucks, I mean). But yes, some of them are good on it; they’re not a single entity after all.
Name and shame? Which game?
Erannorth Chronicles. The problem is especially the last two DLCs, which suddenly have card art that doesn’t match. I tried to upload a pretty glaring example but it didn’t work atm. You can also see it on the game’s main page: compare the page with all the cards on it against this image. Quite sad, especially because as soon as you start to notice it (and how often the AI art doesn’t make sense), it just pulls you out of the game.
Ah damn, I have it and enjoyed it a lot - it’s fun! The card assets have always been obviously generic, store-bought assets… but at least they were still made by humans. I remember the first AI controversy in that game, I think it was centered around the main menu’s background image, where the castle turrets and towers kinda didn’t line up.
I don’t know how I feel about this. On one hand, this is such an incredibly niche product that I can understand wanting to save every scrap of money, but it sets a bad precedent going forward.
About that bad precedent: the next game. And yes, the game is fun, in a niche way. It’s sad the AI art drags everything down. It always has a weird tonal mismatch with the rest of the game and with the names of the cards, and it just looks wet.
I appreciate tools like FSR and DLSS but wish they were a value proposition for those of us who can’t afford to buy an Xty-ninety every 3 years, as opposed to being pretty much mandatory to prop up the usability of increasingly poorly optimized games.
oh don’t worry - they prop up the appearance, not the usability. They look smoother, but user input is still sampled at the slower render rate, so the games aren’t actually any more responsive.
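Rough back-of-the-envelope numbers for that point (a sketch, assuming interpolation-style frame generation; the figures are illustrative, not measured DLSS latencies):

```python
# Back-of-the-envelope latency math for interpolation-style frame generation.
# All numbers here are illustrative assumptions, not measured DLSS figures.

def frame_gen_numbers(render_fps: float, gen_multiplier: int):
    render_interval_ms = 1000 / render_fps       # how often the game simulates and samples input
    display_fps = render_fps * gen_multiplier    # what the fps counter reports
    # Interpolation holds the newest rendered frame back until the generated
    # in-between frames have been shown, so it adds roughly one render
    # interval of latency on top of the normal pipeline.
    added_latency_ms = render_interval_ms if gen_multiplier > 1 else 0.0
    return display_fps, render_interval_ms, added_latency_ms

for base_fps, mult in [(30, 4), (60, 2), (120, 1)]:
    shown, sample_ms, extra_ms = frame_gen_numbers(base_fps, mult)
    print(f"{base_fps:>3} real fps x{mult}: counter shows {shown:.0f} fps, "
          f"input sampled every {sample_ms:.1f} ms, ~{extra_ms:.1f} ms extra hold-back")
```

Which is why 30 real fps multiplied up to 120 still feels like 30 (or a bit worse) under your mouse.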
I guess in most games where millisecond reactions are necessary you probably aren’t doing much scenery gazing in the first place and can switch DLSS off without missing much (but you’ll have to pay extra for it anyway).
The long-term problem is that, no doubt, the ‘30fps of shitty unoptimized gameplay should be enough for everyone’ rhetoric will eventually move on to ‘you will take 180fps that feel like 30 and like it.’
You’re thinking of frame gen. FSR and DLSS upscaling do produce more real frames, which can make some games usable when they otherwise wouldn’t be (or usable at slightly prettier settings).
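For contrast with frame gen, a small sketch of why upscaled frames count as “real”: the game still simulates and renders every frame, just at a lower internal resolution, so each extra frame comes with a fresh input sample. The scale factors below are assumptions, roughly in line with commonly cited quality/performance presets:

```python
# Why super-resolution upscaling (DLSS/FSR) yields real frames: every frame is
# still rendered, just at a reduced internal resolution. Scale factors below
# are assumptions roughly matching commonly cited presets.

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K output
presets = {
    "native":      1.0,    # no upscaling
    "quality":     0.667,  # ~1440p internal
    "performance": 0.5,    # ~1080p internal
}

output_pixels = OUTPUT_W * OUTPUT_H
for name, scale in presets.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    share = (w * h) / output_pixels
    print(f"{name:>11}: renders {w}x{h} per frame ({share:.0%} of the pixel work), "
          f"then upscales to {OUTPUT_W}x{OUTPUT_H}")
```

Because the simulation still runs once per displayed frame, the higher frame rate here actually comes with matching input sampling, unlike generated in-between frames.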