Indeed, the whole panic over people not needing chips because AI got more efficient was misguided. All it means is that AI is more accessible now, so more people will be running models locally, and they'll buy chips for that. Companies like OpenAI, which were trying to build a business model around selling access to their AI as a service, are looking to be the big losers in all this. If this tech gets efficient enough that you can run large models locally, there will be very few cases where people need to use a service, and even when they do, nobody is going to pay high subscription fees anymore. Interestingly, Apple seems to have bet right: they appear to have anticipated running AI models locally and targeted their hardware accordingly.
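For what it's worth, running a model locally is already a few lines of Python; here's a minimal sketch, assuming the Hugging Face transformers library is installed (the model name is just an illustrative small open-weights model, not a recommendation):

    # Minimal local-inference sketch; weights download once,
    # then everything runs on your own hardware.
    from transformers import pipeline

    generator = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")

    out = generator("Why might local inference undercut AI subscriptions?",
                    max_new_tokens=64)
    print(out[0]["generated_text"])

Small models like this run fine on consumer hardware today; the open question is how far up the quality curve that frontier moves.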
The genie is out of the bottle; they might as well capitalize on it while they still can.
I predict that, with AI getting cheaper and more efficient, we will "soon" have the ability to run our own personal AI locally on a phone.
Nvidia will be less happy by then, but they care about short-term gains like every other large corp.