Indeed, the whole panic over people not needing chips because AI got more efficient was misguided. All it really means is that AI is more accessible now, so more people will be running models locally, and they’ll buy chips for that. Companies like OpenAI, which built their business model around selling access to AI as a service, are looking to be the big losers in all this. If this tech gets efficient enough that you can run large models locally, there will be very few cases where people need a service, and even when they do, nobody is going to pay high subscription fees. Interestingly, Apple seems to have bet right: they appear to have anticipated local AI models and targeted their hardware accordingly.