This is not credible; it reads like a self-promoting stock pump-and-dump PR. Vision AI models are smaller than text models: they need fast (or faster) GPUs, but less memory. Narrowly purposed AI/neural-network models need less memory because memory capacity is more about storing facts than about logic/reasoning capability. Current LLM gains in benchmark score per GB are coming more from smaller models than from the largest frontier models. 32 GB is a reasonable ceiling for the memory requirement, and robots can swap in task-specific AI models as well.
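To make the 32 GB ceiling concrete, here is a minimal back-of-envelope sketch of the usual weights-times-precision memory math. The 20% overhead factor for activations/KV cache is an assumption for illustration, not a measured figure.

```python
def model_vram_gb(params_billions: float, bytes_per_param: float,
                  overhead: float = 1.2) -> float:
    """Rough VRAM estimate: parameter count x bytes per parameter,
    scaled by an assumed ~20% overhead for activations and cache."""
    return params_billions * bytes_per_param * overhead

# A 7B-parameter model at common precisions:
for name, bytes_pp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"7B @ {name}: {model_vram_gb(7, bytes_pp):.1f} GB")
```

Under these assumptions, even a 7B model in fp16 fits in roughly 17 GB, and quantized variants in far less, which is consistent with 32 GB being a generous ceiling for a narrow, task-specific model.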