I wonder what other applications this might have outside of machine learning. I don’t know whether, for example, intensive 3D games absolutely need 16-bit floats (or larger), or whether it would make sense to try this “additive implementation” for their floating-point multiplications as well. Modern desktop gaming PCs can easily slurp up to 800W.
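For anyone wondering what “multiplying by adding” could even look like, here’s a minimal C sketch of the classic Mitchell-style bit-pattern trick, which I assume is the spirit of the paper’s additive multiplication (not its exact algorithm): a float’s bit pattern is roughly a scaled log2 of its value, so adding two bit patterns and subtracting one exponent bias approximates the bit pattern of the product. The approx_mul name and the positive-normals-only restriction are my own simplifications.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Mitchell-style approximate multiply: for positive normal floats,
 * bits(x) ~ 2^23 * (log2(x) + 127), so adding two bit patterns and
 * subtracting one bias gives roughly bits(x * y). Zero, negatives,
 * infinities, and NaNs are not handled in this sketch. */
static float approx_mul(float x, float y) {
    uint32_t ix, iy, iz;
    memcpy(&ix, &x, sizeof ix);   /* reinterpret float bits as integer */
    memcpy(&iy, &y, sizeof iy);
    iz = ix + iy - 0x3F800000u;   /* add the "logs", drop the doubled bias */
    float z;
    memcpy(&z, &iz, sizeof z);
    return z;
}

int main(void) {
    printf("exact:  %f\n", 3.5f * 2.25f);            /* 7.875000 */
    printf("approx: %f\n", approx_mul(3.5f, 2.25f)); /* ~7.500000 */
    return 0;
}
```

The result is typically a few percent off (around 11% low in the worst case), which is exactly why it’s an open question whether games could tolerate it.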
Personally, I’d love to learn enough of the Latin he spoke to be able to present him with a bottle of Caesar salad dressing and then tell him how many millions of people think of it when they hear his name.