NextElephant9@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 15th December 2024
OpenAI whistleblower found dead in San Francisco apartment.
Thread on r/technology.
edited to add:
From his personal website: When does generative AI qualify for fair use?
Knowledge distillation is training a smaller model to mimic the outputs of a larger model. You don't need the same training set that was used to train the larger model (the whole internet, or whatever they used for ChatGPT); you can use a transfer set instead.
Here’s a reference: Hinton, Geoffrey, Oriol Vinyals, and Jeff Dean. “Distilling the Knowledge in a Neural Network.” arXiv preprint arXiv:1503.02531 (2015). https://arxiv.org/pdf/1503.02531
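For the curious, here's a minimal sketch of what that looks like in practice. It assumes PyTorch; the toy model sizes, the temperature, the mixing weight, and names like `train_step` are made up for illustration, not taken from the paper. The student is trained on a mix of the usual hard-label loss and a KL divergence against the teacher's temperature-softened outputs on the transfer set:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-ins for the "teacher" (large) and "student" (small) models.
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft targets: match the teacher's temperature-softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # scale so gradient magnitudes stay comparable across temperatures
    # Hard targets: ordinary cross-entropy on the true labels, when you have them.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)

# x, labels come from whatever transfer set you have; it need not be the
# teacher's original training data.
def train_step(x, labels):
    with torch.no_grad():
        teacher_logits = teacher(x)  # teacher stays frozen
    student_logits = student(x)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The point is that all you need from the big model is its outputs (logits) on the transfer set, not its weights or its original training data.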