voidx@futurology.today to Futurology@futurology.today · English · 8 months ago
AI Companies Running Out of Training Data After Burning Through Entire Internet (futurism.com)
cross-posted to: [email protected], [email protected], [email protected], [email protected]
CanadaPlus@lemmy.sdf.org · English · edited 8 months ago
Well, it’s established wisdom that the dataset size needs to scale with the number of model parameters. Quadratically, IIRC. If you don’t have that much data, the training basically won’t work; it will overfit or simply fail to progress.
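For context on the scaling claim above: the commenter recalls a quadratic relationship, but the widely cited Chinchilla compute-optimal result is roughly linear, about 20 training tokens per model parameter. A minimal sketch assuming that linear rule of thumb (the 20× ratio is an approximation from the scaling-law literature, not a figure from this thread):

```python
def optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Estimate the compute-optimal training-set size under a linear
    scaling rule of roughly `tokens_per_param` tokens per parameter
    (the approximate Chinchilla rule of thumb)."""
    return n_params * tokens_per_param

# Under this rule, a 70B-parameter model would want on the order of
# 1.4 trillion training tokens.
print(f"{optimal_tokens(70e9):.2e}")  # → 1.40e+12
```

Either way, the required token count grows with model size, which is the core of the comment: past a point, there simply isn’t enough fresh data to keep scaling.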