Lee Duna@lemmy.nz to Technology@lemmy.world · 10 months ago
Someone made a GPT-like chatbot that runs locally on Raspberry Pi, and you can too (www.xda-developers.com)
QuadratureSurfer@lemmy.world · 10 months ago
I’ve got it running with a 3090 and 32 GB of RAM. There are some models that let you run with hybrid system RAM and VRAM (it will just be slower than running it exclusively with VRAM).
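[For context: one common way to get this hybrid RAM/VRAM split is llama.cpp's `--n-gpu-layers` flag, which offloads only part of the model's layers to the GPU and keeps the rest in system RAM. The commenter doesn't say which tool they used, and the model path and layer count below are illustrative examples, not values from the thread.]

```shell
# Illustrative llama.cpp invocation (model file and layer count are hypothetical).
# -ngl 35 offloads 35 transformer layers to VRAM; remaining layers run from
# system RAM on the CPU, trading speed for fitting a larger model.
./llama-cli -m ./models/llama-2-13b.Q4_K_M.gguf -ngl 35 -p "Hello"
```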
Deceptichum@kbin.social · 10 months ago
Yeah, but damn does it get slow. I always find it interesting how text is so much slower than image generation. I can do a 1024×1024 image in probably 20 s, but I get about 1 word a second with text.
ferret@sh.itjust.works · 10 months ago
Languages are complex and, more importantly, much less forgiving of error.