wuphysics87@lemmy.ml to Privacy@lemmy.ml · 11 hours ago · Can you trust locally run LLMs?
I’ve been playing around with Ollama. Given that you download the model yourself, can you trust that it isn’t sending telemetry?
Dagamant@lemmy.world · 3 hours ago
Ollama does have some details on GitHub about this. According to them, no, it does not send data anywhere. You can also monitor network traffic to watch for it sending anything it shouldn’t. If you don’t want to do that, firewall it.
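To make "monitor the traffic / firewall it" concrete, here is a minimal audit sketch for Linux. It assumes the default Ollama install (daemon running as a dedicated `ollama` user, API on `127.0.0.1:11434`); adjust the port and user to your setup, and treat the root-only commands in the comments as starting points, not a complete lockdown.

```shell
PORT=11434  # Ollama's default local API port (assumption; adjust if changed)

# 1. Is anything listening beyond loopback? A clean setup shows only
#    127.0.0.1 (or ::1) bound to this port.
ss -tlnp 2>/dev/null | grep ":$PORT" || echo "no listener on port $PORT"

# 2. Watch live traffic while you run a prompt; anything besides
#    loopback and your own model downloads deserves a look (needs root):
#      tcpdump -i any -nn 'not host 127.0.0.1'

# 3. Firewall it outright: reject every outbound packet from the
#    ollama user via the iptables owner match (needs root):
#      iptables -A OUTPUT -m owner --uid-owner ollama -j REJECT
```

The owner-match rule blocks the daemon without cutting off the rest of the machine, but it also breaks `ollama pull`, so drop the rule (or scope it with an exception) when you want to fetch new models.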
SupraMario@lemmy.world · 2 hours ago
And if you don’t want to do that… run it in a VM and unplug your NIC / disconnect Wi-Fi.
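For the VM route you don't even need to unplug anything: most hypervisors let you create the guest with no network device at all. A hedged sketch with QEMU/KVM, assuming an existing disk image (`ollama-vm.qcow2` is an illustrative name); `-nic none` attaches no NIC, so nothing inside the guest can reach the network, telemetry included.

```shell
# Build the launch command as a string so you can review it first;
# paste or eval it yourself when you're ready to boot the guest.
CMD='qemu-system-x86_64 -enable-kvm -m 8G -smp 4 -nic none \
     -drive file=ollama-vm.qcow2,format=qcow2'
echo "$CMD"
```

The trade-off is the same as the firewall approach: with no NIC you must move models into the guest out of band (e.g. a shared disk image or virtiofs), since the VM can't download anything.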