joojmachine@lemmy.ml to Linux@lemmy.ml · English · 6 months ago
Alpaca: an ollama client to easily interact with an LLM locally or remotely (flathub.org)
cross-posted to: [email protected]
Possibly linux@lemmy.zip · English · edited 6 months ago
This is what I needed. I will run this locally and run ollama in a VM, although podman and Distrobox look tempting.
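For reference, a minimal sketch of how a client like Alpaca can talk to a local or remote ollama instance over its HTTP API. It assumes ollama's default port 11434 and a model named llama3 already pulled; for a VM or container setup, swap "localhost" for the guest's address.

```python
import json
import urllib.request

# Assumed endpoint: ollama's default HTTP API on port 11434.
# Replace "localhost" with the VM/container address for a remote setup.
OLLAMA_URL = "http://localhost:11434/api/generate"


def ask(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to ollama and collect the streamed response."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    parts = []
    with urllib.request.urlopen(req) as resp:
        # ollama streams one JSON object per line until "done" is true.
        for line in resp:
            chunk = json.loads(line)
            parts.append(chunk.get("response", ""))
            if chunk.get("done"):
                break
    return "".join(parts)


if __name__ == "__main__":
    print(ask("Why is the sky blue?"))
```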