I'm using Ollama on my server with the WebUI. It has no GPU, so replies aren't quick, but they're not too slow either.

I'm thinking about removing the VM since I just don't use it. Are there any good uses or integrations into other apps that might convince me to keep it?
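One integration worth knowing about before deciding: Ollama exposes an HTTP API, so any app on your network can use the server as a shared LLM backend. A minimal sketch, assuming the default port 11434 and a model named "llama3" (swap in whatever model you've actually pulled):

```python
import json
import urllib.request


def ask_ollama(prompt: str, model: str = "llama3",
               host: str = "http://localhost:11434") -> str:
    """Send a single prompt to an Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    try:
        print(ask_ollama("Give me one reason to keep a local LLM server."))
    except OSError as exc:
        # No server reachable on this host/port.
        print(f"Ollama not reachable: {exc}")
```

Anything that can make an HTTP POST (scripts, home-automation tools, editor plugins) can integrate the same way, which is the usual argument for keeping a central instance instead of per-device models.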

    • pflanzenregal@lemmy.world · 3 months ago

      But they run locally on the phone, if I’m not completely mistaken. So no use for a server.

      Or do they offer this option? I think they explicitly advertise no internet access.