I'm trying to figure out how the online search function works… Didn't have much luck so far. A general discussion about the app would also be very helpful for everyone.
I use it sometimes too, but that's an online AI, and everything you use it for goes to someone's servers, so it's not private. Alpaca uses Ollama to run AI locally on your machine, so everything you use it for stays private. Those two are completely different, and comparing which one is easier to use misses the point, because ease of use is not the point here. Privacy is.
Taking advantage of the fact that this thread became popular, a question for all of you: do you recommend any other open source LLM frontends?
LM Studio is by far my favorite. Supports all GPUs out of the box on Linux and has tons of options.
LM Studio is not open source at all.
Looks like you’re right.
I switched to it when Alpaca stopped working on AMD GPUs and was under the impression it was open source.
So far I've really liked just using ollama in the terminal, since it just spits out text anyway.
Of course I could even send raw API requests, but sometimes it's good to have a nice GUI that "just works".
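For anyone curious what a raw request looks like, here's a minimal sketch against Ollama's local HTTP API, assuming it's running on its default port (11434) and that you've already pulled a model; "llama3" below is just a placeholder for whatever model you have installed.

```python
# Minimal sketch: send a raw prompt to a locally running Ollama server.
# Assumes Ollama is listening on its default port and the model named
# below has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint
MODEL = "llama3"  # placeholder; use whichever model you have pulled

payload = json.dumps({
    "model": MODEL,
    "prompt": "Explain in one sentence what a local LLM is.",
    "stream": False,  # ask for a single JSON reply instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read())

print(answer["response"])  # the generated text; everything stayed on localhost
```

Everything goes to localhost, which is kind of the whole point of running it this way.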
Specifically I’m looking for something that could handle not only text responses, but also attachments, speech recognition and MCP support.
Yeah, in that case you probably want something else. So far I've only ever used it for text-based questions. I think I remember seeing that there's also a web UI out there, but I don't remember the name.
I've had good experiences with GPT4All.
I used Alpaca, but they made some changes recently that made it confusing and a pain to use. I deleted it after that, as I don't use AI much anyway.
There are still active accounts on lemm.ee?
I am not certain what you mean by online search function. It can connect to the internet but it doesn’t exactly function like a search engine from what I can understand.
The second option looks like exactly that…
https://github.com/Jeffser/Alpaca
This will probably help anyone unfamiliar with it, since the first search result for Alpaca AI is another online paid AI service that does something entirely different from this: it's used for AI image generation.
The main question I have is: since Ollama is optional… if you do opt to use it, is it still sharing data with Facebook/Meta?
I didn't know that Ollama was sharing data with Facebook… Why would it do something like that? Wouldn't that be the opposite of what it was created for, which is privacy… Where did you get that info?
From the comments, it looked like that's why he made the Ollama integration optional: some people were concerned because Ollama was built by Meta. It can run without Ollama, it seems.
EDIT: Doing more research on Ollama itself, I’m unconvinced that it’s sharing any data, despite being built by Meta.
I didn't know that Ollama was built by Meta, where did you find that out? It's also an open source project, so it shouldn't have malicious code like that…
Meta trained and published the model, but it's an open model. I'm not an expert, but I don't believe it's sharing data with Meta, since it's just the model they trained; you can download it and run it offline. You're just using the output of all the training they did, on your own compute.
You’re talking about the llama models, not ollama.
So it doesn't have anything to do with the Ollama software, and you can download any LLM, it doesn't have to be Meta's…