I'm trying to figure out how the online search function works… I haven't had much luck so far. A general discussion about the app would also be very helpful for everyone.
Taking advantage of the fact that this thread became popular, a question to all of you: can you recommend any other open-source LLM front ends?
I was using LibreChat for a while
I've had good experiences with GPT4All.
So far I've really liked just using Ollama in the terminal, since it just spits out text anyway.
Of course I could even send raw API requests, but sometimes it's good to have a nice GUI that "just works".
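For anyone curious what "raw API requests" means here, this is a minimal sketch against Ollama's local REST API (it listens on localhost:11434 by default). The model name is only an example and assumes you've already pulled it with Ollama:

```python
# Minimal sketch: talking to a locally running Ollama server directly,
# with no GUI front end. Assumes Ollama is running and the example model
# "llama3" has already been pulled (any local model name works).
import json
import urllib.request

payload = {
    "model": "llama3",   # example model name; substitute whatever you have pulled
    "prompt": "Explain what an LLM front end is in one sentence.",
    "stream": False,     # ask for a single JSON response instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["response"])  # the generated text
```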
Specifically, I'm looking for something that can handle not only text responses but also attachments, speech recognition, and MCP support.
Yeah, in that case you probably want something else. So far I've only ever used it for text-based questions. I think I remember seeing that there is also a web UI out there, but I don't remember the name.
I used Alpaca, but they made some changes recently that made it confusing and a pain to use. I deleted it after that, as I don't use AI much anyway.
https://github.com/Jeffser/Alpaca
This will probably help anyone unfamiliar with it, since the first search result for "Alpaca AI" is an unrelated paid online AI service that does something entirely different: it's used for AI image generation.
The main question I have is, since Ollama is optional… if you do optionally use it, is it still sharing data with Facebook/Meta?
I didn't know that Ollama shares data with Facebook… Why would it do something like that? Wouldn't that be the opposite of what it was created for, which is privacy? Where did you get that info?
From the comments, it looked like that's why he made the Ollama integration optional: some people were concerned because Ollama was built by Meta. It can run without Ollama, it seems.
EDIT: Doing more research on Ollama itself, I’m unconvinced that it’s sharing any data, despite being built by Meta.
I didn't know that Ollama was built by Meta, where did you find that out? It's also an open-source project, it shouldn't have malicious code like that…
Meta trained and published the model, but it's an open model. I'm not an expert, but I don't believe it's sharing data with Meta, since it's just the model they trained; you can download it and run it offline. You're just using the output of all the training they did, on your own compute.
You're talking about the Llama models, not Ollama.
So it doesn't have anything to do with the Ollama software; you can download any LLM, it doesn't have to be Meta's…
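Right. To illustrate that Ollama is just the runner and the models are interchangeable, here is a small sketch (same local API as above) that lists whatever models you have pulled and chats with one of them. "mistral" is only an example of a non-Meta model and assumes you've pulled it already:

```python
# Sketch: Ollama is model-agnostic. It lists and runs whatever models you
# have pulled locally, Meta's Llama or otherwise, entirely offline.
import json
import urllib.request

BASE = "http://localhost:11434"

# List locally available models (GET /api/tags).
with urllib.request.urlopen(f"{BASE}/api/tags") as resp:
    tags = json.load(resp)
    print("Local models:", [m["name"] for m in tags.get("models", [])])

# Chat with a non-Meta model, e.g. Mistral (assumes `ollama pull mistral` was run).
payload = {
    "model": "mistral",  # example: not a Meta model
    "messages": [{"role": "user", "content": "Who trained you?"}],
    "stream": False,
}
req = urllib.request.Request(
    f"{BASE}/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
    print(reply["message"]["content"])
```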