
Privacy


A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.


What are your thoughts on #privacy and #itsecurity regarding the #LocalLLMs you use? They seem to be an alternative to ChatGPT, MS Copilot, etc., which are basically creepy privacy black boxes. How can you be sure that local LLMs A) do not "phone home", B) do not build a profile on you, and C) keep their analysis restricted to the scope of your own machine? As far as I can see, #ollama and #lmstudio do not provide privacy statements.
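One way to approach question A yourself is to watch what the model server actually does on the network while you use it. Below is a minimal Python sketch (my own illustration, not something ollama or LM Studio provide) that uses psutil to list outbound connections belonging to any process whose name contains "ollama". The process-name match, loopback whitelist, and polling interval are all assumptions made for this example, and the system-wide connection listing may need elevated privileges on some platforms.

```python
# Hypothetical phone-home check: list non-loopback connections owned by any
# process whose name contains "ollama". Requires: pip install psutil
import time

import psutil


def outbound_llm_connections(name_fragment: str = "ollama"):
    """Yield (pid, remote_ip, remote_port) for non-loopback connections."""
    pids = {
        proc.pid
        for proc in psutil.process_iter(["name"])
        if name_fragment in (proc.info["name"] or "").lower()
    }
    for conn in psutil.net_connections(kind="inet"):
        if conn.pid in pids and conn.raddr and conn.raddr.ip not in ("127.0.0.1", "::1"):
            yield conn.pid, conn.raddr.ip, conn.raddr.port


if __name__ == "__main__":
    # Watch for a minute or so while you chat with the model.
    for _ in range(12):
        for pid, ip, port in outbound_llm_connections():
            print(f"pid {pid} -> {ip}:{port}")
        time.sleep(5)
```

An empty log while you chat is encouraging, though not conclusive; a packet capture at the router is the stronger check.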

DarkDarkHouse@lemmy.sdf.org 26 points 2 days ago

I run Ollama with Open WebUI at home.

A) The containers they run in can't access the Internet by default, but they are given access if we turn on web search or want to download new models. Ollama and Open WebUI are fairly popular projects, and I haven't seen any evidence of nefarious activity so far.
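A quick way to sanity-check that kind of network posture is a small probe run inside whichever container you want to test (assuming python3 is available there). This is a hypothetical stdlib-only sketch, not part of Ollama or Open WebUI; the container name, hosts, and ports are assumptions about a typical default install, so adjust the Ollama address to wherever it actually listens in your setup.

```python
# Hypothetical in-container probe, e.g. `docker exec -it ollama python3 probe.py`:
# the local API should be reachable while the public Internet is not.
import socket


def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    print("Ollama API  (127.0.0.1:11434):", can_connect("127.0.0.1", 11434))
    print("Public host (1.1.1.1:443):    ", can_connect("1.1.1.1", 443))
```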

B) They do create a profile on me and the family members who use them, by design. We can add sensitive documents for the models to use.

C) They are restricted to what we type and the documents we provide.
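To make B and C concrete, here is a minimal sketch (my own illustration, not the commenter's actual setup) of a document-grounded request to a local Ollama instance: the only "profile" involved is the text you choose to include in the prompt, and the request goes no further than the local HTTP API. The model name and file path are hypothetical placeholders.

```python
# Hypothetical document-grounded query against a local Ollama server.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

# A local file you choose to share with the model; nothing is uploaded anywhere.
with open("household_notes.txt", "r", encoding="utf-8") as f:
    notes = f.read()

payload = {
    "model": "llama3.2",  # assumes this model has already been pulled locally
    "prompt": f"Using only these notes, when is the next dentist appointment?\n\n{notes}",
    "stream": False,
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```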

Bz1sen@lemmy.world 1 point 1 day ago

How fast are the response times, and how useful are the answers, of these open-source models you can run on a low-end GPU? I know the answer will be "it depends", but maybe you can share more of your experience. I often use Claude's newest Sonnet model, and for my use cases it is a real efficiency boost when used right. Around the middle of last year I briefly tested an open-source model from Meta, and it just wasn't there yet. Or should we rather conclude that we'll have to wait another year until smaller open-source models are more proficient?
