this post was submitted on 31 Jan 2025
324 points (95.0% liked)


Article: https://proton.me/blog/deepseek

Calls it "Deepsneak", failing to make it clear that the reason people love DeepSeek is that you can download it and run it securely on any of your own private devices or servers - unlike most of the competing SOTA AIs.

I can't speak for Proton, but the last couple of weeks have shown some very clear biases coming out.

[–] cygnus@lemmy.ca 157 points 1 day ago (1 children)

Pretty rich coming from Proton, who shoved an LLM into their mail client mere months ago.

[–] harsh3466@lemmy.ml 35 points 1 day ago (1 children)

wait, what? How did I miss that? I use protonmail, and I didn't see anything about an LLM in the mail client. Nor have I noticed it when I check my mail. Where/how do I find and disable that shit?

[–] cygnus@lemmy.ca 49 points 1 day ago (1 children)
[–] harsh3466@lemmy.ml 51 points 1 day ago (1 children)

Thank you. I've saved the link and will be disabling it next time I log in. Can't fucking escape this AI/LLM bullshit anywhere.

[–] cygnus@lemmy.ca 68 points 1 day ago (5 children)

The combination of AI, a crypto wallet, and the CEO's pro-MAGA comments (all within six months or so!) is why I quit Proton. They've completely lost the plot. I just want a reliable email service and file storage.

[–] h6pw5@sh.itjust.works 4 points 19 hours ago

The crypto and AI focus was a weird step before all this came out. But now that we know Andy is pro-Republican… it completes a very unappealing picture. We should have a database though; plenty of C-level execs and investor groups do far worse and get no scrutiny simply because they don't post about it on the internet.

[–] harsh3466@lemmy.ml 18 points 1 day ago (5 children)

I'm considering leaving Proton too. The two things I really care about are SimpleLogin and the VPN with port forwarding. As far as I understand it, Proton is about the last VPN option you can trust that offers port forwarding.

[–] fenndev@leminal.space 14 points 1 day ago (8 children)

Happily using AirVPN for port forwarding.

[–] kboy101222@sh.itjust.works 9 points 1 day ago

Once all that crap came out, I felt incredibly justified by never having switched to Proton.

It was entirely out of laziness, but still

[–] Rogue 70 points 1 day ago (4 children)

How apt, just yesterday I put together an evidenced summary of the CEO's recent absurd comments. Why are Proton so keen to throw away so much of the goodwill people had invested in them?!


This is what the CEO posting as u/Proton_Team stated in a response on r/ProtonMail:

Here is our official response, also available on the Mastodon post in the screenshot:

Corporate capture of Dems is real. In 2022, we campaigned extensively in the US for anti-trust legislation.

Two bills were ready, with bipartisan support. Chuck Schumer (who coincidentally has two daughters working as big tech lobbyists) refused to bring the bills for a vote.

At a 2024 event covering antitrust remedies, out of all the invited senators, just a single one showed up - JD Vance.

By working on the front lines of many policy issues, we have seen the shift between Dems and Republicans over the past decade first hand.

Dems had a choice between the progressive wing (Bernie Sanders, etc), versus corporate Dems, but in the end money won and constituents lost.

Until corporate Dems are thrown out, the reality is that Republicans remain more likely to tackle Big Tech abuses.

Source: https://archive.ph/quYyb

To call out the important bits:

  1. He refers to it as the "official response"
  2. Indicates that JD Vance is on their side just because he attended an event that other invited senators didn't
  3. Rattles on about "corporate Dems" with incredible bias
  4. States "Republicans remain more likely to tackle Big Tech abuses" which is immediately refuted by every response

That was posted in the r/ProtonMail sub where the majority of the event took place: https://old.reddit.com/r/ProtonMail/comments/1i1zjgn/so_that_happened/m7ahrlm/

However, be aware that the CEO posting as u/Proton_Team kept editing his comments, so I wouldn't trust the current state of them. Plus, the Proton team/subreddit mods deleted a ton of discussion they didn't like. Therefore this archive link, captured the day after, might show more but not all: https://web.archive.org/web/20250116060727/https://old.reddit.com/r/ProtonMail/comments/1i1zjgn/so_that_happened/m7ahrlm/

Some statements were made on Mastodon but were subsequently deleted; they're captured by an archive link: https://web.archive.org/web/20250115165213/https://mastodon.social/@protonprivacy/113833073219145503

I learned about it from an r/privacy thread, but, true to their reputation, the mods there also went on a deletion spree and removed the entire post: https://www.reddit.com/r/privacy/comments/1i210jg/protonmail_supporting_the_party_that_killed/

This archive link might show more but I've not checked: https://web.archive.org/web/20250115193443/https://old.reddit.com/r/privacy/comments/1i210jg/protonmail_supporting_the_party_that_killed/

There's also this lemmy discussion from the day after, but by that point the Proton team had kicked their censorship fully into gear, so I don't know how much people were aware of (apologies, I don't know how to make a generic lemmy link): https://feddit.uk/post/22741653

[–] simple@lemm.ee 112 points 1 day ago (11 children)

DeepSeek is open source, meaning you can modify code(new window) on your own app to create an independent — and more secure — version. This has led some to hope that a more privacy-friendly version of DeepSeek could be developed. However, using DeepSeek in its current form — as it exists today, hosted in China — comes with serious risks for anyone concerned about their most sensitive, private information.

Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws, meaning that the Chinese government can demand access at any time.

What???? Whoever wrote this sounds like they have zero understanding of how it works. There is no "more privacy-friendly version" left to develop: the models are already out, and you can run the entire model 100% locally. That's as privacy-friendly as it gets.

"Any model trained or operated on DeepSeek's servers are still subject to Chinese data laws"

Operated, yes. Trained, no. The model is MIT-licensed; China has nothing on you when you run it yourself. I expect better from a company whose whole business is built on privacy.

[–] lily33@lemm.ee 32 points 1 day ago (1 children)

To be fair, most people can't actually self-host DeepSeek, but there are already other providers offering API access to it.

[–] halcyoncmdr@lemmy.world 30 points 1 day ago (14 children)

There are plenty of step-by-step guides to run DeepSeek locally. Hell, someone even had it running on a Raspberry Pi. It seems to be much more efficient than other current alternatives.

That's about as openly available to self-host as you can get without a 1-button installer.
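For anyone curious what "running it locally" actually looks like, here's a minimal sketch that queries a local Ollama instance over its HTTP API (assuming you've already installed Ollama and run `ollama pull deepseek-r1:7b`; the model tag is an example and depends on which variant you pulled):

```python
# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes `ollama pull deepseek-r1:7b` has already been run; the tag may differ.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "deepseek-r1:7b") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Everything stays on localhost: no account, no token, no Chinese servers.
print(ask_local_model("Briefly explain what a VPN does."))
```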

[–] Aria@lemmygrad.ml 1 points 10 hours ago* (last edited 10 hours ago) (2 children)

Running R1 locally isn't realistic. But you can rent a server and run it privately on someone else's computer. It costs about 10 per hour to run. You can run it on CPU for a little less. You need about 2TB of RAM.

If you want to run it at home, even quantized to 4-bit, you need 20 4090s. And since normal desktop mainboards only take 4 per computer, that's 5 whole computers, and you need to figure out networking between them. A more realistic setup is probably running it on CPU, with some layers offloaded to 4 GPUs. In that case you'll need 4 4090s and 512GB of system RAM. Absolutely not cheap or what most people have, but technically still within the top top top end of what you might have on your home computer. And remember, this is still the dumb 4-bit configuration.

Edit: I double-checked, and 512GB of RAM is unrealistic. In fact, anything higher than 192GB is unrealistic. (High-end) AM5 mainboards support up to 256GB, but 64GB RAM sticks are much more expensive than 48GB ones, so most people will opt for 48GB or smaller sticks. You need a Threadripper to be able to use 512GB. Very unlikely for your home computer, but maybe it makes sense with something else you do professionally, in which case you might also have 8 RAM slots, and such a person might then think it's reasonable to spend 3000 Euro on RAM. If you spent 15K Euro on your home computer, you might be able to run a reduced version of R1 very slowly.
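For what it's worth, the back-of-the-envelope arithmetic behind those GPU counts (a rough sketch; 671B is R1's published parameter count, while the ~20% overhead factor for KV cache and runtime buffers is an assumption):

```python
# Rough VRAM estimate for the full 671B-parameter R1 at 4-bit quantization.
import math

params = 671e9          # R1's published total parameter count
bytes_per_param = 0.5   # 4-bit quantization = half a byte per weight
overhead = 1.2          # assumed headroom for KV cache and runtime buffers

weights_gb = params * bytes_per_param / 1e9     # ~336 GB for weights alone
total_gb = weights_gb * overhead                # ~403 GB with overhead
gpus = math.ceil(total_gb / 24)                 # RTX 4090 has 24 GB of VRAM

print(f"weights: ~{weights_gb:.0f} GB, total: ~{total_gb:.0f} GB, 4090s: {gpus}")
```

That lands at roughly 17 cards for weights plus headroom, i.e. the same ballpark as the 20 quoted above once you leave real room for long contexts.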

[–] tekato@lemmy.world 15 points 1 day ago (6 children)

You can run an imitation of the DeepSeek R1 model (one of the smaller distilled versions), but not the actual one unless you literally buy a dozen of whatever NVIDIA's top GPU is at the moment.

[–] lily33@lemm.ee 8 points 1 day ago

A server-grade CPU with a lot of RAM and memory bandwidth would work reasonably well, and costs "only" ~$10k rather than $100k+...
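Rough intuition for why that works: R1 is a mixture-of-experts model that activates only ~37B of its 671B parameters per token (per DeepSeek's own figures), and token generation is memory-bound, so throughput is roughly memory bandwidth divided by the bytes of weights read per token. A crude upper-bound sketch, with the bandwidth number as an assumption:

```python
# Crude upper bound on CPU-only generation speed for a memory-bound MoE model.
active_params = 37e9    # R1 activates ~37B of its 671B parameters per token
bytes_per_param = 0.5   # 4-bit quantization
bandwidth = 400e9       # assumed ~400 GB/s for a multi-channel server CPU

bytes_per_token = active_params * bytes_per_param   # ~18.5 GB read per token
print(f"~{bandwidth / bytes_per_token:.0f} tokens/sec, best case")  # ~22
```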

[–] pineapple@lemmy.ml 42 points 1 day ago (3 children)

OpenAI, Google, and Meta, for example, can push back against most excessive government demands.

Sure they "can" but do they?

[–] HiddenLayer555@lemmy.ml 28 points 22 hours ago* (last edited 22 hours ago)

Why do that when you can just score a deal with the government to give them whatever information they want for sweet perks like foreign competitors getting banned?

[–] thingsiplay@beehaw.org 18 points 23 hours ago (2 children)

How is this Open Source? The official repository https://github.com/deepseek-ai/DeepSeek-R1 contains only images, a PDF file, and links to download the model. I don't see any code. What exactly is Open Source here? And if it is, where do I get the source code?

[–] JOMusic@lemmy.ml 25 points 22 hours ago

Open-source AI is usually posted to Hugging Face instead of GitHub: https://huggingface.co/deepseek-ai/DeepSeek-R1

[–] v_krishna@lemmy.ml 22 points 22 hours ago (3 children)

In deep learning, "open source" generally doesn't include the actual training or inference code. Rather, it means they publish the model weights and parameters (necessary to run it locally/on your own hardware) and publish academic papers explaining how the model was trained. I'm sure Stallman disagrees, but from the standpoint of deep learning research, DeepSeek definitely qualifies as an "open source model".
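Concretely, "publishing the weights" means anyone can pull them down and run inference. A minimal sketch with the Hugging Face transformers library, using the small distilled variant since the full 671B model won't fit on ordinary hardware (the model ID matches DeepSeek's Hugging Face listing):

```python
# Minimal sketch: load published DeepSeek weights and generate text locally.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # small distilled variant
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("What does 'open weights' mean?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that this gives you the weights, not the recipe: the training code and data that produced them stay private, which is exactly the objection raised below.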

[–] thingsiplay@beehaw.org 23 points 22 hours ago (1 children)

Just because they call it Open Source doesn't make it so. DeepSeek is not Open Source: it only provides the model weights and parameters, not any source code or training data. I still don't know what's in the model, and we only get "binary" data, not any source code. This is not Libre software.

[–] Sal@mander.xyz 16 points 22 hours ago

There is a nice (even if by now a bit outdated) analysis of the openness of different "open source" generative AI projects in the following article: Liesenfeld, Andreas, and Mark Dingemanse. "Rethinking open source generative AI: open washing and the EU AI Act." The 2024 ACM Conference on Fairness, Accountability, and Transparency. 2024.

[–] davel@lemmy.ml 26 points 1 day ago (3 children)
[–] AustralianSimon@lemmy.world 24 points 1 day ago

To be fair, it's correct, but it's poor writing to skip the self-hosted component. These articles target the company, not the model.

[–] wuphysics87@lemmy.ml 12 points 1 day ago (1 children)

There are many LLMs you can use offline.

[–] davel@lemmy.ml 16 points 1 day ago (1 children)
[–] spooky2092@lemmy.blahaj.zone 7 points 1 day ago

DeepSeek works reasonably well, even CPU-only in Ollama. I ran the 7B and 1.5B models and it wasn't awful. 7B slowed down as the convo went on, but the 1.5B model felt pretty passable while I was playing with it.
