this post was submitted on 31 Jan 2025
253 points (100.0% liked)

Open Source


Article: https://proton.me/blog/deepseek

Calls it "Deepsneak", failing to make it clear that the reason people love Deepseek is that you can download it and run it securely on any of your own private devices or servers - unlike most of the competing SOTA AIs.

I can't speak for Proton as a whole, but the last couple of weeks have shown some very clear biases coming out.

[–] ReversalHatchery 5 points 3 weeks ago (5 children)

What???? Whoever wrote this sounds like they have zero understanding of how it works. There is no "more privacy-friendly version" that could be developed; the models are already out, and you can run the entire model 100% locally. That's as privacy-friendly as it gets.

Unfortunately it is you who has zero understanding of it. Read my comment below. TL;DR: good luck having the hardware to run it.

[–] simple@lemm.ee 17 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

I understand it well. It's still relevant to mention that you can run the distilled models on consumer hardware if you really care about privacy. 8GB+ of VRAM isn't crazy, especially with the unified memory on MacBooks, or on some Windows laptops releasing this year with 64+GB of unified memory. There are also sites re-hosting various versions of Deepseek, like Hugging Face hosting the 32B model, which is good enough for most people.

Instead, the article is written as if there is literally no way to use Deepseek privately, which is simply wrong.
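The "distilled models on consumer hardware" point can be sanity-checked with back-of-envelope arithmetic. This is a rough sketch, not a benchmark: it assumes 4-bit quantized weights (~0.5 bytes per parameter) and ignores KV cache and runtime overhead, so real requirements are somewhat higher.

```python
# Rough lower bound on VRAM needed for model weights:
# memory ~= parameter count * bytes per parameter.
# 4-bit quantization -> ~0.5 bytes/param (an assumption for illustration).

def weight_gb(params_billion: float, bits_per_param: float = 4.0) -> float:
    """Approximate memory footprint of the weights, in GB."""
    bytes_per_param = bits_per_param / 8
    return params_billion * 1e9 * bytes_per_param / 1e9

vram_gb = 8  # e.g. a laptop RTX 3070
for b in [1.5, 7, 8, 14, 32, 70]:  # distilled R1 parameter counts, in billions
    need = weight_gb(b)
    verdict = "fits" if need < vram_gb else "too big"
    print(f"{b:>5}B ~ {need:4.1f} GB of weights -> {verdict} in {vram_gb} GB VRAM")
```

By this estimate the 7B/8B (and, tightly, 14B) distills fit in 8 GB of VRAM, while 32B and up need workstation-class hardware or lots of unified memory.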

[–] superglue@lemmy.dbzer0.com 3 points 3 weeks ago

So I've been interested in running one locally, but honestly I'm pretty confused about which model I should be using. I have a laptop with a mobile 3070 in it. Which model should I go for?

[–] ReversalHatchery 1 points 2 weeks ago

As I said in my original comment, it's not only VRAM that matters.

I honestly doubt that even gaming laptops can run these models at a usable speed. But even if we add up the people who have such a laptop and those who have a PC powerful enough to run these models, they are a tiny fraction of the world's internet users. It is effectively unavailable to most of the people who might want it; available to some of them, but not nearly all.
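The "usable speed" part can also be sketched numerically. Token generation on local hardware is usually memory-bandwidth bound, so a common rule of thumb is tokens/sec ≈ memory bandwidth / bytes of weights read per token. The bandwidth figures below are illustrative assumptions, not measurements:

```python
# Back-of-envelope decode speed for a local LLM.
# Generation is roughly memory-bandwidth bound: each token requires reading
# (approximately) the whole quantized weight file once.

def tokens_per_sec(weights_gb: float, bandwidth_gb_s: float) -> float:
    """Upper-bound estimate of generation speed in tokens per second."""
    return bandwidth_gb_s / weights_gb

weights = 7.0  # e.g. a ~14B model quantized to 4-bit -> ~7 GB of weights
for name, bw in [("discrete laptop GPU (~400 GB/s, assumed)", 400),
                 ("dual-channel DDR4 system RAM (~50 GB/s, assumed)", 50)]:
    print(f"{name}: ~{tokens_per_sec(weights, bw):.0f} tokens/sec")
```

The gap between the two lines is the point: a model that fits in VRAM can be pleasant to use, while the same model spilling into system RAM may generate only a handful of tokens per second.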

[–] thingsiplay 5 points 3 weeks ago (1 children)

Is it Open Source? I cannot find the source code. The official repository https://github.com/deepseek-ai/DeepSeek-R1 only contains images, a PDF file, and links to download the model. But I don't see any code. What exactly is Open Source here?

[–] deathbird@mander.xyz 5 points 3 weeks ago (1 children)

I don't see the source either. Fair cop.

[–] thingsiplay 5 points 3 weeks ago (1 children)

Thanks for the confirmation. I made a top-level comment too, because this important information gets lost in the comment hierarchy here.

[–] Hotzilla@sopuli.xyz 2 points 2 weeks ago* (last edited 2 weeks ago)

"Open source" is in general the wrong term for all of these "open source" LLMs (like LLaMA and R1): the model weights are shared, but there is no real way of reproducing the model, because the training data is never shared.

In my mind, open source means that you can reproduce the same binary from the source. These models are shared for free, but they are not "open".

[–] azron@lemmy.ml 3 points 3 weeks ago* (last edited 3 weeks ago)

Downvotes be damned, you are right to call out the parent; they clearly don't articulate their point in a way that shows they actually understand what is going on, and how an open-source model can still have privacy implications if the masses use the company's hosted version.

[–] v_krishna@lemmy.ml 3 points 3 weeks ago (1 children)

Obviously you need lots of GPUs to run large deep learning models. I don't see how that's a fault of the developers and researchers, it's just a fact of this technology.

[–] ReversalHatchery 1 points 2 weeks ago

And that is not what I was complaining about.

[–] lily33@lemm.ee 2 points 3 weeks ago (1 children)

There are already other providers, like Deepinfra, offering DeepSeek. So while the average person (like me) couldn't run it themselves, they do have alternative options.

[–] ReversalHatchery 1 points 2 weeks ago

Which probably also collects and keeps everything you say in the chat. Just look at uBlock Origin's expanded view to see their approach to privacy: have a look at all the stuff they push to your browser.