this post was submitted on 20 Feb 2024
83 points (100.0% liked)

Privacy

795 readers
57 users here now

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

Some Rules

Related communities

Many thanks to @gary_host_laptop for the logo design :)

founded 5 years ago
MODERATORS
top 12 comments
[–] ExtremeDullard@lemmy.sdf.org 36 points 11 months ago

And this is a surprise how?

The entire digital economy is based on spying. It's called corporate surveillance and it's been around for 25 years. Why would AI escape this business model? If anything, it turbocharges it.

[–] DmMacniel@feddit.de 9 points 11 months ago (1 children)

I'm shooked I say. shooked!

[–] EveryMuffinIsNowEncrypted@lemmy.blahaj.zone 3 points 11 months ago (1 children)
[–] DmMacniel@feddit.de 3 points 11 months ago* (last edited 11 months ago) (1 children)
[–] EveryMuffinIsNowEncrypted@lemmy.blahaj.zone 4 points 11 months ago* (last edited 11 months ago)

I was continuing what I thought was a Futurama reference you were making. Lol.

[–] const_void@lemmy.ml 8 points 11 months ago

Given how hard they've been pushing Copilot/Bing Chat/etc., I'm not surprised.

[–] sub_ubi@lemmy.ml 4 points 11 months ago (2 children)

As a bad Python scripter, I'm stuck using Microsoft's AI because there isn't a privacy-focused alternative anywhere near as good.

[–] vfosnar 4 points 11 months ago

Don't overuse AI; there are plenty of resources on the web, and at least you can practice reading docs. Use Phind. https://www.phind.com/privacy

[–] swordsmanluke@programming.dev 2 points 11 months ago (1 children)

It's not as good, but running small LLMs locally can work. I've been messing around with ollama, which makes it drop-dead simple to try out different models locally.

You won't be running any model as powerful as ChatGPT - but for quick "stack overflow replacement" style of questions I find it's usually good enough.
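For anyone curious how simple it is: ollama serves a local HTTP API on port 11434 by default, so you can query it from a few lines of Python. This is a minimal sketch, assuming the ollama server is running and you've already pulled a model (the model name `llama3.2` here is just an example):

```python
import json
import urllib.request

# ollama's default local endpoint for one-shot completions
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.2"):
    # stream=False asks ollama to return the whole answer in a single JSON object
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="llama3.2"):
    """Send a prompt to a locally running ollama server and return its reply text."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# e.g. print(ask_ollama("How do I reverse a list in Python?"))
```

Nothing leaves your machine: the request goes to localhost, which is the whole point versus Copilot/Bing Chat.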

And before you write off the idea of local models completely, some recent studies indicate that our current models could be made orders of magnitude smaller for the same level of capability. Think Moore's law, but for shrinking the required connections within a model. I do believe we'll be able to run GPT-3.5-level models on consumer-grade hardware in the very near future. (Of course, by then GPT-7 may be running the world, but we live in hope.)

[–] Facebones@reddthat.com 1 points 11 months ago
[–] grandel@lemmy.ml 1 points 11 months ago

Isn't that their business model? How else can Windows be offered for "free"?