this post was submitted on 15 Apr 2024
262 points (100.0% liked)

Solarpunk


I found that idea interesting. Will we consider it the norm in the future to have a "firewall" layer between news and ourselves?

I once wrote a short story where the protagonist receives news of a friend's death, but it is intercepted by his AI assistant, which says: "When you have time, there is some emotional news that does not require urgent action, which you will need to digest." I feel it could become the norm.

EDIT: For context, Karpathy is a very famous deep learning researcher who just came back from a two-week break from the internet. I don't think he is talking about politics there, but it applies quite a bit.

EDIT2: I find it interesting that many reactions here are (IMO) missing the point. This is not about shielding oneself from information one may be uncomfortable with, but about tweets specifically designed to elicit reactions, which are becoming a plague on Twitter due to its new incentives. It is about distinguishing between news presented in a neutral way and news presented as "incredibly atrocious crime done to CHILDREN and you are a monster for not caring!". The second feels a lot like an exploit of emotional backdoors, in my opinion.
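A minimal sketch of what such a "firewall" layer could look like, purely for illustration: the marker lists, thresholds, and function names below are all invented, and a real assistant would presumably use a proper classifier rather than keyword matching. The idea is just to triage items by emotional charge versus urgency, deferring the high-emotion, no-action-needed ones.

```python
import re

# Invented word lists for illustration only; a real system would use a
# trained emotion/urgency classifier, not keyword matching.
EMOTIONAL_MARKERS = {"atrocious", "monster", "outrage", "horrifying", "children"}
URGENT_MARKERS = {"evacuate", "recall", "deadline", "warning"}

def score(text, markers):
    """Count how many marker words appear in the text."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return len(words & markers)

def triage(item):
    """Return 'deliver' or 'defer' for a news item."""
    emotional = score(item, EMOTIONAL_MARKERS)
    urgent = score(item, URGENT_MARKERS)
    if emotional >= 2 and urgent == 0:
        return "defer"  # emotionally charged, but no urgent action required
    return "deliver"

inbox = [
    "Incredibly atrocious crime done to children, you monster!",
    "Storm warning: evacuate low-lying areas by tonight",
]
for item in inbox:
    print(triage(item), "-", item)
```

The point of the urgency check is the same as in the short story above: the filter delays the emotional digesting, it does not hide anything that requires action.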

top 28 comments
[–] cro_magnon_gilf@sopuli.xyz 29 points 7 months ago (1 children)

That's why I stick with platforms where hardline communist teenagers can curate what I'm exposed to.

[–] keepthepace@slrpnk.net 6 points 7 months ago

That's the only way.

[–] MonkderDritte@feddit.de 16 points 7 months ago* (last edited 7 months ago) (1 children)

Our mind is built on that "malware". I think it's more accurate to compare brain + knowledge to our immune system: the more samples you have, the better you are armed against mal-information.

[–] keepthepace@slrpnk.net 2 points 7 months ago

This sounds like the theories that prevailed before germ theory, when surgeons and obstetricians would argue that washing their hands was pointless, or even a disservice to the bodies they operated on.

Immune systems still get sick and can be overwhelmed. There is a mental hygiene that needs to exist.

[–] Lemvi@lemmy.sdf.org 13 points 7 months ago* (last edited 7 months ago) (2 children)

I think the right approach would be to learn to deal with any kind of information, rather than to censor anything we might not like hearing.

[–] justinbieber@lemmy.dbzer0.com 2 points 7 months ago

yea, this should be the right approach, but how do you actually put it into practice?

[–] keepthepace@slrpnk.net 2 points 7 months ago (1 children)

We already have tons of filters in place trying to serve us information that we are interested in, that we are knowledgeable enough to digest, that is not spammy, in the correct language, not porn or gore, etc. He is just proposing another interesting dimension. For instance, I follow AI news and news about the Ukraine conflict, but I prefer to keep them separate and not be distracted by one while getting my fill of the other.

The only way I found to do this on Twitter (and now Mastodon) is to devote that account solely to tech news.

[–] Lemvi@lemmy.sdf.org 1 points 7 months ago (1 children)

I don't think he is proposing another dimension, but rather another scale. As you already said, we already filter the information that reaches us.

He seems to take this idea of filtering/censorship to an extreme. Where I see filtering mostly as a matter of convenience, he portrays information as a threat that people need to be protected from. He implies that being presented with information that challenges your world view is something bad, and I disagree with that.

I am not saying that filtering is bad. I too have blocked some communities here on Lemmy. I am saying that it is important not to put yourself in a bubble, where every opinion you see is one you agree with, and every news article confirms your beliefs.

[–] keepthepace@slrpnk.net 1 points 7 months ago

Emotion != information

You can know that the Israeli-Palestinian conflict is going on without having pictures of maimed bodies in your news feed. I have actually blocked people I agree with just because they could not stop spamming angrily about it. I also have a militant ecologist friend who thinks saving the planet means pushing the most anxiety-inducing news as much as possible. Blocked.

I don't think that blocking content that focuses on pathos locks us up in a bubble; quite the opposite. Emotions block analysis.

Not really. An executable controlled by an attacker could likely "own" you. A toot, tweet, or comment cannot; it's just an idea or thought that you can accept or reject.

We already distance ourselves from sources of always bad ideas. For example, we're all here instead of on truth social.

[–] Whorehoarder@lemmynsfw.com 7 points 7 months ago (1 children)

Reminds me of Snow Crash by Nealyboi

[–] perestroika@slrpnk.net 6 points 7 months ago* (last edited 7 months ago)

I think most people already have this firewall installed, and it's working too well - they're absorbing minimal information that contradicts their self-image or world view. :) Scammers just know how to bypass the firewall. :)

[–] theneverfox@pawb.social 5 points 7 months ago

I remember watching a video from a psychiatrist with Eastern monk training. He was explaining why yogis spend decades meditating in remote caves; he said it was to control their exposure to information and stimuli.

Ideas are like seeds: once they take root, they grow. You can weed out unwanted ones, but it takes time and mental energy. It pulls at your attention and keeps you from functioning at your best.

The concept really spoke to me. It's easier to consciously control your environment than it is to consciously control your thoughts and emotions.

[–] DaseinPickle@leminal.space 4 points 7 months ago (1 children)

I mean, this is just called censorship. We censor things for kids and all kinds of people in our lives all the time. We censor things for ourselves when we don’t feel like reading the news or opening a text from a specific person. This is not some novel concept.

[–] keepthepace@slrpnk.net 5 points 7 months ago

Not really. This is user-controlled filtering. Censorship is done to push a specific worldview onto its victims; filtering is something we already do all the time, for spam for instance.

[–] x_cell@slrpnk.net 4 points 7 months ago

In a way, the job of a teacher or journalist is to filter useful and/or relevant information for interested parties.

[–] gayhitler420@lemm.ee 4 points 7 months ago

you already have that firewall. it's your experiences and human connections, your understanding of media, your personal history and learning and the feelings you experience.

you don't need a firewall to keep you from being manipulated, you need to learn to fucking read and think and feel. to learn and question, to develop trusted friends and family you can talk to.

if it feels like your emotional backdoors are being exploited then maybe you're thinking or behaving like a monster and your mind is revolting against itself.

[–] Fizz@lemmy.nz 4 points 7 months ago (1 children)

We already have a firewall: it's our thoughts. The information can nudge us, but it's fighting an uphill battle against everything we already know and believe.

[–] keepthepace@slrpnk.net 1 points 7 months ago

Your thoughts (I guess you mean your past knowledge, intelligence, and critical thinking) allow you to dismiss lies, but they do not shield you from the emotional charge of some news.

[–] ondoyant 3 points 7 months ago

i have a general distaste for the mind/computer analogy. no, tweets aren't like malware, because language isn't like code. our brains were not shaped by the same forces that computers are, they aren't directly comparable structures that we can transpose risks onto. computer scientists don't have special insight into how human societies work because they understand linear algebra and network theory, in the same way that psychologists and neurologists don't have special insight into machine learning because they know how the various regions of the human brain interact to form a coherent individual mind, or the neural circuits that go into sensory processing.

i personally think that trying to solve social problems with technological solutions is folly. computers, their systems, the decisions they make, are not by nature less vulnerable to bias than we are. in fact, the kind of math that governs automated curation algorithms happens to be pretty good at reproducing and amplifying existing social biases. relying on automated systems to do the work of curation for us isn't some kind of solution to the problems that exist on twitter and elsewhere, it is explicitly part of the problem.

twitter isn't giving you "direct, untrusted" information. it's giving you information served by a curation algorithm designed to maximize whatever it is twitter's programmers have built, and those programmers might not even be accurately identifying what it is that they're maximizing for. assuming that we can make a "firewall" that maximizes for neutrality or objectivity is, to my mind, no less problematic than the systems that already exist, because it makes the same assumption: that we can build computational systems that reliably and robustly curate human social networks in ways that are provably beneficial, "neutral", or unbiased. that just isn't a power that computers have, nor is it something we should want as beings with agency and autonomy. people should have control over how their social networks function, and that control does not come from outsourcing social decisions to black-boxed machine learning algorithms controlled by corporate interests.

[–] rickyrigatoni@lemm.ee 3 points 7 months ago

Do we have an iamverysmart community yet?

[–] xxd@discuss.tchncs.de 3 points 7 months ago

Leaving aside the dystopian echo chamber this could result in, you could argue that it would help a lot with fake news. Fake news is easier to spread and more present than ever. And for every person there is probably that one piece of news that is just believable enough not to question. And then the next just-believable piece of news. And another. I believe no one is immune to being influenced by fake stories, maybe even radicalized if they are targeted just right. A firewall that just filtered out everything non-factual would already prevent so much societal damage, I think.

[–] bloodfart@lemmy.ml 3 points 7 months ago

We already have a firewall layer between outside information and ourselves, it’s called the ego, superego, our morals, ethics and comprehension of our membership in groups, our existing views and values. The sum of our experiences up till now!

Lay off the Stephenson and Gibson. Try some Tolstoy or Steinbeck.

[–] uriel238@lemmy.blahaj.zone 2 points 7 months ago

I look forward to factchecker services that interface right into the browser or OS, and immediately recognize and flag comments that might be false or misleading. Some may provide links to deep dives where it's complicated and you might want to know more.

[–] Kolanaki@yiffit.net 2 points 7 months ago

I've thought about this since seeing Ghost in the Shell as a kid. If direct neural interfaces become commonplace, the threat of hacking expands beyond simply stealing financial information or blackmail material; attackers may be able to control your entire body!

[–] Desmond373@slrpnk.net 2 points 7 months ago

People are thinking of the firewall here as something external. You can do this without outside help.

Who is this source? Why are they telling me this? How do they know this? What information might they be omitting?

From that point you have enough information to judge for yourself what a piece of information is worth.

[–] intensely_human@lemm.ee 1 points 7 months ago

Having a will means choosing what to do. Denying the existence of a person’s will is dehumanizing.