this post was submitted on 22 Dec 2023
201 points (100.0% liked)

Technology


A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.


[–] frog 139 points 11 months ago (1 children)

It is true that removing and demonetising Nazi content wouldn't make the problem of Nazis go away. It would just be moved to dark corners of the internet where the majority of people would never find it, and its presence on dodgy-looking websites combined with its absence on major platforms would contribute to a general sense that being a Nazi isn't something that's accepted in wider society. Even without entirely making the problem go away, the problem is substantially reduced when it isn't normalised.

[–] alyaza 86 points 11 months ago* (last edited 11 months ago) (4 children)

the weirdest thing to me is that these guys always ignore that banning the freaks worked on Reddit--which is stereotypically the most cringe techno-libertarian platform of the lot--without ruining the right to say goofy shit there. they banned a bunch of the reactionary subs and, spoiler, issues with those communities have lessened considerably since, while people can still say patently wild, unpopular shit

[–] frog 39 points 11 months ago

Yep! Reddit is still pretty awful in many respects (and I only even bother with it for specific communities for which I haven't found a suitable active equivalent on Lemmy - more frogs and bugs on Lemmy please), but it did get notably less unpleasant when the majority of the truly terrible subs were banned. So it does make a difference.

I feel like "don't let perfect be the enemy of good" is apt when it comes to reactionaries and fascists. Completely eliminating hateful ideologies would be perfect, but limiting their reach is still good, and saying "removing their content doesn't make the problem go away" makes it sound like any effort to limit the harm they do is rendered meaningless because the outcome is merely good rather than perfect.

[–] jasory@programming.dev 6 points 11 months ago (1 children)

You're literally on a platform that was created to harbor extremist groups. Look at who Dessalines (aka u/parentis-shotgun) is, and their self-proclaimed motivation for writing LemmyNet. When you ban people from a website, they just move to another place; they're not stupid, and it's pretty easy to create websites. It's purely optics: you're not saving civilisation from harmful ideas, just preventing yourself from seeing them.

[–] alyaza 35 points 11 months ago* (last edited 11 months ago) (7 children)

When you ban people from a website, they just move to another place; they're not stupid, and it's pretty easy to create websites. It's purely optics…

you are literally describing an event that induces the sort of entropy we're talking about here. necessarily, when you ban a community of Nazis or something and they have to go somewhere else, not everybody moves to the next place, and those who don't diffuse back into the general population. that has a deradicalizing effect on them overall, because they're no longer stewing in a cauldron of other people who reinforce their beliefs

[–] jarfil 5 points 11 months ago* (last edited 11 months ago) (2 children)

I'd argue that it still broke Reddit.

Back in the day, I might say something out of line in some subreddit, get the comment flagged, discuss it with a mod, and either agree to edit it or have it removed. No problem.

Then Reddit started banning reactionary subs; subs started using bots to ban people for even commenting on other blacklisted subs; subs started abusing AutoModerator to ban people left and right; even quoting someone in order to criticize them started counting as using the same "forbidden words"; conversations with mods to clear things up pretty much disappeared; and retroactive application of the modern ToS to 10-year-old content became a thing... until I got permabanned from the whole site after trying to appeal a ban, with zero human interaction. Some months later, while already banned sitewide, they also banned me from some more subs.
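For context on the mechanism: here is a minimal sketch, assuming Reddit's PRAW library, of how a third-party ban bot of the kind described above might work. The subreddit names, credentials, and ban reason are hypothetical placeholders, not any actual bot's code.

```python
import praw

# Hypothetical blacklist of subreddits whose participants get banned on sight.
BLACKLISTED_SUBS = {"example_blacklisted_sub"}

reddit = praw.Reddit(
    client_id="...",          # placeholder credentials
    client_secret="...",
    username="...",
    password="...",
    user_agent="banbot-sketch/0.1",
)
home = reddit.subreddit("example_home_sub")  # the sub the bot moderates

# Watch new comments in the moderated sub and ban any author whose
# recent history includes activity in a blacklisted sub.
for comment in home.stream.comments(skip_existing=True):
    author = comment.author
    if author is None:  # deleted accounts have no author object
        continue
    recent = author.comments.new(limit=100)
    if any(c.subreddit.display_name.lower() in BLACKLISTED_SUBS for c in recent):
        home.banned.add(author, ban_reason="participation in a blacklisted subreddit")
```

The point to notice is how blunt the heuristic is: it keys off mere participation, which is exactly how the "banned for even commenting" collateral damage happens.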

Recently Reddit revealed a "hidden karma" feature to let automod pre-moderate potentially disruptive users.

Issues with the communities may have lessened, but the ability to say goofy, wild, or unpopular stuff, or in some cases even to criticize it, is definitely gone. There have also been an unknown number of "collateral damage" bans that Reddit doesn't care about anymore.

[–] alyaza 21 points 11 months ago (1 children)

imo if reddit couldn't survive "purging literally its worst elements, which included some of the most vehement bigotry and abhorrent content outside of 4chan" it probably doesn't deserve to survive

[–] circuitsunfish@plesiosaur.net 6 points 11 months ago (6 children)

@jarfil @alyaza i have said plenty of wild stuff and haven't been banned from any subs? None of it has been bigoted tho

[–] ursakhiin 69 points 11 months ago (1 children)

Not gonna lie. I've never heard of Substack but I appreciate their stance of publicly announcing why I would continue to avoid them.

[–] jmp242@sopuli.xyz 7 points 11 months ago

My only interaction with Substack is that one podcast moved there for premium content. I thought it was mostly for written newsletters, and I always wondered how much of a market there is for paying for a single newsletter; then again, it's basically the written version of a podcast, so I suppose there is one. Promoting Nazi content, though, gives me a lot of pause.

[–] alyaza 60 points 11 months ago (2 children)

techno-libertarianism strikes again! every few years these guys have to relearn the same lesson: letting the worst scum in politics make use of your website just ensures all the cool people evaporate off it--and Substack does not have that many cool people, or that good a reputation, to begin with.

[–] Kichae@lemmy.ca 17 points 11 months ago (1 children)

They just really, really love running Nazi bars. They just don't like it when the normies realize that the neighbourhood bar is a Nazi bar.

[–] alyaza 10 points 11 months ago (2 children)

i go back and forth on how much of this tendency's willingness to host content like this and/or go to the mat for it is genuine agreement, and how much is just stupidity or ill-conceived ideology. a lot of these guys seem like they agree with elements of fascism, but a lot of them are also... just not smart.

[–] Kichae@lemmy.ca 8 points 11 months ago (1 children)

The Nazi bar analogy says nothing about agreement. Just that failing to remove Nazis from your bar is a great way to flood your bar with Nazis, because once they know it's safe for them to Nazi it up in your establishment, they'll tell their friends about you.

If you don't proactively remove the Nazis, you're creating a Nazi safe space, whether you agree with them or not.

[–] alyaza 3 points 11 months ago* (last edited 11 months ago)

i am familiar with the analogy, but i think it would obviously be worse if they agree with what they're platforming, instead of just being kind of half-baked morons who don't have good political positions, or cynically platforming it because it makes them money. the latter can, in effect, be remedied with social or financial consequences; the former would be a manifestation of a much more serious social problem that cannot be immediately dealt with

[–] Omega_Haxors@lemmy.ml 47 points 11 months ago* (last edited 11 months ago)

Translation: "We support Nazis and would like to offer them passive protection. If you have a problem with them, we will ban you"

[–] dubteedub 35 points 11 months ago

Any writers still on Substack need to immediately look at alternative options and shift their audiences to other platforms. Sticking around on a site whose founder straight-up condones neo-Nazis, and not only gives them a platform but profit-shares with them and their Nazi subscribers, is insane.

[–] maynarkh@feddit.nl 32 points 11 months ago (1 children)

If they plan to do business in the EU, this is illegal.

[–] some_guy@lemmy.sdf.org 32 points 11 months ago

Reading about this at work the other day, I announced to my coworkers that Substack is officially bad. Profiting off of nazi propaganda is bad. Fuck Substack.

I had recently subscribed to the RSS feed for The Friendly Atheist and was considering monetary support. They accept via Substack or Patreon. I would have opted for Patreon anyway, because that's where I already have subscriptions. But after learning about this, I'll never support anything, no matter what, via Substack. Eat my ass, shitheads.

[–] AaronMaria@lemmy.ml 30 points 11 months ago

What do you mean, banning doesn't work? The less reach those Nazis have, the fewer people see their Nazi posts and get turned into Nazis. It also needs to be clear that being a Nazi is not acceptable, so they don't have the courage to spread their hate. This bullshit needs to stop.

[–] sculd 21 points 11 months ago

Nope, never supporting anything from Substack again. "Freeze peach" libertarians can go to hell.

[–] cupcakezealot@lemmy.blahaj.zone 16 points 11 months ago* (last edited 11 months ago) (1 children)

if you say nazi and white supremacist content is just a "different point of view", you support nazi and white supremacist content. period.

and it's not surprising, given lulu meservey's post on twitter during the whole situation with elon basically abandoning moderation:

"Substack is hiring! If you’re a Twitter employee who’s considering resigning because you’re worried about Elon Musk pushing for less regulated speech… please do not come work here."

https://www.inverse.com/input/culture/substack-hiring-elon-musk-tweet

[–] Drewski@lemmy.sdf.org 4 points 11 months ago (1 children)

The problem is that some people are quick to call things Nazi and white supremacist, when it's actually just something they disagree with.

[–] Powerpoint@lemmy.ca 10 points 11 months ago

That's not the problem at all. If you support fascists, then you support Nazis and white supremacy.

[–] janguv@lemmy.dbzer0.com 16 points 11 months ago (5 children)

There are a lot of empirical claims surrounding this topic, and I'm unaware who really has good evidence for them. The Substack guy, e.g., is claiming that banning or demonetising would not "solve the problem" – how do we really know? At the very least, you'd think that demonetising helps to some extent, because if it's not profitable to spread certain racist ideas, there's simply less of an incentive. On the other hand, plenty of people in this thread are suggesting it does help address the problem, pointing to Reddit and other cases – but I don't think anyone really has a grip on the empirical relationship between banning/demonetising, shifting ideologues to darker corners of the internet, and what impact their ideas ultimately have. And you'd think the relationship wouldn't be straightforward either – there might be some general patterns, but it could vary according to many contingent and contextual factors.

[–] Lowbird 13 points 11 months ago (3 children)

I agree it's murky. Though I'd like to note that when you shift hateful ideologues to dark corners of the internet, that also means making space in the main forums for people who would otherwise be forced out by the aforementioned ideologues - women, trans folks, BIPOC folks, anyone who would like to discuss xyz topic but not at the cost of the distress that results from sharing a space with hateful actors.

When the worst of the internet is given free rein to run rampant, it tends to take over the space entirely with hate speech, because everything and everyone else leaves rather than put up with abuse. Those who do stay get stuck having the same rock-bottom conversations (e.g. ones in which the targets of the hate are repeatedly asked to justify their existence, their presence, or their right to have opinions) over and over with people who aren't really interested in intellectual discussion, solving actual problems, or making art that isn't about hatred.

But yeah, as with anything involving large groups of people, these things get complicated and can be unpredictable.

[–] Penguincoder 11 points 11 months ago

The problem when you own a space is that if you let certain groups of people in, such as, in this example, Nazis, you'll literally drive everyone else away, so that what started off as a normal, ordinary space becomes, essentially, a Nazi bar.

It's not only Nazis: it can be fascists, white supremacists, meth-heads, PUAs, cryptocurrency fanboys. Some groups are so odious to others that they will drive everyone else from your space. The only solution you can enact is to ensure they don't come to your place at all, even if they're nice and polite and "follow your rules", because while they might, their friends won't, and those friends have a history of driving other people away from other spaces.

"you have to nip it in the bud immediately. These guys come in and it's always a nice, polite one. And you serve them because you don't want to cause a scene. And then they become a regular and after awhile they bring a friend. And that dude is cool too.

And then THEY bring friends and the friends bring friends and they stop being cool and then you realize, oh shit, this is a Nazi bar now. And it's too late because they're entrenched and if you try to kick them out, they cause a PROBLEM. So you have to shut them down.

[–] thesmokingman@programming.dev 8 points 11 months ago

What evidence did you find to support Substack’s claims? They didn’t share any.

You can quickly and easily find good evidence for things like Reddit quarantining and the banning of folks like Alex Jones and Milo Yiannopoulos.

Which claims are empirical again?

[–] garrett@infosec.pub 14 points 11 months ago (1 children)

I always hate policy talk that tries to split hairs between Nazism and “calls for violence”.

Even worse, I just can't understand allowing monetization. If you truly “hate the views”, stop lining your pockets with their money…

[–] Kichae@lemmy.ca 12 points 11 months ago

The only thing they hate is not taking their money.

[–] ArugulaZ@kbin.social 13 points 11 months ago

There are too many of these goddamned social networks anyway. After Twitter/X exploded, everyone else wanted to grab a piece of that pie, and now we've got a dozen social networks nobody uses.

If you want a progressive social network that doesn't take shit from goosesteppers, Cohost is probably the place to go. It's so neurodivergent and trans-friendly that I can't imagine them blithely accepting Nazi content. It's just not how Cohost works. "Blah blah blah, free speech!" Not here, chumps. We've got standards. Go somewhere else to push that poison.

[–] Zworf 8 points 11 months ago (4 children)

Substack started so well... It was looking like the new Medium (after Medium totally enshittified). But discovery there was never very good, and now this. Nope. Not going to blog there.

I wonder if Snowden still supports them.

[–] drwho 5 points 11 months ago

Does nobody remember the exact same dust-up the day after Substack launched?

[–] Plume 4 points 11 months ago

So they're complicit in it, plain and simple. If you're complicit with Nazis, to me, you're a Nazi. I don't give a shit. What's that saying the Germans have? If six guys are sitting at a table in a bar and one of them is a Nazi, then there are six Nazis at the table? Yeah, that.

[–] autotldr@lemmings.world 3 points 11 months ago

🤖 I'm a bot that provides automatic summaries for articles:

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance of taking a hands-off approach to moderation.

In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions.

“We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said.

In a 2020 letter from Substack leaders, including Best and McKenzie, the company wrote, “We just disagree with those who would seek to tightly constrain the bounds of acceptable discourse.”

The Atlantic also pointed out an episode of McKenzie’s podcast with a guest, Richard Hanania, who has published racist views under a pseudonym.

McKenzie does, however, cite another Substack author who describes its approach to extremism as one that is “working the best.” What it’s being compared to, or by what measure, is left up to the reader’s interpretation.


Saved 57% of original text.

[–] jarfil 3 points 11 months ago* (last edited 11 months ago) (1 children)

As always, there are several different aspects:

  • Promoting [Nazi propaganda and misinformation]
  • Laughing at [...]
  • Analyzing [...]
  • Profiting from [...]

Sometimes the difference between "promotion", "laughing at", and "analysis" depends more on the reader's approach than on the writer's intent.

Then again, sometimes a reader decides they don't want to deal with any of it, which is also respectable.

[–] Butterbee 16 points 11 months ago (18 children)

Look. If there are 9 people at a table sitting with 1 Nazi and hanging out, there are 10 Nazis.
