this post was submitted on 12 Jun 2023
4 points (100.0% liked)
Fediverse
287 readers
1 users here now
This magazine is dedicated to discussions on the federated social networking ecosystem, which includes decentralized and open-source social media platforms. Whether you are a user, developer, or simply interested in the concept of decentralized social media, this is the place for you. Here you can share your knowledge, ask questions, and engage in discussions on topics such as the benefits and challenges of decentralized social media, new and existing federated platforms, and more. From the latest developments and trends to ethical considerations and the future of federated social media, this category covers a wide range of topics related to the Fediverse.
founded 2 years ago
you are viewing a single comment's thread
view the rest of the comments
Centralization has another aspect that is simultaneously good and bad: it makes it easy to remove offensive content and problematic users. A centralized approach makes it very easy to remove cancerous people, groups, and content, while a decentralized approach makes that far harder. But in a centralized system, who defines what counts as cancerous content? Reddit did a good job of removing racist content, for instance (or, going back farther, it removed the 'jailbait' and 'creepshots' communities, which were producing content right on the line of being obscene). But it also took a "both sides are bad" approach when it came to literal Nazis vs. antifascists.
I'm a Reddit refugee, so it's going to take me a while to learn to navigate this. And yeah, I've been kicked off Twitter, so Mastodon was already on my radar.
I believe disassociating from Nazis, CSAM, etc. is still very possible in a distributed network like this: the instance admin just blacklists the instances they don't want to interact with. But it requires the user to find the server that best aligns with what they want to see; a centralized admin won't do it for them.
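Conceptually, that kind of instance-level blacklisting boils down to checking the origin domain of incoming federated activity against an admin-maintained blocklist. Here is a minimal sketch of the idea (the names, domains, and function are all hypothetical illustrations, not any real fediverse server's actual implementation):

```python
from urllib.parse import urlparse

# Hypothetical admin-maintained blocklist of defederated instances.
BLOCKED_INSTANCES = {"spam.example", "badactors.example"}

def origin_domain(actor_uri: str) -> str:
    """Extract the host part of an ActivityPub actor URI."""
    return urlparse(actor_uri).hostname or ""

def accept_activity(actor_uri: str) -> bool:
    """Drop any incoming activity whose actor is on a blocked instance."""
    return origin_domain(actor_uri) not in BLOCKED_INSTANCES

print(accept_activity("https://mastodon.social/users/alice"))  # accepted
print(accept_activity("https://spam.example/users/bot"))       # rejected
```

Real servers layer more on top of this (per-user blocks, silencing vs. full suspension, media rejection), but the core mechanism is exactly this per-instance filter, applied by each admin independently rather than by one central authority.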