
Sorry for the short post; I'm not able to make it nice with full context at the moment, but I want to get this announcement out quickly to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.

It will not be possible to upload any new avatars or banners while this limit is in effect.

I'm really sorry for the disruption; it's a necessary trade-off for now until we figure out the way forward.

[–] iByteABit@lemm.ee 14 points 1 year ago (6 children)

This is a very good decision; I've worried about this problem ever since I first learned about the Fediverse. Research definitely needs to be done to find CSAM detection tools that integrate with Lemmy - perhaps we could make a separate bridge repo that integrates a tool like that easily into the codebase.
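
For example, a minimal sketch of the kind of check such a tool could run at upload time, matching perceptual hashes against a blocklist (the `HASH_BLOCKLIST` set, `MAX_DISTANCE` threshold, and simple average-hash approach here are all illustrative placeholders - real deployments match against vetted hash databases, e.g. via PhotoDNA, which are not publicly distributed):

```python
# Illustrative sketch only: a perceptual-hash check that could run at upload time.
# HASH_BLOCKLIST and MAX_DISTANCE are hypothetical; a real system would query a
# vetted hash database rather than a local set.
from PIL import Image

HASH_BLOCKLIST: set[int] = set()  # would be populated from a vetted hash source
MAX_DISTANCE = 5                  # max Hamming distance to count as a match

def average_hash(path: str) -> int:
    """Compute a simple 64-bit average hash of an image."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > avg else 0)
    return bits

def is_flagged(path: str) -> bool:
    """Return True if the image is within MAX_DISTANCE of any blocklisted hash."""
    h = average_hash(path)
    return any(bin(h ^ bad).count("1") <= MAX_DISTANCE for bad in HASH_BLOCKLIST)
```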

I hope every disgusting creature that uploads that shit gets locked up

[–] OverfedRaccoon@lemm.ee 4 points 1 year ago (5 children)

A user posted a Python tool they had already been working on that automates detection and deletion of potential CSAM on Lemmy servers, which admins could use until better tools come along. Unfortunately, I don't remember who posted it or where I saw it in my travels yesterday.
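
A minimal sketch of what a scan like that could look like - the paths and the `classify_image` stub here are placeholders, not the actual tool:

```python
# Hypothetical sketch of an admin-side scan over a pict-rs media directory.
# MEDIA_DIR, QUARANTINE_DIR, and classify_image are placeholders; a real
# deployment would plug in a detection model or hash-matching service, and
# real handling of confirmed material involves reporting, not just moving files.
from pathlib import Path
import shutil

MEDIA_DIR = Path("/var/lib/pictrs/files")        # assumed storage location
QUARANTINE_DIR = Path("/var/lib/pictrs/quarantine")
IMAGE_SUFFIXES = {".png", ".jpg", ".jpeg", ".gif", ".webp"}

def classify_image(path: Path) -> bool:
    """Placeholder: return True if the image is flagged as potential CSAM."""
    return False  # stub; a real implementation calls a detector here

def scan_and_quarantine() -> int:
    """Move flagged images out of the served directory; return the count."""
    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    flagged = 0
    for path in MEDIA_DIR.rglob("*"):
        if path.suffix.lower() in IMAGE_SUFFIXES and classify_image(path):
            shutil.move(str(path), str(QUARANTINE_DIR / path.name))
            flagged += 1
    return flagged

if __name__ == "__main__":
    print(f"Quarantined {scan_and_quarantine()} file(s)")
```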

[–] quinacridone@lemmy.ml 4 points 1 year ago (2 children)
[–] OverfedRaccoon@lemm.ee 4 points 1 year ago (1 children)
[–] quinacridone@lemmy.ml 4 points 1 year ago

You're welcome
