this post was submitted on 01 Sep 2023
580 points (100.0% liked)
Memes
you are viewing a single comment's thread
On feddit.de, lemmy.world is only temporarily defederated because of CSAM, until a patch is merged into Lemmy that prevents images from being downloaded to your own instance.
So I'll just be patient and wait. It's understandable that the admins don't want to run into problems with law enforcement.
Makes quite a bit of sense
Depending on the jurisdiction, it can be pretty hairy if your instance downloads it.
IANAL, but I'm pretty sure that in the US you have a "duty to report", and you can have legal protections if you end up receiving it and then reporting it.
But IANAL, so I'd recommend looking into it with an actual lawyer if you run a website that hosts content.
Won't that lead to some horrible hug-of-death type scenarios if a post from a small instance gets popular on a huge one?
Yes, but arguably it was never very scalable for federated software to store large media. It gets utterly massive very quickly. Third-party image/video hosts that specialize in hosting those things can do a better job. And honestly, that's the kind of data that is just better suited to centralization. Many people can afford to spin up a server that mostly just stores text and handles basic interactions. Large images or streaming video get expensive fast, especially if the site were ever to get even remotely close to reddit levels.
If you're only responsible for caching for your own users, you don't unduly burden smaller instances.
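To make the "cache only for your own users" idea concrete, here's a minimal sketch (not Lemmy's actual code) of an instance-side media proxy with a size-capped LRU cache. The Flask app, the /proxy route, and the 256 MiB limit are assumptions made up for illustration.

```python
# Sketch of the "each instance caches remote media for its own users" idea.
# Not Lemmy's implementation; route name, cache size and eviction policy
# are illustrative assumptions only.
from collections import OrderedDict
from urllib.request import urlopen

from flask import Flask, Response, abort, request

app = Flask(__name__)

MAX_CACHE_BYTES = 256 * 1024 * 1024          # cap the local media cache at 256 MiB
cache: "OrderedDict[str, bytes]" = OrderedDict()
cache_size = 0


def cache_put(url: str, data: bytes) -> None:
    """Insert into the LRU cache, evicting the oldest entries past the cap."""
    global cache_size
    cache[url] = data
    cache_size += len(data)
    while cache_size > MAX_CACHE_BYTES:
        _, evicted = cache.popitem(last=False)   # drop least recently used
        cache_size -= len(evicted)


@app.route("/proxy")
def proxy_image():
    url = request.args.get("url", "")
    # A real deployment would restrict this to hosts the instance federates
    # with; otherwise the proxy is an open SSRF hole.
    if not url.startswith("https://"):
        abort(400)
    if url in cache:
        cache.move_to_end(url)                   # mark as recently used
        return Response(cache[url], mimetype="application/octet-stream")
    with urlopen(url, timeout=10) as resp:       # fetch once from the origin instance
        data = resp.read()
    cache_put(url, data)
    # A real proxy would preserve the upstream Content-Type instead.
    return Response(data, mimetype="application/octet-stream")


if __name__ == "__main__":
    app.run(port=8080)
```

The point is that each instance re-fetches a given image from the origin at most once (until it's evicted), so a small origin instance doesn't get hammered when one of its posts goes viral on a big instance, and nobody has to mirror every upload forever.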
We need more decentralization: a federated image/gif host with CSAM protections.
How would one implement CSAM protection? You'd need actual ML to check for it, and I don't think there are trained models available. And good luck finding someone willing to train such a model. Also, running an ML model would be quite expensive in energy and hardware.
There are models for detecting adult material, idk how well they'd work on CSAM though. Additionally, there exists a hash identification system for known images; idk if it's available to the public, but I know Apple has one.
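To illustrate the hash-matching idea, the sketch below compares uploads against a list of known-bad image hashes. The real systems (Microsoft's PhotoDNA, Apple's NeuralHash, hash lists distributed via organisations like NCMEC) use proprietary algorithms and non-public databases, so this stand-in uses an ordinary open-source perceptual hash, and the blocklist file and distance threshold are made-up placeholders.

```python
# Illustrative only: perceptual-hash matching against a blocklist of known
# images. Real CSAM hash lists (e.g. via NCMEC) are not public, and PhotoDNA/
# NeuralHash are proprietary; "blocklist.txt" and the threshold are made up.
from PIL import Image
import imagehash   # pip install imagehash pillow

HAMMING_THRESHOLD = 5   # assumed tolerance for near-duplicate matches


def load_blocklist(path: str) -> list[imagehash.ImageHash]:
    """Read one hex-encoded perceptual hash per line."""
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]


def is_blocked(image_path: str, blocklist: list[imagehash.ImageHash]) -> bool:
    """Hash the upload and compare it against every known-bad hash."""
    h = imagehash.phash(Image.open(image_path))
    # Subtracting two ImageHash objects gives their Hamming distance.
    return any(h - bad <= HAMMING_THRESHOLD for bad in blocklist)


if __name__ == "__main__":
    blocklist = load_blocklist("blocklist.txt")
    print(is_blocked("upload.jpg", blocklist))
```

A classifier for adult material would only help with previously unseen images; matching against hashes of known material is the cheaper and more established first step, which is presumably why those hash databases exist in the first place.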
Idk, but we gotta figure out something
Feddit is defederated from so many instances that it's practically unusable for me.