The point of CSAM scanners is not to protect children, but to circumvent due process by expanding warrantless surveillance. That is antithetical to FOSS.
So, in a word, no.
So you like child porn? I want a way to block bad content from being received and displayed.
You have to rely on a third party that provides the hashes to identify images, and that is a business model. Or you could build a hash database yourself (i.e., collect the material yourself), which will get you into all kinds of legal trouble. Or you write your own algorithm to do the work and burn through a huge number of GPU hours (welcome back to a business model).
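The hash-list approach described above boils down to a set lookup: hash every upload and check it against a database of known-bad digests supplied by a trusted third party. A minimal sketch, assuming a plain-text file of hex digests (note that real services use perceptual hashes such as PhotoDNA, so re-encoded or resized copies still match; a cryptographic hash only catches byte-identical files):

```python
import hashlib

def load_hash_db(path):
    """Load a set of known-bad image hashes, one hex digest per line.
    The hash list itself has to come from a trusted third party."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def is_flagged(image_bytes, hash_db):
    """Exact-match check: hash the upload and look it up in the database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in hash_db
```

This is why the third party is unavoidable in practice: the lookup is trivial, but legally sourcing and maintaining the hash database is the hard (and monetised) part.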
There is a tool that someone built specifically to scan images uploaded to Lemmy for CSAM.
It is really quite clever. The image is put through an ML/AI model, which describes it (image to text); the text is then checked against a set of rules to see if it has the hallmarks of CSAM. If it does, the image is deleted.
This is fully self hosted.
What I like is that it spares a person the trauma of having to see that sort of thing.
Poor guy who had to define the rules.
you mean the ML model?
I don't think it is too bad, as it is more like looking for a description that involves both children and a sexual context. This can be trained without CSAM, because the model generalises from situations it has seen before: a pornographic picture (sexual context) and kids playing at a playground (children in the scene).