this post was submitted on 25 Oct 2023

Technology

amju_wolf@pawb.social 11 points 1 year ago

I mean, if you train a model on porn with adult actors and on regular photos of children, it shouldn't be hard to generate the combination.

You probably wouldn't even need any fancy training data, but if you really wanted to, you could pick adult actors who look young or otherwise resemble the children to help the process.

barsoap@lemm.ee 3 points 1 year ago

Knowing what a nude adult looks like doesn't mean the model knows what a nude child looks like. I'm quite sure it's easy to generate disturbing images like that, but I don't think actual paedophiles will be satisfied with child faces on small adult bodies.

Ordinary deepfakes actually have a very similar problem: sure, you can take a picture of a celebrity and tell the AI to undress them -- but it won't be their actual body. The AI will be able to approximate their overall build, but the result is a generic adult body, not the celebrity's body. Put differently: AI models aren't any better at undressing people with their eyes than teenagers are.

amju_wolf@pawb.social 3 points 1 year ago

I see where you're coming from but that's a technical issue that will probably be solved in time.

It's also really not black and white; sure, maybe you can tell it isn't perfect, but you'd still prefer it, knowing that no one was actually harmed.

Despite the reputation people like that have (due simply to how reporting works), most are harmless, like you and me; they don't actually want to see innocent people suffer and would never act on their desires. So having a safe and harmless outlet might help.

barsoap@lemm.ee 4 points 1 year ago (last edited 1 year ago)

> I see where you’re coming from but that’s a technical issue that will probably be solved in time.

You cannot create information from nothing.

> So having a safe and harmless outlet might help.

Psychologists/Psychiatrists are still on the fence on that one; I wouldn't be surprised if it depends on the person. And yes, the external harm produced by AI images is definitely lower than that produced by actual CSAM, doubly so than newly produced CSAM, but that doesn't mean that therapy, even in its current early stages, couldn't do even better.

Put differently: we may again be falling into the trap of trying to find technological solutions to societal problems (well, this is /c/technology...). Which isn't to say that we shouldn't care at all about models trained on CSAM, but that's addressing symptoms, not causes. Ultimately addressing root causes is more important: The vast majority of paedophiles are not exclusive paedophiles, often they're not even really attracted to kids at all beyond having developed a fetish, they're rapists focussing on the most vulnerable, often due to having been victims of sexual abuse themselves.

amju_wolf@pawb.social 2 points 1 year ago

> You cannot create information from nothing.

Arguably, that's exactly what generative AIs do. Which is not what you meant, but yeah. I was going more for "given current progress and advancements in how we curate datasets and whatnot, there is no reason to believe that we won't eventually have AI-generated pictures that are 100% indistinguishable".

We already know that you don't need to have stuff in the training dataset to have it show up meaningfully in the output.

> Psychologists/Psychiatrists are still on the fence on that one, I wouldn’t be surprised if it depends on the person. And yes the external harm produced by AI images is definitely lower than that produced from actual CSAM, doubly so newly produced CSAM, but that doesn’t mean that therapy, even in its current early stages, couldn’t do even better.

100% agree there. What I would like to see is more research, but that's currently kinda impossible with CSAM being as criminalized as it is. Which is kinda sad.

Therapy seems to work for most help-seeking people (and there are studies proving that), so this should be a last-ditch effort.

I don't agree with the rest of your post. It isn't really (and definitely not exclusively) a societal problem - some people's brains are simply wired in a way that's just bad, and there isn't much you can do about it; either these people suffer from living with it, or they cause harm to others because of it. Both are bad.

> The vast majority of paedophiles are not exclusive paedophiles, often they’re not even really attracted to kids at all beyond having developed a fetish, they’re rapists focussing on the most vulnerable, often due to having been victims of sexual abuse themselves.

Do you have any statistics proving this? It's exactly this bias that already makes non-offending pedophiles unlikely to seek help. Obviously these kinds of people are the ones you hear most about, but I wouldn't be so sure that they're the majority (even if they account for most of the problem).

My point is that if you take it as people who need help and actually manage to provide it, you should be able to get the amount of abuse down overall, except for the people who truly can't be helped. And it really doesn't matter much how you provide that help, even if it's morally questionable, like using artificially generated CSAM.

barsoap@lemm.ee 1 point 1 year ago

> Do you have any statistics proving this?

All my knowledge about this stuff goes back to, what, 2010, in the wake of this shit. I'm quite sure it was actual medical statistics, though don't ask me where to find them 13 years down the line.

> My point is that if you take it as people who need help and actually manage to provide it

We do actually have a programme specifically for this in Germany. Attempting to have run-of-the-mill psychologists provide that kind of therapy isn't viable: the general issue is an utter lack of rapport when your therapist can't decide whether they'd like to barf or strangle you.

artaxadepressedhorse@lemmyngs.social 2 points 1 year ago

I dunno, have you seen the stats on the popularity of shemale porn? Pretty sure the human brain isn't that picky. It goes: "Boobs? Check. Cock insertion? Check."

barsoap@lemm.ee 2 points 1 year ago

That's a bisexual/bicurious double-whammy, not really comparable.

artaxadepressedhorse@lemmyngs.social 1 point 1 year ago

I don't find men attractive at all and yet shemale porn gives me teh chubs

barsoap@lemm.ee 1 point 1 year ago

I... don't care. Also, you can find cock attractive without being into men. Or only find femboys attractive, but not other men.