this post was submitted on 22 Aug 2023
96 points (100.0% liked)

Technology


A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

 

shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

[–] jordanlund@lemmy.one 26 points 1 year ago (2 children)

You say that NOW, but if people start using your images to generate revenge porn or, you know, really anything you didn't consent to, that's a huge problem.

Both for the people whose images were used to train the model and for the people whose images are generated using the models.

Non-consent is non-consent.

This is how you get the feds involved.

[–] ram@lemmy.ca 36 points 1 year ago (2 children)

Let's not forget that these AIs aren't limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.

[–] PelicanPersuader 14 points 1 year ago

Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that didn't happen, meaning those resources won't be used to save real children in actual danger.

[–] MaggiWuerze@feddit.de 1 points 1 year ago (2 children)

On the other hand, this could be used to create material without causing new suffering. It might reduce the need for actual children to be abused in its production.

[–] ram@lemmy.ca 4 points 1 year ago (4 children)

Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.

[–] MaggiWuerze@feddit.de 1 points 1 year ago (1 children)

Sure they do, but if they're going to consume it anyway, would you rather a real child suffer for it, or just an AI-generated one?

[–] ram@lemmy.ca 1 points 1 year ago (1 children)

Neither. I would have mental health supports that are accessible to them.

[–] tweeks@feddit.nl 2 points 1 year ago

Of course we don't want either, but it comes across as if you're dismissing a possible direction toward a solution in favor of the one that is definitely worse (real-life suffering) out of a purely emotional knee-jerk reaction.

Mental health support is available and real CSAM is still being generated. I'd suggest we look into both options; advancing ways therapists can help and perhaps at least have an open discussion about these sensitive solutions that might feel counter-intuitive at first.

[–] tweeks@feddit.nl 1 points 1 year ago

That's a fair point. And I believe AI is capable of combining legal material to create illegal material. Although this still feels wrong, if it excludes suffering from the source material and reduces future (child) suffering, I'd say we should at least do research on it. Even if it's controversial, we need to look at the rationale behind it.

[–] Evergreen5970 10 points 1 year ago

As someone who personally wouldn't care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn't like me may have the option to generate AI porn of me having sex with a child. Now there's fake "proof" I'm a pedophile, and I get my life ruined for sex I never had, for a violation of consent I never actually committed. Even if I'm vindicated in court, I might still be convicted in the court of public opinion.

And people could post faked porn of me and send it to companies to say "Evergreen5970 is promiscuous, don't hire them." Not all of us have the luxury of picking and choosing between companies depending on whether they match our values; some of us have to take what we can get, and sometimes that includes companies that would judge you for taking nude photos of yourself. It would feel especially bad given I'm a virgin by choice who has never taken nudes, let alone sent them. Punished for something I didn't do.

Not everyone is going to restrict their use to their private wank sessions, to making a real image of the stuff they probably already envision in their imagination. Some will do their best to make its results public with the full intention of using it to do harm.

And once faking abuse with AI porn becomes well-known, it might discredit actual photographic/video proof of CSAM happening. Humans get fooled about whether an image was captured by a camera or generated by AI, and AI detectors don't identify AI-generated images with perfect accuracy either. So the question becomes "how can we trust any image anymore?" Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might tweak their guardrails to prevent people from generating any porn involving minors, but there'll probably always be some models floating around with those guardrails turned off.

I'm also very wary of dismissing other peoples' discomfort just because I don't share it. I'm still worried for people who would care about someone making AI porn of them even if it was just to masturbate with and kept private.