this post was submitted on 22 Aug 2023
96 points (100.0% liked)

Technology


shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

you are viewing a single comment's thread
[–] MaggiWuerze@feddit.de 1 points 1 year ago (2 children)

On the other hand, this could be used to create material without causing new suffering, so it might reduce the need for actual children to be abused in its production.

[–] ram@lemmy.ca 4 points 1 year ago (4 children)

Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.

[–] MaggiWuerze@feddit.de 1 points 1 year ago (1 children)

Sure they do, but if they're going to consume it anyway, would you rather a real child had to suffer for it, or just an AI-generated one?

[–] ram@lemmy.ca 1 points 1 year ago (1 children)

Neither. I would have mental health support that is accessible to them.

[–] tweeks@feddit.nl 2 points 1 year ago

Of course we want neither, but it comes across as if you're dismissing a possible direction toward a solution in favor of the option that is definitely worse (real-life suffering), out of a purely emotional knee-jerk reaction.

Mental health support is already available, and real CSAM is still being produced. I'd suggest we look into both options: advancing the ways therapists can help, and at least having an open discussion about these sensitive solutions that might feel counter-intuitive at first.

[–] tweeks@feddit.nl 1 points 1 year ago

That's a fair point. And I believe AI would be able to combine legal material to create such illegal material. Although this still feels wrong, if it excludes suffering in the source material and reduces future (child) suffering, I'd say we should at least do research on it. Even if it's controversial, we need to look at the rationale behind it.