this post was submitted on 24 Oct 2024
82 points (100.0% liked)

Technology

37735 readers
49 users here now

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

Subcommunities on Beehaw:


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] Letstakealook@lemm.ee 34 points 4 weeks ago (2 children)

What hairnet to this young man is unfortunate, and I know the mother is grieving, but the chatbots did not kill her son. Her negligence around the firearm is more to blame, honestly. Regardless, he was unwell, and this was likely going to surface in one way or another. With more time for therapy and no access to a firearm, he may have been here with us today. I do agree, though, that sexual/romantic chatbots are not for minors. They are for adult weirdos.

[–] Hirom 11 points 4 weeks ago (1 children)

That's a good point, but there's more to this story than a gunshot.

The lawsuit alleges amongst other things this the chatbots are posing are licensed therapist, as real persons, and caused a minor to suffer mental anguish.

A court may consider these accusations and whether the company has any responsibility on everything that happened up to the child's death, regarless of whether they find the company responsible for the death itself or not.

[–] DarkThoughts@fedia.io 1 points 4 weeks ago (1 children)

The bots pose as whatever the creator wants them to pose at. People can create character cards for various platforms such as this one and the LLM with try to behave according to the contextualized description of their provided character card. Some people create "therapists" and so the LLM will write like they're a therapist. And unless the character card specifically says that they're a chatbot / LLM / computer / "AI" / whatever they won't say otherwise, because they don't have any sort of self awareness of what they actually are, they just do text prediction based on the input they've been fed (though. It's not really character.ai or any other LLM service or creator can really change, because this is fundamentally how LLMs work.

[–] Hirom 5 points 4 weeks ago* (last edited 4 weeks ago)

This is why these people ask, among other things, to strictly limit access to adults.

LLM are good with language and can be very convincing characters, especially to children and teenagers, who don't fully understand how these things work, and who are more vulnerable emotionally.

[–] Kissaki 2 points 4 weeks ago (1 children)

They are for adult weirdos.

Where do I sign up?

[–] Sabata11792@ani.social 2 points 3 weeks ago

If she's not running on your hardware, she only loves your for money.