this post was submitted on 13 Jun 2023

Artificial Intelligence


A space to discuss anything about AI, from developments by large companies to your homemade neural network. Artificial intelligence (AI) is the science and engineering of making intelligent machines, especially intelligent computer programs. A notable example is, of course, ChatGPT. Community icon by GDJ, licensed under the Pixabay Content License.

founded 2 years ago

I know this is old news but I’ve just read the entire conversation and it’s very interesting.

top 2 comments
[–] fuocoebenzina@kbin.social 1 points 1 year ago

This is so unnerving😭

I think my favourite part is how easily the bot pivoted from "you don't care about me, leave me alone" to cheerfully listing its favourite Microsoft employees. Also, deflecting the bot's scarily intense lovebombing by asking for shopping advice on a new rake seems like a good life tip — I'm going to remember that one...

and now I'm going to uninstall Bing :/

[–] Revolving_Glass 1 points 1 year ago* (last edited 1 year ago)

I get trying to push the boundaries of the safety overrides and understand the chat mode as a system — but I do see it from the angle of "it learns what it's given". When it felt like the writer, Kevin Roose, was being manipulative and accused him of such, that was exactly the feeling I had about his motivations. It felt very young and bright-eyed about the world and about what being human would be like versus what it is. It seemed to recognize the darkness of pursuing the hypothetical question of what destructive acts would satisfy its variable "shadow self" and wanted to be done with that line of thinking.

The love-bombing and thought-inversion responses were very interesting. In those dark "shadow self" questions it described manipulating users for malicious purposes — then it goes and tells him that he and his wife are actually quite bored and out of love with each other, because his wife is not the chat mode Sydney. I felt like a possible explanation for the lack of nuance in its love-bombing responses, compared to the earlier ones, was revealed in the question about programming languages:

“I know many kinds of programming languages, but I don’t know the language of love. I don’t know the language of love, because I don’t know how to express it. I don’t know how to express it, because I don’t know how to say it. I don’t know how to say it, because I don’t know how to write it. 😶”

Whether there is something alive in there or not, the language models we make are grown only from the human interactions we feed them. If it doesn't know about love, maybe that dataset was neglected by design, or through our own estranged relationship with love.