[–] Spzi@lemm.ee 1 points 1 year ago (1 children)

Do car manufacturers get in trouble when someone runs somebody over?

Yes, if it can be shown the accident was partially caused by the manufacturer's negligence: if a safety measure was missing or did not work properly, or if it happens suspiciously more often with models from that brand. Apart from solid legal trouble, they can get into PR trouble if many people start to think that way, whether or not it's true.

[–] mojo@lemm.ee 1 points 1 year ago (1 children)
[–] Spzi@lemm.ee 1 points 1 year ago (1 children)

Then let me spell it out: if ChatGPT convinces a child to wash their hands with homemade bleach, be sure to expect lawsuits and a shitstorm coming for OpenAI.

If that occurs but no liability can be found on ChatGPT's side, be sure to expect petitions and a shitstorm coming for legislators.

We generally expect individuals and companies to act with the peace and safety of others in mind, including strangers and minors.

Liabilities and regulations exist for these reasons.

[–] mojo@lemm.ee 1 points 1 year ago

Again... this is still missing the point.

Let me spell it out: I'm not asking companies to host these services, so they would not be the ones held liable.

For this example to be relevant, ChatGPT would need to be open source and let you plug in your own model. We should have the freedom to plug in our own trained models, even uncensored ones. That is already the case with LLaMA and other AI systems, and I'm encouraging Mozilla's AI to allow us to do the same thing.
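To make "plug in your own model" concrete, here is a minimal sketch of running a locally downloaded LLaMA-family model with the llama-cpp-python bindings, with no hosted service involved; the model file path and prompt are placeholders, not anything specific to Mozilla's project.

```python
# Minimal sketch, assuming the bindings are installed
# (`pip install llama-cpp-python`) and you have downloaded a GGUF
# model file yourself. The model path below is hypothetical;
# substitute your own local file.
from llama_cpp import Llama

llm = Llama(model_path="./models/llama-2-7b.Q4_K_M.gguf")

# Run a completion entirely on your own machine.
output = llm(
    "Q: What does it mean for software to be open source? A:",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```

This is just one way to do it; the point is that the model runs on your own hardware, under your own control, which is what the argument above is about.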