this post was submitted on 02 Jul 2023
110 points (100.0% liked)

Gaming

 

Interesting decision

[–] millie 46 points 1 year ago (9 children)

I feel like this is less of a big decision and more of a 'duh' sort of situation. To my understanding this isn't saying that all AI art violates copyright, but that AI art which does violate copyright can't be used.

Like if I took a picture of Darth Vader and handed it to NightCafe to fool around with, the result would still belong to Disney. Steam is legally required to act if a valid DMCA notice is sent, and to adhere to the court's ruling in the case of a dispute.

I feel like this is a reassurance that they intend to obey copyright law rather than a restriction of all AI art. Basically they're saying that if you DMCA someone in good faith on the basis of derivative works, they'll play ball.

[–] Dominic 20 points 1 year ago* (last edited 1 year ago) (2 children)

Right, the phrasing is “copyright-infringing AI assets” rather than a much more controversial “all AI assets, due to copyright-infringement concerns.”

I do think there’s a bigger discussion that we need to have about the ethics and legality of AI training and generation. These models can reproduce exact copies of existing works (see: Speak, Memory: An Archaeology of Books Known to ChatGPT/GPT-4).

[–] millie 10 points 1 year ago (2 children)

Sure, but plagiarism isn't unique to LLMs. I could get an AI to reproduce a preexisting work word for word, but that's on my use of the model, not on the LLM itself.

I get the concerns about extrapolating how to create works similar to those made by humans from actual human works, but that's how people learn to make stuff too. We experience art and learn from it in order to enrich our lives, and to progress as artists ourselves.

To me, the power put into the hands of creators to work without the need for corporate interference is well worth the consideration of LLMs learning from the things we're all putting out there in public.

[–] Dominic 5 points 1 year ago (1 children)

That’s a fair point.

In my eyes, the difference is the sheer volume of content that these models rip through in training. It would take many, many lifetimes for a person to read as much as an LLM “reads,” and it’s difficult to tell what an LLM is actually synthesizing versus copying.

Now, does it really matter?

I think the answer comes down to how much power is actually put into the hands of artists rather than the mega-corps. As it stands, the leaders of the AI race are OpenAI/Microsoft, Google, and Meta. If an open LLM comes out (a la Stable Diffusion), then artists do stand to benefit here.

[–] millie 2 points 1 year ago (1 children)

I mean, they are and they aren't. OpenAI, Google, and Meta may control the most popular iterations of LLMs at the moment, but the cat's also kind of out of the bag. If we all lost access to ChatGPT and the other AI tools that depend on it overnight, there'd be a pretty huge incentive to fill that gap.

They control it now because they've filled people's emerging need for LLMs to assist in their workflow. If they try to choke that off as though they own it in a wider sense, they're going to find their power over it turning to ash in their mouths and someone else will take their place.

I'm optimistic that the cracks developing in the authoritarian power structures we're seeing on the internet won't stay limited to there, and LLMs could be a big part of that. Even just Stable Diffusion being open source is massive. I'm sure others will follow, and those big companies, if they're smarter than Elon and Spez, will want to hang onto their relevance as long as possible by not shunting users over to FOSS.

[–] Dominic 1 points 1 year ago

Absolutely true. StableLM is a thing, and although it’s not as good as ChatGPT or even Bard, it’s a huge step. I’m sure there will be better free models in the months and years to come.

[–] TeddyTi@lemmy.blahaj.zone 1 points 1 year ago

It's not unique to LLMs, but the issues are always the same: how to check whether there is plagiarism, and who to blame for it.

[–] interolivary 4 points 1 year ago* (last edited 1 year ago) (1 children)

I just don't see how this is different from "Valve won't publish games that feature copyright-infringing assets" which is probably already true. Does it matter whether a human or an "AI" produced it?

[–] Pseu 3 points 1 year ago (1 children)

Probably not. But there is a pretty widespread belief that images generated by AI cannot possibly be infringing, because the model is somehow inherently transformative.

This is not the case, and Valve reiterating that it is not the case might keep developers who are under the impression above from trying.

[–] Umbrias 1 points 1 year ago

I have mostly seen the exact opposite position: that AI cannot possibly produce anything that isn't copyright-infringing. I can't remember anyone claiming that a given artwork produced by AI could never be copyright infringing, except among, like, cryptobros.
