this post was submitted on 05 Aug 2023
215 points (100.0% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ

top 33 comments
[–] honey_im_meat_grinding@lemmy.blahaj.zone 41 points 1 year ago* (last edited 1 year ago) (2 children)

I sympathize with artists who might lose their income if AI becomes big - as an artist it's something that worries me too - but I don't think applying copyright to data sets is a good thing long term. Think about it: if copyright applies to AI data sets, all that does is one thing - kill open source AI image generation. It'll just be a small thorn in the sides of corporations that want to use AI before eventually handing them monopolies over the largest, most useful AI data sets in the world, while no one else can afford to replicate that. They'll just pay us artists peanuts if anything at all, and use large platforms like Twitter, Facebook, Instagram, Artstation, and others who can change the terms of service to say any artist allows their uploaded art to be used for AI training - with an opt out hidden deep in the preferences if we're lucky. And if you want access to those data sources and licenses, you'll have to pay the platform something average people can't afford.

[–] Phanatik@kbin.social 18 points 1 year ago (2 children)

I completely disagree. The vast majority of people won't be using the open source tools unless the more popular ones become open source (which I don't think is likely). Also, a tool being open source doesn't mean it's allowed to trample over an artist's rights to their work.

They’ll just pay us artists peanuts if anything at all, and use large platforms like Twitter, Facebook, Instagram, Artstation, and others who can change the terms of service to say any artist allows their uploaded art to be used for AI training - with an opt out hidden deep in the preferences if we’re lucky.

This is going to happen anyway. Copyright law has to catch up and protect against this; just because they put it in their terms of service doesn't mean it can't be legislated against.

This was the whole problem with OpenAI anyway. They decided to use the internet as their own personal dataset and are now charging for it.

[–] honey_im_meat_grinding@lemmy.blahaj.zone 7 points 1 year ago* (last edited 1 year ago) (1 children)

I get where you're coming from, but I don't think even more private property is the answer here. This is ultimately a question of economics - we don't like that a) we're being put out of jobs, and b) it's being done without our consent / anything in return. These are problems we can address without throwing even more monopolisation power into the equation, which is what IP is all about: giving artists a monopoly over their own content, something that mostly benefits large media corporations, not independent artists.

I'd much rather we tackled the problem of automation taking our jobs head on, via something like UBI or negative income taxes, than with a one-off solution like even more copyright, which only really serves to slow this inevitability down. You can regulate AI in as many ways as you want, but that adds a ton of meaningless friction to getting stuff done (e.g. you'd have to prove your art wasn't made by AI somehow), when the much easier and more effective solution is something like UBI.

The consent question needs a bit more of a radical solution - like democratising work, something Finland has done with its grocery stores: the biggest chains are democratically owned and run by their members (consumer co-ops). We'll probably get to something like that on a large scale... eventually - but I think it's a bigger hurdle than UBI. Then you'd be able to vote on how an organisation operates, including if or how it builds AI data sets.

[–] archomrade@midwest.social 3 points 1 year ago

I appreciate this take, especially since applying copyright in the manner being proposed would extend the already ambiguous grey area of "fair use", which is most often used against artists.

[–] Pulp@lemmy.dbzer0.com 3 points 1 year ago (2 children)

Who gives a shit about artists rights? We need to move on with the progress like we always have.

[–] restingboredface@wayfarershaven.eu 11 points 1 year ago (1 children)

This is a terrible take. Maybe someday your livelihood will be challenged by technology and you'll get to see why.

[–] Pulp@lemmy.dbzer0.com 1 points 1 year ago

They can get a job or figure out funding like everybody else

[–] Phanatik@kbin.social 5 points 1 year ago

We should give a shit about everyone's right to put food on the table. Compassion can be exhausting, but it's important to recognise that someone else's problem might be yours one day, and you'd wish someone was there to help you.

[–] krnl386@lemmy.ca 7 points 1 year ago (1 children)

I sympathize with artists too, but only up to a point. I predict that:

  1. AI art will eventually overtake human art; that is, human art jobs will mostly be replaced. Day-to-day art (e.g. ads, illustrations, decorations, billboards, etc.) will likely be AI generated.
  2. Human art will become something akin to a home-cooked meal in a sea of fast-food art. This might actually make some artists famous and rich.
  3. Humans will continue to learn art, but more as a pastime/hobby/mental exercise.

[–] ParsnipWitch@feddit.de 4 points 1 year ago* (last edited 1 year ago)

For points 2 and 3: art is too expensive and time-consuming to learn. I feel a lot of people vastly underestimate the time and cost it takes to become a decent artist.

[–] mtchristo@lemm.ee 31 points 1 year ago (2 children)

So Japan is telling us that intellectual property is holding back its progress in AI. Are they recognizing that IP is a hindrance to progress and innovation? Should we expect this to nullify other IP legislation? Is this heading to court?

[–] Dalinar@lemmy.nz 5 points 1 year ago (1 children)
[–] SnowBunting@lemmy.ml 1 points 1 year ago

Oh sweet. Time to dish out some images that look similar, but not quite the same.

[–] ox0r@jlai.lu 21 points 1 year ago

My AI trained torrent client will be very happy to hear this

[–] Gutless2615@ttrpg.network 17 points 1 year ago* (last edited 1 year ago) (3 children)

The absolute right decision. Generative art is a fair use machine, not a plagiarism one. We need more fair use, not less.

[–] donuts@kbin.social 22 points 1 year ago (1 children)

Not at all... In fact, it's totally batshit insane to determine that the biggest tech companies in the world can freely use anybody's copyrighted data or intellectual property to train an AI and then claim to have ownership over the output.

The only way that it makes sense to have AI training be "fair use" is if the output of AI is not able to be copyrighted or commercially used, and that's not the case here. This decision will only enable a mass, industrialized exploitation of workers, artists and creators.

[–] Gutless2615@ttrpg.network 8 points 1 year ago

Expanding the already expansive terms of copyright is not the appropriate way to deal with the externalities of AI. This copyright-maximalist approach will hurt small artists and remix culture, drive up business costs for artists who will be dragged into court to prove their workflows didn't involve any generative steps, and, as with every expansion of copyright, primarily help the large, already centralized corporate IP holders further cement their position.

[–] lowleveldata@programming.dev 15 points 1 year ago (2 children)

It's not the right decision for the content creators. So it's not "absolute right".

[–] Gutless2615@ttrpg.network 13 points 1 year ago* (last edited 1 year ago)

Expanding the term of copyright to 70 years after the life of the author didn't actually help artists make art. Expanding copyright to cover "training" will result in more costly litigation, make things harder for small artists and creators, and further entrench the corporate IP hoarders who can afford to shoulder the increased costs of doing business. There are innumerable content creators who could and will make use of generative art to make content, and they should be allowed to prosper. We need more fair use, not less.

[–] wolfshadowheart@kbin.social 10 points 1 year ago* (last edited 1 year ago)

That's not true? There's nothing stopping content creators from using their own content to create models. In fact, that's my exact project for some of my visual art.

Moreover, (edit: visual) models can't effectively replicate the copyrighted works, so I don't really see how they would infringe on them.
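
To make that concrete, here's a minimal, hypothetical PyTorch sketch of training a small model locally on a folder of your own images. The ./my_art path and the tiny autoencoder are illustrative assumptions (a real image generator would be far larger); the point is just that it's a local training loop over data you already own.

```python
# Minimal sketch: train a tiny model on your own images.
# Assumes PyTorch and torchvision are installed; ./my_art/ is a
# hypothetical folder of the artist's own work.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])

# ImageFolder expects one subdirectory per "class", e.g. ./my_art/paintings/
dataset = datasets.ImageFolder("./my_art", transform=transform)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

# A tiny convolutional autoencoder (nowhere near a diffusion model),
# just to show that "training on your own work" is an ordinary local loop.
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
    nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
    nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    for images, _ in loader:
        recon = model(images)            # reconstruct the input image
        loss = loss_fn(recon, images)    # pixel-wise reconstruction error
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

In practice a LoRA-style fine-tune of an open image model would be the more realistic route, but the workflow is the same: your images, your hardware, your weights.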

[–] RyanHeffronPhoto@kbin.social 6 points 1 year ago (1 children)

@Gutless2615 corporations stealing artists' work to develop their for-profit software is NOT fair use.

[–] Gutless2615@ttrpg.network 6 points 1 year ago (1 children)

You do realize individuals can train neural networks on their own hardware, right? Generative art and generative text are not something owned by corporations - in fact, what is optimistically becoming apparent is that it's especially difficult to build moats around a generative model, meaning it's hard for corporations to own this technology outright - but those corporations are the only ones that benefit from expanding copyright. And I simply disagree with you: a trained model is a transformative work, as are the works you can generate with those models. Applying the four-factor fair use test comes out heavily on the side of fair use.

[–] RyanHeffronPhoto@kbin.social 1 points 1 year ago (1 children)

@Gutless2615 Of course individuals can train models on their own work, but if they train them on other artists' work, that too is an unauthorized use.

Honestly, whether AI outputs can be copyrighted is a separate issue from what I'm concerned about. What matters in these cases is where and how they obtained the inputs on which they trained the models. If a corporation or individual is using other artists' works without authorization, they are also committing theft, irrespective of any copyright infringement.

[–] Gutless2615@ttrpg.network 1 points 1 year ago* (last edited 1 year ago)

And while we're at it, let's throw out mashup artists, collages, remixes, and fair use altogether, huh? You're just incorrect here: fair use exists for a reason, and applying the four-factor fair use test to generative art comes out on the side of fair use nine times out of ten. What's more, what you're arguing for will only make things harder for small artists who get spurious accusations lobbed their way, or automated takedowns from bad "AI detector" software, and who then have to drag out in-progress files and lawyer money to argue they didn't use generative tools in their workflow. There are better ways to make sure artists can still get paid - and, spoiler alert, it's not just artists who are going to get hit. We need to embrace more creative solutions to the problems of AI than "copyright harder".

[–] jerkface@lemmy.ca 15 points 1 year ago (2 children)

I'm not thrilled that copyright exists and that it is used as a weapon against innovation and artistic expression. But if it's going to exist, I want it to actually fucking protect my works.

[–] BloodForTheBloodGod@lemmy.ca 9 points 1 year ago

So this is a step in the right direction, then.

[–] KitsuneHaiku@ttrpg.network 7 points 1 year ago

Copyright has never worked unless you have a lawyer to enforce it.

[–] ozoned 13 points 1 year ago

So if the work they used to train it isn't a copyright violation, can the things it creates be copyrighted? I hate copyright. It doesn't protect the people it should. Put everything these AIs create in the public domain; companies will stay away, and we can support creators directly.

[–] coolin 11 points 1 year ago

Sam Altman: We are moving our headquarters to Japan

[–] krnl386@lemmy.ca 7 points 1 year ago

Well, they should prepare for a crapton of new datacenters to be built there. 😂

[–] Sir_Kevin@discuss.online 7 points 1 year ago (1 children)

Smart move. They also clearly understand that AI is here to stay and it's better to embrace it rather than fight it. This will give Japan an unhindered advantage while the rest of the world cries over who allowed a computer to look at their artwork.

[–] Alto@kbin.social 10 points 1 year ago (1 children)

Someone didn't click on the article.

[–] reflex@kbin.social 2 points 1 year ago* (last edited 1 year ago)

Someone didn't click on the article.

@Alto with the mic drop.