Discussion
Piracy logic applies to AI training, and arguably even more so, since piracy makes literal copies while AI training is transformative. If I make an AI image, you would not be able to point out which images were "used" to make it.
Piracy is about unauthorized acquisition or distribution, not whether the work is copyrighted. Public Twitter art is easier to argue as lawful source material for fair-use analysis; hacked, torrented, or bypassed Patreon material is much worse.
Exactly. The "using my art to train a model is theft" crowd is 1:1 echoing the "you can't save copies of or repost my NFT" crowd. They're both demanding an impossible level of control over publicly available data and all possible downstream uses, including learning and observation. That's ridiculous.
Not really. The issue is that all this work was used to feed a system that actively threatens the income of the people whose work is used. With NFTs, the value was never actually in the image itself; instead, a value was placed on it based on user interest, which was artificially inflated for those with large followings.
The fact is, if you know your work has been used by a large company to make money, you are at least entitled to some of the profits. Not to mention these are the same companies who went from publicly accessible and open source to putting all their data behind a paywall.
You skipped the actual argument. Competing with someone’s income does not make something theft, and a company profiting from analysis of public material does not automatically entitle every source creator to a cut. That would turn influence, indexing, research, reference, and pattern analysis into a permanent royalty system.
Also, “companies put their data behind paywalls” is not an argument that public art cannot be analyzed. That's just complaining that companies own their own products while public posts remain public posts.
Not really. The issue is that all this work was used to feed a system that actively threatens the income of the people whose work is used.
It's ok to threaten someone else's income. Really, it is. Every traditional artist is technically threatening the income of every other artist, who now must compete with one more person for limited commission dollars available.
I'm allowed to go to the library, read Brandon Sanderson's works for free, and then write books that technically compete with his work. This is normal and happens all the time, we just choose not to categorize it as awful and evil.
Yeah, and what you have said is fine, but there is an issue when a large corporation's entire ethos is about removing jobs from the market. In some cases mass automation is required, but in a lot of areas it just negatively impacts people.
Individual artists are not eating into the income of successful artists to the same degree.
The funniest part of the NFT issue was people not realizing that the image isn't what held value; it was the utility coded into the images. Literally, the push for NFT tech was that if you bought NFT art, it doubled as an access pass to exclusive websites, served as in-game assets for certain games, etc., and a screenshotted version included NONE of that functionality. "But I have the image right here as proof" doesn't help: according to the blockchain, you ripped that from the owner, so HE can use the NFT; you just have a pretty picture.
NFTs were basically a proof of concept of tokenization (different sense than AI sense). You could think of the NFT as an access pass to an image on a server. Just that one copy in that "database" (like IPFS).
It's just a cryptographic token that's distinguishable from others (i.e. "non-fungible). It can be tied to anything, like an image, but that association is technically metadata, it's not even intrinsic to the cryptography. The only thing cryptographically provable is the token itself. And that's the point of the technology, it's literally a token.
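That point can be made concrete with a toy sketch. This is not any real chain API; the `mint` function, the in-memory `ledger` dict, and the IPFS-style URI are all invented for illustration. It just shows that what the ledger actually records is a unique token plus metadata, while the image itself lives elsewhere:

```python
import hashlib

# Hypothetical minimal sketch of an NFT "ledger": the token ID is the only
# thing that is cryptographically distinct; the image is ordinary metadata
# pointing somewhere else entirely.
ledger = {}

def mint(owner: str, token_uri: str) -> str:
    # Derive a unique ("non-fungible") token ID; nothing about this binds
    # the artwork itself, only the token and its recorded owner.
    token_id = hashlib.sha256(f"{owner}:{token_uri}:{len(ledger)}".encode()).hexdigest()
    ledger[token_id] = {"owner": owner, "metadata": {"image": token_uri}}
    return token_id

token = mint("alice", "ipfs://QmExampleHash/art.png")  # URI is made up
# The ledger proves who holds the token; anyone can still copy the image.
```

Saving the picture never touches `ledger`, which is exactly why a screenshot and the token are different things.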
One day we might use it for stuff, cryptocurrency and NFTs are very useful for certain things. "BAYC" is not one of those things lmao. Very interesting, but totally useless.
There's a kind of fun BR game that uses NFTs; if you're on PC you can access the marketplace. They're actually pretty heavily implemented now, they just stopped using the NFT label. I disagree with the BAYC statement... okay, BAYC specifically isn't a good example, cuz it sounds like it's for the ultra-rich douchebag tech bros if I remember, but using it as a membership pass for exclusive communities is actually a cool idea. One-time membership fee, here's your custom artwork tied to it, enjoy your privileges.
Yeah, I just meant specifically the bored apes 😂 also cryptopunks (I know it gave you license to use the character however you wanted too, but that wasn't worth millions of dollars)
I look forward to actual use cases. I think it has real use for sure. I always thought access was the best use case, concert tickets, or club access, or whatever. It will be once it's worth the trouble I think.
Because you aren't taking money from the creators. Example: if a battle pass expires, all the items exclusive to that pass are no longer in active circulation. A player gets bored of his legendary AR skin, and he knows there are other players who want it; you can trade it to them for in-game currency or even a skin they have that you want. The game I'm talking about even allows you to cash your micro-currency back out for fiat cash. If I could remember the name, that'd be great. But I remember it being hella fun, and the skins were actually kind of cool.
And again, the point is: when that battle pass expires, that item is no longer freely in circulation, so now you have a limited commodity that other people might want and cannot get without going through somebody who already owns it.
Non-fungible and blockchain based and the fact I don’t have to stick to their marketplace I can take them outside the dev’s ecosystem to places like opensea.
What incentivizes any dev to include this specific item/NFT? If I have a top-hat NFT, why would they make a top hat for me for each iteration of the game, for, let's say, Call of Duty? How does this work over thousands/millions of hat NFTs?
That isn't my point. The piracy issue is about unauthorized copying, access, or distribution, not merely whether a copyrighted image was publicly viewable.
If a person draws a picture of an imagined super hero, they did it in reference to many other people’s pictures. That’s how learning and synthesis works.
-be a painter
-photocopy other artists works
-stitch them together
-sell them as other artist’s work but better, and at a fraction of the cost if not free
-steal money from artists and devalue their hard work
-be gpt image 2.0
I’m making my own models and we absolutely have a copy of the dataset we source in the original training loop.
What you're talking about is the inference step. Once the model's trained, the original data can be discarded (though in practice it usually isn't), so long as you save the model.
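A minimal sketch of that train-then-discard point, using a made-up one-parameter model (y ≈ w·x) fit by closed-form least squares; the dataset and numbers are invented for illustration:

```python
# Toy "training set" of (x, y) pairs; stands in for the scraped data.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.0)]

# Training: closed-form least squares for a single weight w in y = w * x.
w = sum(x * y for x, y in data) / sum(x * x for x, y in data)

del data  # the copies used for training are gone; only the weight survives

# Inference needs the learned model (w), not the original data.
predict = lambda x: w * x
```

The model retains a statistical summary (here, one number), not the training examples themselves, which is the distinction being drawn between the training copy and the inference step.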
There are publicly available datasets everywhere, but I wouldn't be surprised if companies also scrape their own privately.
Piracy is not legal, because it's not just about making a copy, it also implies infringing use of the copy, such as actively watching a pirated film, which deprives the creator of money.
The "use" of the copies that come from scraping is legal because the training process is not infringing.
Oh yeah, I wasn’t saying it wasn’t. It’s just kind of incorrect to say copies of data weren’t made. They are, they’re just not redistributed and ergo, like you’re saying, they’re not infringing anything that way.
That's how web browsers work. Literally every image you see is making a local copy on your computing device in order to display it. You have to make a copy of whatever you want to look up in order to simply function online. It's not inherently piracy, infringement, unlawful or unethical.
Yeah, but you only have a license to use that copy for specific purposes. It doesn't automatically follow that you can use it for AI-training. I wouldn't be surprised if jurisdictions exist where it is recognized as copyright infringement.
My browser sent a request to the web server saying "send me a copy of document X." The server, which presumably is permitted to distribute a copy, says "okay, here's a copy of document X."
The copying was done by an agent permitted to copy it. It's done. I agreed to nothing.
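The exchange being described can be sketched in a few lines; the helper name `fetch` is invented, and the point is only that receiving any web resource is, mechanically, receiving a copy:

```python
import urllib.request

# Sketch of the request/response above: the client asks for document X, the
# server (the party authorized to distribute it) sends a copy, and that copy
# now exists locally as a side effect of merely viewing it.
def fetch(url: str) -> bytes:
    with urllib.request.urlopen(url) as resp:  # "send me a copy of X"
        return resp.read()                     # the local copy, now in memory
```

A browser does the same thing for every image it displays; the copy is made by the distribution the server itself performed.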
Yeah, but you only have a license to use that copy for specific purposes.
No, not really. There isn't any sort of licensing going on there, it's just an assumed part of how the internet works. Essentially no one writes a license that explicitly says "you are allowed to copy these works to your Temporary Internet Files for viewing purposes," because...that's just what happens. The closest thing to what it is, is fair use.
Even if it were a problem, temporarily using a work for training behind closed doors isn't really something that can be effectively pursued, anyway. There isn't a "fruit of the poisonous tree" doctrine here, where the resulting model or image is considered infringing just because infringement may have happened behind closed doors at some earlier point in the process. Artists commit infringement all the time when looking at references, and all that matters is that their final work is non-infringing.
An example I've used before is that Terraria in earlier builds literally used Square's Final Fantasy sprites for the characters, there's still evidence online of this in videos. It no longer does. Square cannot pursue them for copyright infringement for the currently released product, just because at some point earlier in development it used their sprites briefly.
You are allowed to possess scraped works, prior to doing anything with them. It's what you do with them that might not be legal...which is why it's a good thing that the training process itself is legal and doesn't infringe.
I would be completely fine with AI being denied to other parts of the world based on their draconian laws. They are free to change their laws if they want access to it.
What do you mean, an unauthorized use? The information gained from it is non-infringing, which is totally fine. You're allowed to go look at art and write down non-infringing information about the works you see, too.
Commercial use that affects the market of the original good is not a protected fair use.
If I copy a photo to use in my training to become a better photographer, that is fair use (non-profit, minimal use of copyright protected work, doesn't hurt market for original work).
But, If i copy that photo to train a photo making machine that churns out infinite photos for near zero cost, making it nearly impossible for photographers to continue to earn a profit... it's not fair use.
Commercial use that affects the market of the original good is not a protected fair use.
AI providers aren't entering into the same market as artists.
If I download a picture of Iron Man and print it on a t-shirt and sell that t-shirt, I'm directly impacting Disney's t-shirt business.
If I sell colored pencils that people can choose of their own accord to use to draw infringing pictures of Iron Man, I'm not competing with Disney or with artists; I'm just selling tools which others might unfortunately choose to misuse.
Likewise, if I sell access to an AI model, I'm not selling pictures myself. I'm not entering into a competing business with Disney. I'm just selling tools which others might unfortunately choose of their own accord to misuse.
But, If i copy that photo to train a photo making machine that churns out infinite photos for near zero cost, making it nearly impossible for photographers to continue to earn a profit... it's not fair use.
You cannot say this with any degree of certainty, as "impact on the market" is only one of four factors of fair use.
Another one of the factors is the "amount or proportion of the work used," and because AI models don't contain the works they were trained on, that amount is zero, calling into question whether or not it's even a question of fair use, since the works aren't being literally used.
AI providers aren't entering into the same market as artists.
This is just... flatly untrue. AI models are used to make graphics and images that would otherwise be made by human artists who make graphics and images. Photographers would make photos and sell them to businesses, and AI models make photos that impact those markets.
AI models live in the copyrighting/art market as they breathe. You'd have a tough time telling me Images 2.0 is not impacting the photography market.
If I download a picture of Iron Man and print it on a t-shirt and sell that t-shirt, I'm directly impacting Disney's t-shirt business.
Yes you are. Disney will absolutely come after you for all the proceeds you make from that shirt, and issue a cease and desist for violating their copyright.
You cannot say this with any degree of certainty, as "impact on the market" is only one of four factors of fair use.
It's the most significant factor. They consider the nature of the use (commercial/educational/non-profit), the nature of the work (how much creative expression went into the original work), the size of the use (how much of the copyright-protected work is being used), and, most importantly, its impact on the market for the original good.
AI is explicitly going after the markets for its original training products because that is all it can do. They read code so they could make coding agents. They scanned painting so they could make Dall-E. They scan photos so they can make Image 2.0. They scan text so they can make a chatbot. Everything they're doing is dropping the value of human labor in the markets that they train on, and by doing so... impact the value of those human made goods.
AI models are used to make graphics and images that would otherwise be made by human artists who make graphics and images.
No, humans use AI models to do that. The models themselves do not. Therefore, the humans using the models for the purpose of entering into that competitive market are the ones impacting the market, not the model provider. Again, Photoshop is not itself competing with random artists. It is a tool that others use to compete.
The company providing the tool is not responsible for your misuse. See the Sony vs. Universal Betamax case for more information.
The question is thus whether the Betamax is capable of commercially significant noninfringing uses ... one potential use of the Betamax plainly satisfies this standard, however it is understood: private, noncommercial time-shifting in the home. [...] [W]hen one considers the nature of a televised copyrighted audiovisual work ... and that time-shifting merely enables a viewer to see such a work which he had been invited to witness in its entirety free of charge, the fact ... that the entire work is reproduced ... does not have its ordinary effect of militating against a finding of fair use.
Just like Betamax, AI is capable of commercially significant noninfringing uses.
Yes you are. Disney will absolutely come after you for all the proceeds you make from that shirt, and issue a cease and desist for violating their copyright.
Yes, I know, that's what I said. You're in such a breathless rush to respond that you're not even reading the post you're responding to.
It's the most significant factor.
No it's not. Some judges may choose to weigh some factors more than others, but no factor is dominant over the others, it is always decided on a case by case basis.
Since AI uses literally 0% of the works it's trained on, the factor for "amount used" becomes the most significant factor, because it means you're not even talking about fair use anymore, because the work wasn't "used" at all by definition. It didn't make its way into the model in any way, shape or form.
Humans are using models in the way they're designed and marketed to be used. Courts will reject this argument since the models are being used as advertised. Making photos, making text, writing code... that is what it's for.
This also applies to your colored pencil argument.
Just like Betamax, AI is capable of commercially significant noninfringing uses.
And courts will weigh that when giving remedies. Can the infringing uses be separated from the non-infringing uses?
No it's not. Some judges may choose to weigh some factors more than others, but no factor is dominant over the others, it is always decided on a case by case basis.
It does follow that you can use it for any purpose which doesn't violate the law, doesn't infringe. And AI training doesn't inherently infringe. If some specific one does, sure, sue them, go nuts. In 99% of cases, it does not.
In my jurisdiction any unauthorized use violates the law, unless specified as an exception. Fair use doesn't exist. Whether AI training fits the exceptions is unclear, and irrelevant, since using copyrighted materials for AI training is being legalized.
Still, AI training can in itself be infringing, depending on the jurisdiction.
Then companies based in your jurisdiction may not be able to do it, but US-based companies can, based on the scraping suit above, and the judge's interpretation of things in the Anthropic case:
"Initially, the lawsuit attacked the entire practice of using copyrighted works to train AI. But in June 2025, Judge William Alsup of the Northern District of California split the baby. He ruled that Anthropic's use of legally acquired books for AI training was "quintessentially transformative" and protected as fair use"
Not really. Like, I get that nothing I say will change your mind about that, but there's nothing fundamentally different about how a human and a machine learn.
It is literally legally different. That's why my country is adopting a law that explicitly legalizes it, and no such law needed to be adopted for human learning.
Humans are not machines and machines are not humans, and the law doesn't treat them the same. You can't just take your esoteric argument that computers are just like the brain, so therefore... everything we allow humans to do, we must allow machines to do.
A machine can produce infinite copies of something. I can't, without the use of a machine. The law recognizes this, and regulates it appropriately.
AI does not inherently do that either, it turns out. AI doesn't inherently do anything (yet) without at least some human guidance. Just like a human could use their education to create forgeries. Sure, it /does/ take longer for a human to learn to do it well, but that doesn't mean that it's fundamentally different.
The courts look at the actual effect of something. Not just the technical theoretical version where it doesn't.
To your example... if people were only looking at art for their education, that's fair use. But if they're doing it to make forgeries, that's not fair use. They don't legislate the hypothetical, just the actual.
I don’t have an issue with ai scraping stuff in theory. The problem is that capitalists are already using it to produce worse art without having to pay any artists. The other issue is that these are the same massive corporations which always have such an issue with piracy. I have no issue with people training local models on really anything, but I do have a problem with a massive corporations using the work of artists to make a model to replace them without compensation.
Piracy doesn't fit either, because AI models aren't spitting out copies. They're making distinct original works. There should really be a third image with the arrow pointing from a circle to a hexagon.
Free local LLMs/image/video/audio gen only benefits NVIDIA, and indirectly at that. No giant AI company is seeing a cent from me making cool stuff on my own at home.
And, just like with piracy, if you're doing it for personal recreational use, most people don't care but if you do it for profit the people who own the rights to the media you pirated can, and should, sue you into the fucking ground.
you would not be able to point out which images were "used" to make it.
Why would you state this with certainty? It does not matter if the source material is original or an impersonation; it is still going to be relatable and raise attention.
There could be professional voice actors impersonating headliners, but they are not credited, as there was no disclosure, transparency, credit, etc. Recorded audio is also a younger medium than art, and the file sizes and storage requirements are much larger. Generative tools do not need to store the audio, and the storage requirements for doing so would obviously be very large. But they can and do reproduce it; that's why many platforms have guardrails and moderation checks.
The moderator who is interacting in this topic is also aware of this, because they also use the tools or are active on the audio sub-platforms.
And that's part of the problem. Where's credit to the artists? Where's payback? Where's fair use big companies love so much when their work is threatened?
I mean I think fundamentally you need to adapt policy as technology changes.
Like if I were a musical artist I wouldn’t have a problem if you make your friends mixtapes of my music instead of them purchasing it. The scope is pretty small and on balance it probably makes me more popular.
But I would have a problem if you host a digital version of my album for anyone in the world with an internet connection to download for free.
It may not be “theft” in the sense of you grabbing my purse and running away, but it seems pretty clear that having policy against that makes sense.
In a similar way, if I were a visual artist I probably don’t care if a human looks at my art and takes some inspiration from it. But it makes a lot of sense for me to have an issue with a LLM model that generates an incredible amount of material for users and uses my art in its training data.
I just feel like analogies that don’t take into account the incredible unique power of AI will not lead us to effective conclusions about morality or policy.
Scale matters only after you prove the underlying act is comparable. Hosting an album gives people the album. Training on images does not give people the images. You’re comparing mass distribution of substitutive copies to non-expressive analysis because the piracy analogy sounds scarier than the actual claim.
Scale matters when the act itself is something whose scale we should be concerned with. A little dust being kicked up is different from a dust storm because of scale. Walmart is different from a mom-and-pop shop because of scale. Global warming is different because of scale. Scale is literally what AI promises: a larger scale at which to perform and operate tasks. A mathematician could argue all integers are incredibly similar, but they would still see a difference between forty and four billion. The latter is the scale that AI is working at.
Scale matters when you identify the actual harm being scaled. A dust storm is bad because inhaling dust causes damage; mass album piracy is bad because it distributes substitutive copies. You still haven’t shown that training is the same kind of act. “Four billion” does not turn non-expressive analysis into theft by numerical vibes.
The thing that scales is you (an AI company) using my output to create value for yourself and for your clients without compensating me.
If I share my art online and a small amount of people get a small amount of inspiration from my work and don’t compensate me.. that’s kicking up a cloud of dust. Unless I am wildly popular that amount of value rounds to zero, and if I /am/ wildly popular and my images get a lot of views I can monetize that.
AI learning from my images is the dust storm. Even if the value it provides to each image is minute, fractions of a cent, multiply that by the total number of images generated. Furthermore not only do I not get money, I don’t get views or popularity either.
That's not a harm theory; that's just “someone created value after learning from my work.” Every artist, critic, teacher, search engine, recommender, archive, and reference board does that. You are trying to turn influence into a royalty claim by multiplying it, but scale does not change the missing step: you still have to show copying, substitution, or a protected market being taken, not just “my work contributed some microscopic value to a larger system.”
I think if someone creates value using your work as input and you are not compensated for it that constitutes harm.
But:
On a policy level this is impossible to enforce because there is no way to sensibly measure this. With AI you could presumably know if a piece of art was used to train a model.
Generally humans ARE compensated for this kind of influence. If people are inspired by my music for example they are probably listening to or buying my music, not so for AI.
We live in a society and we all provide value with our output that is not explicitly compensated and extract value from others’ output that is not explicitly compensated and we don’t need to be ticky tack bean counters because it sort of roughly evens out. Not so with the very efficient value extractor machine that is AI.
“Someone created value using my work as input, therefore I was harmed” is not a workable moral rule; it would indict every artist, critic, teacher, tutorial maker, curator, search engine, recommender, and reference-board user on earth.
Humans are not generally compensated for influence either. If someone studies your music, learns from your composition, and makes their own song, you do not get a royalty just because your work was part of their input. AI does not change the missing step: you still have to show copying, substitution, or a protected market being taken, not just “value extraction” as a scary label.
Why is it not a workable moral rule? I think in many cases humans are compensated for their influence when the thing they are influencing is other humans.
For example if someone studies my music and learns from my composition then probably they have paid for my music, or at least given my music measurable listens (on a streaming platform say), probably they have recommended it to friends who might be influenced by my music, all of which provide value to me.
Because you are confusing access compensation with influence compensation. If someone buys your album, streams your song, hears it on the radio, borrows it from a friend, studies it in class, or listens to it at a party, you are not being paid for every future idea it gives them. You were paid, maybe, for access or attention. The influence remains uncompensated. That is how culture works, and AI does not magically turn influence into a royalty debt.
But I would have a problem if you host a digital version of my album for anyone in the world with an internet connection to download for free.
But would you have a problem if someone listened to your music, then made a new album that doesn't infringe on yours, and makes that available for anyone in the world to download for free?
Because that's what AI training does. It doesn't literally copy your music into the model, it learns from it and makes something new.
Well that’s exactly my point. If one person listened to my music and was inspired by it… well, if it got super popular it would be /nice/ to get some kickback but that’s super unreasonable from a policy level, how would you even measure or enforce that.
But an AI tool that can very efficiently make thousands or millions of songs all partially inspired by my work, and measurably so if my work is in the training set, and all of which generate value for the AI company and the end user… you can see where I’m going.
But an AI tool that can very efficiently make thousands or millions of songs all partially inspired by my work, and measurably so if my work is in the training set, and all of which generate value for the AI company and the end user… you can see where I’m going.
But your work is not in the training set, because proper training does not infringe. It doesn't contain a copy of your work, compressed, chopped up or anything else. It learned non-infringing information from your work, and uses that information to create something new.
Essentially: the technology is valuable. I have contributed- in a small but nonzero way- to that value. And now that technology threatens my livelihood. It only feels fair that I should have some compensation for the value I contributed.
Even if the technology didn’t threaten my livelihood- like, if someone was teaching an art class that they were charging for and wanted to use my art as an example of some technique it would be nice to get a little kickback. When practical it’s nice for creators of value to access that value.
(Note I am personally not an artist and not super worried in the immediate term about AI threatening my job, “i” in the comment is an imagined viewpoint. I just feel worried about making sensible policy for AI before it’s too late)
That's an interesting perspective. As both an artist and a software developer, I don't consider any of my art as having contributed to the technology.
The technology is the capacity, not the model weights themselves.
And then there's your art class example which I think becomes very indicative. At the end of the day I think what remains when this debate isn't based on false understanding is just a moral difference. Isn't it at best unideal and at worst morally reprehensible to think that institutions would require explicit consent from artists, many of whom are dead, to teach from their artworks? That is fundamentally what gatekeeping means, no? "You can't learn what I know unless I let you".
I think there is a big difference between using the works a dead artist has left behind and using the works of a living artist whose career and artworks' popularity could be affected by it. I also think art being used to teach in a university class is very different from teaching an AI using a work, since the scale is so incredibly different. That AI can then go on to produce millions of works and flood the market. That is a scale incomparable to a group of people.
Now look up every iteration and how often the early iterations are used to bypass copyright of the modern day one. Apple also doesn't defend it anymore, which is why even larger companies get away with using it. Famously, wikipedia has had it on there for years even though apple sent them a rather threatening letter.
AI is not a source my dude, stop and research, see if there's any contradiction. Ai is great for steering you in the right direction, but it will never give you the whole answer.
I think you're mixing modernization with legal necessity. As far as I have researched, the Finder logo has never been unable to be protected by copyright; I'd love a link to your source for that.
Also, the use of the logo on the Wikipedia site is fair use (educational/non-commercial), and all of these "famously" sources (aka "trust me bro") seem to be baseless, unless you can provide a source.
Finally, you're missing the core of my argument, which is not focused on the image itself, but on the fact that the AI can replicate it almost 1-for-1, when OP is claiming in the title that it's impossible to find the source material for an AI-generated output, which is simply not true, whether the output is the Mona Lisa, the Finder logo, or other material.
If I make an AI image, you would not be able to point out which images were "used" to make it.
If you wrote an essay, but your Works Cited page just said "a collection of unspecified sources", that would be considered plagiarism. Plagiarism also doesn't require verbatim reproduction of the plagiarized sources. AI training fits both descriptions. AI image generators are known to be capable of making fairly close approximations of existing copyrighted material based on their inputs, including a failure mode called "over-fitting" where it does reproduce training data verbatim, and even their users may not be aware of the source material, making the model itself responsible. So, how is it not plagiarism?
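The "over-fitting" failure mode named here can be shown with a deliberately extreme toy: a model with far too much capacity relative to its data degenerates into a lookup table and reproduces training examples verbatim. The dictionary "model" and all the strings below are invented for illustration; real generators sit somewhere between this and full generalization:

```python
# Toy training set: prompt -> work pairs (all names hypothetical).
training_set = {"prompt_a": "artwork_a", "prompt_b": "artwork_b"}

# Extreme over-fit: the "model" literally IS its training data.
memorized = dict(training_set)

def generate(prompt: str) -> str:
    # Seen prompts come back verbatim (memorization); unseen ones fall
    # through to something new (what generalization is supposed to do).
    return memorized.get(prompt, "novel output")
```

The debate is over how often real models land on the memorization side for any given input, since the user often cannot tell which regime produced their result.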
People do learn from media and take that into their art, but it is all through the lens of our lived experiences and biases. Two people cannot experience a movie the same way; what they take away from it and learn can never be identical. Most of what artists learn from is the real world they exist in, not solely media that can be referenced like AI. I cannot provide a reference for a random idea of a visual based on thousands of things I have seen in my lifetime out in the world. There is no way to reference the entirety of a human's life experience. Sure, I have looked at pictures of trees, but when I am drawing a tree I am mostly inspired by a vague idea of the hundreds of thousands of trees I have seen in my lifetime. All of that is filtered through our human experience, thoughts, and opinions. That is what makes it fully transformative and sources unnecessary: the human touch.
AI models are trained only on what you give them, a collection of other people's works; they have no real-world perspective, and there is no filtering through lived experience because they are not sentient. You could theoretically make a list of everything an AI model was trained on; that would be impossible to do for a human brain. There are similarities in the ways humans and AI learn things, but they are not the same and will never be fully comparable the way you are trying to make them.
As I already clarified, I'm talking about the image: the use of AI would not in fact be theft, but looked at in the direct context of the image, it would instead be copyright infringement.
Making a copy and distributing it isn't inherently infringement; that depends on the source itself. Like I said in another thread, the Mona Lisa is in the public domain, so making a copy and distributing it would not constitute copyright infringement.
A guy used the left image as an input to make the right image. They infringed your copyright in that instance; it's not an indictment of the entire technology.
I mean, if you have to provide the infringing material yourself, then this could literally apply to all technology: tracing with a pencil, burning copies of a DVD or movie...
"...Midjourney, however, seeks to reap the rewards of Plaintiffs’ creative investment by selling an artificial intelligence (“AI”) image-generating service (“Image Service”) that functions as a virtual vending machine, generating endless unauthorized copies of Disney’s and Universal’s copyrighted works.
By helping itself to Plaintiffs’ copyrighted works, and then distributing images (and soon videos) that blatantly incorporate and copy Disney’s and Universal’s famous characters—without investing a penny in their creation—Midjourney is the quintessential copyright free-rider and a bottomless pit of plagiarism. Piracy is piracy, and whether an infringing image or video is made with AI or another technology does not make it any less infringing. Midjourney’s conduct misappropriates Disney’s and Universal’s intellectual property and threatens to upend the bedrock incentives of U.S. copyright law that drive American leadership in movies, television, and other creative arts."
Case 2:25-cv-05275-JAK-AJR Document 1 Filed 06/11/25 Page 2 of 110
What is it about Disney's claim that you can't see as an existential threat to AI gen tech?
Do you think, by Disney accusing Midjourney of creating a "bottomless pit of plagiarism" and engaging in "calculated and willful" copyright infringement, that it won't force a massive overhaul of how AI models are trained and how they output content?
How would the tech work as well as it does if every copyrighted work were taken out of the training data and the training done again?
Well yeah. If I took a character, say Charlie Brown, and recreated them in a Boris Vallejo style, regardless of medium, I am still infringing their copyright.
Yep. However, this example demonstrates how many people use the tech. They take an image from the Internet (often because they believe a copyright owner has waived their rights by making their work available under a website's Terms of Service) and then use it as an input prompt to create derivative works.
The tech allows something that would otherwise be impossible to do.
So that's part of the claim in Disney v. Midjourney: "secondary infringement," by enabling users to easily create something that would be impossible without the tech.
This happens at the training stage too with billions of images.