OpenAI declares AI race “over” if training on copyrighted works isn’t fair use

National security hinges on unfettered access to AI training data, OpenAI says.
If I had to pay tuition for my education (buying textbooks, paying for classes and stuff), then you have to pay me to train your stupid AI using my materials.
As an artist, kindly get fucked, asshole. I'd like compensation for all the work of mine you stole.
"How are we supposed to win the race if we can't cheat?!"
“The plagiarism machine will break without more things to plagiarize.”
Okay.
It was fun while it lasted.
For someone.
I presume.
Good, end this AI bullshit, it has little upsides and a metric fuckton of downsides for the common man
You don't want to all become literally socially and mentally retarded together but apart?
What a giant load of crap.
Oh no anyway.jpg
This is exactly what social media companies have been doing for a while (it’s free, yes): they use your data to train their algorithms to squeeze more money out of people. They get a tangible and monetary benefit from our collective data. These AI companies want to train their AI on our hard work and then get monetary benefit off of it. How is this not seen as theft? Or even if they are not doing it just yet, how is it not seen as an attempt at theft?
How come people (not the tech savvy) are unable to see how they are being exploited? These companies are not currently working towards any UBI bills or policies in governments that I am aware of. Since they want to take our work and use it to get rich and make their investors rich, why do they think they are justified in using people’s work? It just seems so slimy.
Capital calls its own theft "innovation" and that of the individual "crime".
I'll take him seriously if & when OpenAI lives up to its name.
So, did we win?
I am good with that.
don't threaten me with a good time
No, actually they've just finally admitted that they can't improve them any further because there's not enough training data in existence to squeeze any more diminishing returns out of.
If it's over in the US, that's giving China the advantage in AI development. Won't happen.
It's like the USA adopting China's IP laws.
Oops, oh well. I very much hope it's over, asshole.
The only way this would be OK is if OpenAI was actually open. Make the entire damn thing free and open source, and most of the complaints will go away.
Truly open is the only way LLMs make sense.
They're using us and our content openly. The relationship should be reciprocal. Now, they need to somehow keep the servers running.
Perhaps a SETI like model?
Over it is, then. Buh bye!
But I can't pirate copyrighted materials to "train" my own real intelligence.
That's because the elites don't want you to think for yourself, and instead are designing tools that will tell you what to think.
Now you get why we were all told to hate AI. It's a Patriot Act for copyright and IP laws. We should be able to, too. But that isn't where our discussions were steered, was it?
True!
Business that stole everyone's information to train a model complains that businesses can steal information to train models.
Yeah I'll pour one out for folks who promised to open-source their model and then backed out the moment the money appeared... Wankers.
looks good
"We can't succeed without breaking the law. We can't succeed without operating unethically."
I'm so sick of this bullshit. They pretend to love a free market until it's not in their favor and then they ask us to bend over backwards for them.
Too many people think they're superior. Which is ironic, because they're also the ones asking for handouts and rule bending. If you were superior, you wouldn't need all the unethical things that you're asking for.
Sounds like you are describing the orange baboon in the white house.
these kinds of asshats are all the same. Only difference is the size of the hat.
Copyrights should never have been extended longer than 5 years in the first place. Either remove draconian copyright laws or outlaw LLM-style models using copyrighted material; corpos can't have both.
Send this comment to the top.
I think copyright lasting 20 years or so is not unreasonable in our current society. I'd obviously love to live in a society where we could get away with lower. As a compromise, I'd like to see compulsory licensing applied to all written work. (E.g., after n years, anyone can use it if they pay royalties and you can't stop them; the amount of royalties gradually decreases until it's in the public domain.)
Bro, what? Some books take more than 5 years to write and you want their authors to only have authorship of it for 5 years? Wtf. I have published books that are a dozen years old and I'm in my mid-30s. This is an insane take.
The one I thought was a good compromise was 14 years, with the option to file again for a single renewal for a second 14 years. That was the basic system in the US for quite a while, and it has the benefit of being a good fit for the human life span--it means that the stuff that was popular with our parents when we were kids, i.e. the cultural milieu in which we were raised, would be public domain by the time we were adults, and we'd be free to remix it and revisit it. It also covers the vast majority of the sales lifetime of a work, and makes preservation and archiving more generally feasible.
5 years may be an overcorrection, but I think very limited terms like that are closer to the right solution than our current system is.
Thanks, that's very insightful, and I'll amend my position to 15 years; 5 may be just a little zealous. 100-year US copyrights have been choking innovation due to things like Disney-led trade group lobbyists. 15 years would be a huge boost, letting many creators leverage IPs and advancements that are currently held in limbo, unused or poorly used by corpo entities.
I agree that copyright is far too long, but at 5 years there's hardly incentive to produce. You could write a novel and have it only starting to get popular after 5 years.
I think 5 years is a bit short.
the issue is that foreign companies aren't subject to US copyright law, so if we hobble US AI companies, our country loses the AI war
I get that AI seems unfair, but there isn't really a way to prevent AI scraping (domestic and foreign) aside from removing all public content on the internet
Apparently they're trying to get DeepSeek banned again; this guy really doesn't like competition.
So pirating full works for commercial use suddenly is "fair use", or what? Let's see what e.g. Disney says about this.
Slave owners might go broke after abolition? 😂
I'm going to have to remember this
Where are the copyright lawsuits by Nintendo and Disney when you need them lol
Training that AI is absolutely fair use.
Selling that AI service that was trained on copyrighted material is absolutely not fair use.
Agreed... although I would go a step further and say distributing the LLM model or the results of its use (even if done without cost) is not fair use, as the training materials weren't licensed.
Ultimately it's "doing research that advances knowledge for everybody" that should be allowed free use of copyrighted materials, while activities for direct or indirect commercial gain (including research whose results are patented and then licensed for a fee) should not, IMHO.
Good.
Fuck Sam Altman's greed. Pay the fucking artists you're robbing.
Fine by me. Can it be over today?
I'll get the champagne for us and tissues for Sam.
Unfortunately, the tissues have a 1000% tariff. Perhaps sandpaper will do?
Shit, save your $$$ and get some GPUs since the market would crash.
I'll bring the meth
What if we had taken the billions of dollars invested in AI and invested that into public education instead?
Imagine the return on investment of the information being used to train actual humans who can reason and don’t lie 60% of the time instead of using it to train a computer that is useless more than it is useful.
But you have to pay humans, and give them bathroom breaks, and allow them time off work to spend with their loved ones. Where's the profit in that? Surely it's more clever and efficient to shovel time and money into replacing something that will never be able to practically develop beyond current human understanding. After all, we're living in the golden age of humanity and history has ended! No new knowledge will ever be made so let's just make machines that regurgitate our infallible and complete knowledge.
So pirating full works suddenly is fair use, or what?
Only if you're doing it to learn, I guess
Wait until all those expensive scientific journals hear about this
If I'm using "AI" to generate subtitles for the "community" is ok if i have a large "datastore" of "licensable media" stored locally to work off of right?
God forbid you offer to PAY for access to works that people create like everyone else has to. University students have to pay out the nose for their books that they "train" on, why can't billion dollar AI companies?
Come on guys, his company is only worth $157 billion.
Of course he can't pay for content he needs for his automated bullshit machine. He's not made of money!
If everyone can "train" themselves on copyrighted works, then I say "fair game."
Otherwise, get fucked.
If your business model only works if you break the Law, that means you're just another Organised Crime group.
Organized crime exists to make money; the way OpenAI is burning through it, they're more Disorganized Crime
Gentlemen, this is democracy manifest!
What is the charge, officer? Eating a meal? A succulent Chinese meal?
If artificial intelligence can be trained on stolen information, then so should be "natural" intelligence.
Oh, wait. One is owned by oligarchs raking in billions, the other just serves the plebs.
couldnt' have said it better...the irony...
I mean, if they are allowed to go forward then we should be allowed to freely pirate as well.
In the end, we're just training some non-artifical intelligence.
Yeah, you can train your own neural network on pirated content, all right, but you better not enjoy that content at the same time or have any feelings while watching it, because that's not covered by "training".
Don't worry: the law will be very carefully crafted so that it will be legal only if they do it, not us.
Come on bro, let us pirate bro, just one more ngram of books bro
I'm fine with this. "We can't succeed without breaking the law" isn't much of an argument.
Do I think the current copyright laws around the world are fine? No, far from it.
But why do they merit an exception to the rules that will make them billions, but the rest of us can be prosecuted in severe and dramatic fashion for much less. Try letting the RIAA know you have a song you've downloaded on your PC that you didn't pay for - tell them it's for "research and training purposes", just like AI uses stuff it didn't pay for - and see what I mean by severe and dramatic.
It should not be one rule for the rich guys to get even richer and the rest of us can eat dirt.
Figure out how to fix the laws in a way that they're fair for everyone, including figuring out a way to compensate the people whose IP you've been stealing.
Until then, deal with the same legal landscape as everyone else. Boo hoo
I also think it's really rich that at the same time they're whining about copyright they're trying to go private. I feel like the 'Open' part of OpenAI is the only thing that could possibly begin to offset their rampant theft and even then they're not nearly open enough.
They haven't released anything of value as open source recently.
Sam Altman said they were on the wrong side of history about this when DeepSeek released.
They are not open anymore, I want that to be clear. They decided to stop releasing open source because 💵💵💵💵💵💵💵💵.
So yeah I can have huge fines for downloading copyrighted material where I live, and they get to make money out of that same material without even releasing anything open source? Fuck no.
🌏👨🚀🔫👨🚀🌌
Suddenly millions of people are downloading to "train their AI models".
That sounds like a you problem.
"Our business is so bad and barely viable that it can only survive if you allow us to be overtly unethical", great pitch guys.
I mean that's like arguing "our economy is based on slave plantations! If you abolish the practice, you'll destroy our nation!"
Good point. I've never seen it framed this way before. Poignant.
Thanks, heh, I just came back to look at what I'd written again, as it was 6am when I posted that, and sometimes I say some stupid shit when I'm still sleepy. Nice to know that I wasn't spouting nonsense.
Perhaps this is just a problem with the way the model works: always requiring new data and unable to use current data to ponder and expand upon, making new connections about the ideas that influenced the author… LLMs are a smoke-and-mirrors show, not a real intelligence.
They do seem fundamentally limited somehow. With all their bazillion watts, they are a cheap imitation at best compared to the mere 20 watts of a human brain.
or it might be playing dumb...
For Sam:
That's a good litmus test. If asking/paying artists to train your AI destroys your business model, maybe you're the arsehole. ;)
Not only that, but their business model wouldn't hold up if they were required to provide their model weights for free because the material that went into it was "free".
There's also an argument that if the business was that reliant on free things to start with, then it shouldn't be a business.
No one would bat an eye if the CEO of a real estate company was sobbing that it's the end of the rental market because the company is no longer allowed to get houses for free.
Even the top PhDs can learn things off the amount of books that OpenAI could easily purchase, assuming they can convince a judge that the "learning" is fair use when the works aren't pirated. However, they're all pirating and then regurgitating the works, which wouldn't really be legal even if a human did it.
Also, they can't really say how they need fair use and open standards and shit, and in the next breath beg Trump to ban Chinese models. The cool thing about allowing China to have global influence is that they will start to respect IP more... or the US can just copy their shit until they do.
IMO that would have been the play against TikTok etc.: just straight up, we will not protect the IP of your company (as in technical IP, not logo, etc.) until you do the same. Even if it never happens, we could at least have a direct TikTok knock-off and it could "compete" for American eyes rather than some blanket-ban bullshit.
Interesting copyright question: if I own a copy of a book, can I feed it to a local AI installation for personal use?
Can a library train a local AI installation on everything it has and then allow use of that on their library computers? <— this one could breathe new life into libraries
First off, I'm by far no lawyer, but it was covered in a couple of classes.
According to the law as I know it: question 1, yes, if there is no encryption; question 2, no.
In reality, if you keep it for personal use, artists don't care. A library, however, isn't personal use, and they have to jump through more hoops than a circus, especially when it comes to digital media.
But you raise a great point! I'd love to see a law library train AI for in-house use and test the system!
This particular vein of "pro-copyright" thought continuously baffles me. Copyright has not, was not intended to, and does not currently, pay artists.
It's totally valid to hate these AI companies. But it's absolutely just industry propaganda to think that copyright was protecting your data on your behalf.
Copyright has not, was not intended to, and does not currently, pay artists.
You are correct, copyright is ownership, not income. I own the copyright for all my work (but not work for hire) and what I do with it is my discretion.
What is income, is the content I sell for the price acceptable to the buyer. Copyright (as originally conceived) is my protection so someone doesn't take my work and use it to undermine my skillset. One of the reasons why penalties for copyright infringement don't need actual damages and why Facebook (and other AI companies) are starting to sweat bullets and hire lawyers.
That said, as a creative who relied on artistic income and pays other creatives appropriately, modern copyright law is far, far overreaching and in need of major overhaul. Gatekeeping was never the intent of early copyright and can fuck right off; if I paid for it, they don't get to say no.
Copyright has not, was not intended to, and does not currently, pay artists.
Wrong in all points.
Copyright has paid artists (though maybe not enough). Copyright was intended to do that (though maybe not that alone). Copyright does currently pay artists (maybe not in your country, I don't know that).
No, it means that copyrights should not exist in the first place.
So DeepMind is good to train on your models then, right?
Oh, so now you're just going to surrender our precious natural resources to the Imperialist Chinese?!
Guys, I think we've got a Wumao over here. Someone get what's left of the FBI to arrest him and show his ass the fucking door.
Sounds like another way of saying "there actually isn't a profitable business in this."
But since we live in crazy world, once he gets his exemption to copyright laws for AI, someone needs to come up with a good self hosted AI toolset that makes it legal for the average person to pirate stuff at scale as well.
I mean, pirating media at scale for your own consumption can be considered "training of a neural network" as well..
First step, be a business. Second step, accept Trump's dick in your ass. Congratulations, here's your "get out of jail free" card.
Also, pirating media at scale isn't that hard to do right now anyway lol
Why does Sam keep threatening us with a good time?
I hope generative AI obliterates copyright. I hope that its destruction is so thorough that we either forget it ever existed or we talk about it in disgust as something that only existed in stupider times.
Thing is that copyright did serve a purpose and lasted for like 20 years before Disney got it extended to the nth degree. The idea was that authors had a chance to make money but were expected to be prolific enough to have more writings by the time the 20 years were over. I would like to see something similar with patents: once you get one, you have a limited time to go to market, maybe 10 years, and if your product is ever not available for purchase (at a cost equivalent to the average cost adjusted for inflation or something), you lose the patent so others can produce it. So if you stop making an attachment for a product, now anyone can.
"Thing is, land ownership also served a purpose before lord's/landlord's/capitalists decided to expand it to the point of controlling and dictating the lives of serfs/renters/workers. "
Creation's are not that of only the individual creator, they come from a common progress, culture, and history. When individual creator's copyright their works and their works become a major part of common culture they slice up culture for themselves, dictating how it may be used against the wishes of the masses. Desiring this makes them unworthy of having any cultural control IMO. They become just as much of an authoritarian as a lord, landlord, or capitalist.
In fact, I'd go so far as to say that copyright also harms individual creators once culture has been carved up: Producing brand new stories inevitably are in some way derivative of previous existing works so because they are locked out of the existing IP unless they sign a deal with the devil they're usually doomed to failure due to no ability to have a grip on cultural relevance.
Now, desiring the ability to make a living being an individual creator? That's completely reasonable. Copyright is not the solution however.
The problem with these systems is that the more they are bureaucratized and legalized, the more publishing houses and attorneys' offices will ultimately dictate the flow of lending and revenue. Ideally, copyright is as straightforward as submitting a copy of your book to the Library of Congress and getting a big "Don't plagiarize this" stamp on it, such that works can't be lifted straight from one author by another. But because there's all sorts of shades of gray - were Dan Brown and JK Rowling ripping off the core conceits of their works, or were religious murder thrillers and YA wizard high school books simply done to death by the time they went mainstream? - a lot of what constitutes plagiarism really boils down to whether or not you can afford extensive litigation.
And that's before you get into the industrialization of ghostwriters that end up supporting "prolific" writers like Danielle Steel or Brandon Sanderson or R.L. Stine. There's no real legal protection for staff writers, editors, and the like. The closest we've got is the WGA, and that's more exclusive to Hollywood.
Interesting take. I'm not opposed, but I feel like the necessary reverse-engineering skill base won't ramp up enough to deal with SaaS and homomorphic encryption. So, in a world without copyright, you might be able to analog-hole whatever non-interactive media you want, but software piracy will be rendered impossible at the end of the escalation of hostilities.
Copyright is an unnatural, authoritarian-imposed monopoly. I doubt it will last forever.
I find that very unlikely to happen. If AI is accepted as fair use by the legal system, then that means they have a motive to keep copyright as restrictive as possible; it protects their work but allows them to use every one else's. If you hate copyright law (and you should) AI is probably your enemy, not your ally.
I suspect your assessment is at best subconsciously biased and at worst in bad faith. You'll need to elaborate on the mechanism of how they'd "keep copyright as restrictive as possible" in a world where it is not possible to copyright AI generated works.
It's so wild how laws just have no idea what to do with you if you just add one layer of proxy. "Nooo, I'm not stealing and plagiarizing, it's the AI doing it!"
Look, we may have driven Aaron Swartz to suicide for doing basically the same thing on a smaller scale, but dammit, we are getting very rich off this. And if we are getting rich, then it is okay to break the law while actively fucking over actually creative people. Trust us. We are tech bros, and we know what is best for you is for us to become incredibly rich and out of touch. You need us.
In case anyone is unfamiliar, Aaron Swartz downloaded a bunch of academic journals from JSTOR. This wasn't for training AI, though. Swartz was an advocate for open access to scientific knowledge. Many papers are "open access" and yet are not readily available to the public.
Much of what he downloaded was open-access, and he had legitimate access to the system via his university affiliation. The entire case was a sham. They charged him with wire fraud, unauthorized access to a computer system, breaking and entering, and a host of other trumped-up charges, because he...opened an unlocked closet door and used an ethernet jack from there. The fucking Secret Service was involved.
https://en.wikipedia.org/wiki/Aaron_Swartz#Arrest_and_prosecution
The federal prosecution involved what was characterized by numerous critics (such as former Nixon White House counsel John Dean) as an "overcharging" 13-count indictment and "overzealous", "Nixonian" prosecution for alleged computer crimes, brought by then U.S. Attorney for Massachusetts Carmen Ortiz.
Nothing Swartz did is anywhere close to the abuse by OpenAI, Meta, etc., who openly admit they pirated all their shit.
You're correct that their piracy was on a much more egregious scale than what Aaron did, but they don't openly admit to their piracy. Meta just argued that it isn't piracy because they didn't seed.
Edit: to be clear, I don't think that Aaron Swartz did anything wrong, unlike ChatGPT, Meta, etc.
If training an ai on copyrighted material is fair use, then piracy is archiving
I'm fine with that haha