AI has changed capitalism's tune on content theft
Capitalists have found a way to extract vast profits from content theft and plagiarism, so they've completely changed their tune on intellectual property
After an extraordinary back and forth between the House of Commons and the House of Lords, the government has finally defeated Peers’ efforts to protect Britain’s £124 billion creative industries from capitalists who want to steal their content and feed it into their AI automated plagiarism machines.
The amendment that the Lords kept trying to add to the Data (Use and Access) Bill was merely intended to make AI companies declare when they’ve used other people’s copyrighted material as training data for their algorithms.
The amendment was supported by musicians, authors, and artists, but Starmer’s government eventually got its way after repeatedly stripping it out.
Britain’s creative sector is one of the few areas where the UK still undoubtedly punches above its weight on the world stage, and it’s worth £124 billion per year to the economy.
It’s a huge gamble to put that at risk by allowing AI companies to train their algorithms on copyrighted content, without even telling the people whose work they’re ripping off that they’re doing it.
The government’s argument seems to be that Britain needs to avoid regulating AI use, because British AI development will be left behind if it doesn’t have the same kind of lawless wild-west environment as AI in the United States and China.
We know that Meta has been accused of torrenting vast quantities of books and research papers from the huge pirate database Library Genesis (LibGen) to train the Meta AI that now appears all over products like WhatsApp, Facebook, and Instagram.
Facebook staff argued in favour of looting pirated material to build their language model because they thought it would be too expensive and time-consuming to properly license individual books and papers.
In fact court filings reveal that one Meta employee stated that "the problem is that […] if we license one single book, we won’t be able to lean into fair use strategy".
The argument seems to be that they had to steal all of their training data, because if they licensed the use of even one work, they’d struggle to argue that they couldn’t also have properly licensed all of the other copyrighted material they’re accused of torrenting and feeding into their algorithm.
With AI developers looting copyrighted material as training data, it seems logical to bring in a statutory obligation for them to report any use of copyrighted materials, but Starmer’s government have come down firmly on the side of the copyright looters.
It’s a far cry from the anti-piracy propaganda of the last few decades, when we were told that copying a tape for a mate was "killing the music industry"; that ripping CDs and DVDs were appalling crimes; and that using file sharing services like Napster and The Pirate Bay represented existential threats to the creative industries.
It’s also interesting to contrast the kid-glove treatment AI companies are getting over mass copyright theft to train their plagiarism machines with the legal pursuit and intimidation aimed at the likes of Aaron Swartz, Per Svartholm Warg, and Kim Dotcom for stuff like making academic texts freely accessible, and operating file sharing services.
Aaron Swartz took his own life while facing prosecution for downloading academic documents with the intention of making them freely accessible.
It’s hardly surprising that capitalist-bankrolled governments have a completely different attitude to copyright protection now that capitalists want a lawless environment in which they can profit by feeding stolen data into their language models with impunity.
Governments were always implacably opposed to free use and file sharing because those things stood to benefit ordinary people, and they’re vehemently opposed to preventing AI companies from having to admit what copyrighted materials they used, because that might constrain capitalist profit-making in the AI sector.
The problems with this wild west attitude towards AI development are already becoming clear though, and it goes far beyond the threat these plagiarism machines represent to the creative industries.
One of the biggest problems is that post-AI datasets are getting ever more polluted with AI slop, which means AI is increasingly training itself on AI outputs rather than legitimate human-generated content.
It’s going to be increasingly difficult to train AI language models to sound human when an ever-growing proportion of what they’re trained on has been generated by sloppy first-generation AI.
Several AI experts are speaking out about this problem, going as far as claiming that the AI industry’s bitter opposition to any kind of regulation, oversight, or agreed standards may make them "their own worst enemy" in the long-run.
That’s before we even get into other areas of concern like:
• The growing plague of AI cheating in schools, universities, and academia
• AI churning out defamatory lies about people
• AI faking data and hallucinating made-up references
• AI promoting bonkers extreme-right conspiracy theories
• Court cases collapsing because of AI-generated references to made up case law
Keir Starmer and his inner circle have fully bought into the AI-peddlers’ fairy story that AI will rescue the UK economy from the doldrums, as long as they’re allowed to operate with impunity; nick everyone’s creative content without even telling them; and infest everything with AI slop as quickly and carelessly as possible.
Starmer is betting the house on AI magic beans solving Britain’s economic woes, and if this gamble comes at the price of ruining everything with AI slop and undermining Britain’s valuable creative industries, that’s a price he’s clearly willing to pay.
As a disabled and chronically ill artist who is unable to work at a commercially viable level in literally any job, it's darkly funny to me that they're obsessed with taking away the benefits I need to survive (in poverty), while simultaneously happily cheering on the death of the only industry I'm able to make a few pence in occasionally. The artwork that takes me months to complete is now much harder to sell, because you can get a shit version for free or not much (if you're not counting the environmental cost) from the AI plagiarism machine, and from the shitbags who use it to undercut real artists.
People have stopped understanding what meaning is, how to detect it, and how to tell meaningful content apart from demagogy. The problem, as I see it, is that people are losing the ability to distinguish machine content from human creation. They themselves have started thinking like AI.