Artists are attempting to battle AI — with limited success
The music industry is fighting on platforms, through the courts and with legislators in a bid to prevent the theft and misuse of its art by generative AI — but it remains an uphill battle.
Sony Music said recently it has already demanded that 75,000 deepfakes — simulated images, tunes or videos that can easily be mistaken for real — be rooted out, a figure reflecting the magnitude of the issue.
The information security company Pindrop says AI-generated music has “telltale signs” and is easy to detect, yet such music seems to be everywhere.
“Even when it sounds realistic, AI-generated songs often have subtle irregularities in frequency variation, rhythm and digital patterns that aren’t present in human performances,” said Pindrop, which specialises in voice analysis.
But it takes mere minutes on YouTube or Spotify — two top music-streaming platforms — to spot a fake rap from 2Pac about pizzas, or an Ariana Grande cover of a K-pop track that she never performed.
“We take that really seriously, and we’re trying to work on new tools in that space to make that even better,” said Sam Duboff, Spotify’s lead on policy organisation.
YouTube said it is “refining” its own ability to spot AI dupes, and could announce results in the coming weeks.
“The bad actors were a little bit more aware sooner,” leaving artists, labels and others in the music business “operating from a position of reactivity,” said Jeremy Goldman, an analyst at the company Emarketer.
“YouTube, with a multiple of billions of dollars per year, has a strong vested interest to solve this,” Goldman said, adding that he trusts they’re working seriously to fix it.
“You don’t want the platform itself, if you’re at YouTube, to devolve into, like, an AI nightmare,” he said.
Litigation
But beyond deepfakes, the music industry is particularly concerned about unauthorised use of its content to train generative AI models like Suno, Udio or Mubert.
Several major labels filed a lawsuit last year at a federal court in New York against the parent company of Udio, accusing it of developing its technology with “copyrighted sound recordings for the ultimate purpose of poaching the listeners, fans and potential licensees of the sound recordings it copied.”
More than nine months later, proceedings have yet to begin in earnest. The same is true for a similar case against Suno, filed in Massachusetts.
At the centre of the litigation is the principle of fair use, which allows limited use of some copyrighted material without advance permission and could limit the reach of intellectual property rights.
“It’s an area of genuine uncertainty,” said Joseph Fishman, a law professor at Vanderbilt University.
Any initial rulings won’t necessarily prove decisive, as varying opinions from different courts could punt the issue to the Supreme Court.
In the meantime, the major players involved in AI-generated music continue to train their models on copyrighted work — raising the question of whether the battle isn’t already lost.
Fishman said it may be too soon to say that: although many models are already training on protected material, new versions of those models are released continuously, and it’s unclear whether any court decisions would create licensing issues for those models going forward.
Deregulation
When it comes to the legislative arena, labels, artists and producers have found little success.
Several bills have been introduced in the US Congress, but nothing concrete has resulted.
A few states — notably Tennessee, home to much of the powerful country music industry — have adopted protective legislation, particularly when it comes to deepfakes.
Donald Trump poses another potential roadblock: the Republican president has positioned himself as a champion of deregulation, particularly of AI.
Several giants in AI have jumped into the ring, notably Meta, which has urged the administration to “clarify that the use of publicly available data to train models is unequivocally fair use.”
If Trump’s White House takes that advice, it could push the balance against music professionals, even if the courts theoretically have the last word.
The landscape is hardly better in Britain, where the Labour government is considering overhauling the law to allow AI companies to use creators’ content on the internet to help develop their models, unless rights holders opt out.
More than a thousand musicians, including Kate Bush and Annie Lennox, released an album in February entitled Is This What We Want? — featuring the sound of silence recorded in several studios — to protest those efforts.
For analyst Goldman, AI is likely to continue plaguing the music industry — as long as the industry remains disorganised.
“The music industry is so fragmented,” he said. “I think that that winds up doing it a disservice in terms of solving this thing.”
New app hopes to empower artists against AI
In 2008, scriptwriter Ed Bennett-Coles said he experienced a career “death moment”: he read an article about AI managing to write its first screenplay.
Nearly two decades later, he and friend Jamie Hartman, a songwriter, have developed a blockchain-based application they hope will empower writers, artists and others to own and protect their work.
“AI is coming in, swooping in and taking so many people’s jobs,” Hartman said. Their app, he said, responds, “no… this is our work.”
“This is human, and we decide what it’s worth, because we own it.”
The ever-growing threat of AI looms over intellectual property and livelihoods across creative industries.
Their app, ARK, aims to log ownership of ideas and work from initial brainchild to finished product: one could register a song demo, for example, simply by uploading the file, the creators explained to AFP.
Features including non-disclosure agreements, blockchain-based verification and biometric security measures mark the file as belonging to the artist who uploaded it.
Collaborators could then also register their own contributions throughout the creative process.
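The pair did not describe ARK’s internals, but the general pattern they outline — fingerprint a file, record who uploaded it and when, then chain later contributions to that record — can be sketched in a few lines. The snippet below is a hypothetical illustration in Python, not ARK’s actual code: it hashes an uploaded demo, appends the creator and a timestamp to a simple in-memory ledger, and links each collaborator’s entry to the previous one.

```python
# Hypothetical sketch of blockchain-style work registration; not ARK's implementation.
import hashlib
import json
import time

ledger = []  # stand-in for a decentralised ledger; here just an in-memory list


def register_work(file_path: str, creator_id: str, prev_hash: str = "") -> dict:
    """Record a fingerprint of a file so its creator can later point to this entry."""
    with open(file_path, "rb") as f:
        content_hash = hashlib.sha256(f.read()).hexdigest()  # fingerprint of the demo, not the file itself
    entry = {
        "creator": creator_id,
        "content_hash": content_hash,
        "timestamp": time.time(),
        "prev_hash": prev_hash,  # links a collaborator's contribution to the earlier entry
    }
    # The entry's own hash becomes the anchor for the next contribution in the chain.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(entry)
    return entry


# Usage: a songwriter registers a demo, then a collaborator registers a verse on top of it.
# demo = register_work("demo.wav", "artist:songwriter")
# verse = register_work("verse.wav", "artist:collaborator", prev_hash=demo["entry_hash"])
```

In a real deployment the entries would live on a shared, decentralised ledger rather than in memory, which is the property the creators point to when they talk about autonomy over intellectual property.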
ARK “challenges the notion that the end product is the only thing worthy of value,” said Bennett-Coles as his partner nodded in agreement.
The goal, Hartman said, is to maintain “a process of human ingenuity and creativity, ring-fencing it so that you can actually still earn a living off it.”
Checks and balances
Due for a full launch in summer 2025, ARK has secured funding from the venture capital firm Claritas Capital and is also in strategic partnership with BMI, the performing rights organisation.
And for Hartman and Bennett-Coles, its development has included a lot of existential soul-searching.
“I saw a quote yesterday which really sums it up: it’s that growth for growth’s sake is the philosophy of the cancer cell,” said Bennett-Coles. “And that’s AI.”
“The sales justification is always quicker and faster, but like really we need to fall in love with process again.”
He likened the difference between human-created art and AI content to a child accompanying his grandfather to the butcher, versus ordering a slab of meat from an online delivery service.
The familial time spent together — the walk to and from the shop, the conversations along the way — is “as important as the actual purchase,” he said.
In the same way, “the car trip that Jamie makes when he’s heading to the studio might be as important to writing that song as what happens in the studio itself.”
AI, they say, devalues that creative process, something they hope ARK can help restore.
It’s “a check and a balance on behalf of the human being,” Hartman said.
‘Rise out of the ashes’
The ARK creators said they decided the app must be blockchain-based — with data stored on a digital ledger of sorts — because it’s decentralised.
“In order to give the creator autonomy and sovereignty over their IP and control over their destiny, it has to be decentralised,” Bennett-Coles said.
App users will pay for ARK according to a tiered structure, they said, with levels priced according to their storage needs.
They intend ARK to stand up in a court of law as a “recording on the blockchain” or a “smart contract,” the scriptwriter explained, calling it “a consensus mechanism.”
“Copyright is a pretty good principle — as long as you can prove it, as long as you can stand behind it,” Hartman added, but “the process of registering has been fairly archaic for a long time.”
“Why not make progress in copyright, as far as how it’s proven?” he added. “We believe we’ve hit upon something.”
Both artists said their industries have been too slow to respond to the rapid proliferation of AI.
Much of the response, Bennett-Coles said, has to start with the artists having their own “death moments” similar to what he experienced years ago.
“From there, they can rise out of the ashes and decide what can be done,” he said.
“How can we preserve and maintain what it is we love to do, and what’s important to us?”