UMG Wants Immediate Court Order Blocking Anthropic From Using Lyrics to Train AI



Universal Music Group (UMG) wants a federal judge to immediately block artificial intelligence company Anthropic PBC from using copyrighted music to train future AI models, warning that the “damage will be done” by the time the case is over.

A month after UMG sued Anthropic for infringement over its use of copyrighted music to train its AI models, the music giant on Thursday demanded a preliminary injunction that would prohibit the AI firm from continuing to use its songs while the case plays out in court.

The company warned that denying its request would allow Anthropic “to continue using the Works as inputs, this time to train a more-powerful Claude, magnifying the already-massive harm to Publishers and songwriters.”


“Anthropic must not be allowed to flout copyright law,” UMG’s lawyers wrote. “If the Court waits until this litigation ends to address what is already clear—that Anthropic is improperly using Publishers’ copyrighted works—then the damage will be done.”

“Anthropic has already usurped Publishers’ and songwriters’ control over the use of their works, denied them credit, and jeopardized their reputations,” the company wrote. “If unchecked, Anthropic’s wanton copying will also irreversibly harm the licensing market for lyrics, Publishers’ relationships with licensees, and their goodwill with the songwriters they represent.”

UMG filed its lawsuit Oct. 18, marking the first major case in what is expected to be a key legal battle over the future of AI music. Joined by Concord Music Group, ABKCO and other music companies, UMG claims that Anthropic — valued at $4.1 billion earlier this year — is violating copyrights en masse by using songs without authorization to teach its AI models how to spit out new lyrics.

“In the process of building and operating AI models, Anthropic unlawfully copies and disseminates vast amounts of copyrighted works,” lawyers for the music companies wrote. “Publishers embrace innovation and recognize the great promise of AI when used ethically and responsibly. But Anthropic violates these principles on a systematic and widespread basis.”


AI models like the popular ChatGPT are “trained” to produce new content by feeding them vast quantities of existing works known as “inputs.” Whether doing so infringes the copyrights to that underlying material is something of an existential question for the booming sector, since depriving AI models of new inputs could limit their abilities. Content owners in many sectors — including book authors, comedians and visual artists — have all filed similar lawsuits over training.

Anthropic and other AI firms believe that such training is protected by copyright’s fair use doctrine — an important rule that allows people to reuse protected works without breaking the law. In a filing at the Copyright Office last month, Anthropic previewed how it might make such an argument in UMG’s lawsuit.

“The copying is merely an intermediate step, extracting unprotectable elements about the entire corpus of works, in order to create new outputs,” the company wrote in that filing. “This sort of transformative use has been recognized as lawful in the past and should continue to be considered lawful in this case.”

But in Thursday’s motion for the injunction, UMG and the music companies sharply disputed such a notion, saying plainly: “Anthropic’s infringement is not fair use.”

“Anthropic … may argue that generative AI companies can facilitate immense value to society and should be excused from complying with copyright law to foster their rapid growth,” UMG wrote. “Undisputedly, Anthropic will be a more valuable company if it can avoid paying for the content on which it admittedly relies, but that should hardly compel the Court to provide it a get-out-of-jail-free card for its wholesale theft of copyrighted content.”

A spokesperson for Anthropic did not immediately return a request for comment on Friday.


