Music publishers ask court to halt AI company Anthropic's use of lyrics. Earlier this week, a group of music publishers including Universal Music Publishing Group, Concord Music Publishing, and ABKCO Music & Records filed a copyright infringement lawsuit against AI startup Anthropic, asking a federal court to bar the company from copying lyrics written by songwriters the publishers represent.
The lawsuit centers on Anthropic's conversational AI assistant Claude, which can generate natural conversations by referencing pop culture, current events, and other topics. According to the lawsuit, Claude has reproduced song lyrics in its conversations, which the publishers say infringes their copyrights.
The Details of the Music Publishers’ Lawsuit Against Anthropic
The lawsuit was filed on October 18, 2023 in federal court in Nashville, Tennessee. The plaintiffs include Universal Music Publishing Group, Concord Music Publishing, and ABKCO Music & Records. These companies hold the rights to a massive catalog of popular songs.
According to the complaint, Anthropic's conversational agent Claude has reproduced lyrics from numerous copyrighted songs. The publishers argue that by having Claude recite segments of lyrics, Anthropic is infringing the copyrights in those works.
The lawsuit states:
“Without any license or authorization, Anthropic has been brazenly misappropriating the valuable intellectual property of songwriters on a staggering scale.”
The publishers point to instances of Claude reciting lyrics from songs like Katy Perry's "Roar" and Gloria Gaynor's "I Will Survive." While short, these lyric excerpts are protected under copyright law and require a license for use, the plaintiffs argue.
Additionally, the publishers claim Anthropic’s use of the lyrics does not fall under fair use protections, because they are not sufficiently transformative and negatively impact the market value of the copyrighted works.
As a remedy, the publishers are asking the court to prohibit Anthropic from copying or publicly displaying any of the copyrighted works they represent, and to award damages for past instances of infringement.
The Growth of AI Has Sparked Copyright Concerns
This lawsuit highlights the growing tensions between copyright law and rapidly advancing AI technologies, especially natural language systems.
As models like Claude are trained on massive datasets of text and media to learn how to generate human-like writing, they inevitably ingest millions of copyrighted works without licenses. The capabilities of systems like Claude, and of other large language models such as Google's LaMDA, are built on this foundation of data.
While AI researchers claim transformative fair use in training models, media companies see this as brazen copyright infringement. As AI technologies continue to advance, these questions around copyright protections will gain even more urgency.
Similar lawsuits have already targeted image generation: Getty Images has sued Stability AI over its Stable Diffusion model, and artists have filed class actions against other text-to-image companies. But this appears to be the first suit targeting a conversational AI system's use of lyrics specifically.
Anthropic Says Its Use of Lyrics Falls Under Fair Use
In response to the lawsuit, Anthropic has claimed its use of song lyrics is protected under fair use, as the samples are small and transformative.
A statement from the company said:
“We believe that Anthropic’s use of lyrics falls squarely within the fair use doctrine and are disappointed that these publishers resorted to litigation rather than engage with us directly.”
The doctrine of fair use allows limited use of copyrighted works for purposes like criticism, commentary, news reporting, teaching, and research. Anthropic will likely argue its agent’s brief references to lyrics were sufficiently transformative in the context of AI training and demonstration.
Additionally, Anthropic will likely point out that its use of the lyrics did not adversely impact the market value of the songs. If anything, brief lyrical references could drive interest in the full songs.
The company will also likely emphasize, as a good-faith gesture, that it has stopped surfacing any disputed lyrical excerpts while the case proceeds. But the crux of the case will come down to whether Anthropic's past use of lyrics without licenses qualifies as fair use or copyright infringement.
Lyrics Have Unique Copyright Protections
At the center of this dispute is the unique nature of lyrics in copyright law. The publishers argue lyrics warrant even stronger protections than other types of work.
Unlike run-of-the-mill books or articles, the companies claim short excerpts of lyrics can be highly recognizable and valuable. Even a line or two can uniquely identify a song.
The publishers can point to past music sampling cases that reinforced strong protections for even brief excerpts. And statutory rates set by the Copyright Royalty Board value lyrics at a premium relative to other published works.
But Anthropic will likely argue that brief, context-shifting lyrical references should be treated differently than outright lyric sampling in new songs. The outcome may turn on the context and length of the specific examples at issue.
This speaks to the core tension: as AI grows more human-like, is any training use of copyrighted material fair game? Or are limits needed to balance public benefit against creators' financial incentives? This suit will test those boundaries.
Perspectives: A Necessary Fight Against Exploitation? Or Stifling Innovation?
The polarized perspectives on this lawsuit from publishers and the tech industry exemplify the debate around IP protections in the age of AI.
To music publishers, this suit is a necessary defense against the open exploitation of songwriters' creations. Allowing AI unrestrained use of copyrighted lyrics, no matter how advanced it becomes, could disincentivize future creative work and artistry. Strict protections are key to balancing public benefit and financial incentive.
But to the tech world, this appears as a typical case of old media companies stifling emerging innovation they don’t fully control. Artificial intelligence promises transformative societal benefits. Overly stringent copyright limitations on training data could substantially dampen AI progress, similar to past fights over file-sharing.
At the crux is the question of who owns culture, and who controls the future. The courts will ultimately need to weigh and balance each side’s priorities. Music publishers want to protect individual works. But AI researchers argue they work in service of progressing an entire field for the common good.
What This Case Means for the Future of AI Development
No matter the outcome, this lawsuit will likely have resounding impacts on AI research and development across industries.
If the court sides with the music publishers, it could significantly restrict the training datasets AI companies can utilize without costly licensing deals. Research budgets would balloon as companies rush to properly clear copyrights. Some may shift strategies entirely to avoid litigation.
But if the court sides with Anthropic’s fair use arguments, it could essentially open the floodgates for existing media to be used freely in AI training. Copyright holders would lose some ability to control or profit from their work’s contribution to AI systems.
In the latter scenario, development would accelerate, but media companies would push harder for reforming IP laws in their favor. Either way, the courts will be tasked with setting influential precedents in applying current copyright law to such novel technological issues.
This urgency will only increase as AI grows more advanced. Models continuously build off previous breakthroughs in capabilities. Restricting foundational training data could slow progress across applications like natural language processing. But unchecked data usage also requires balancing against costs to copyright holders.
How courts interpret fair use and public benefit for AI technologies will shape norms and incentives across industries. And with rivals like Google, Microsoft, Meta and Baidu racing in AI, the global implications are immense. This music copyright case will reverberate far beyond its specific technical merits.
Striking the Right Balance for Innovation and Individual Rights
Artificial intelligence promises to revolutionize our economy, culture, and society. But fully realizing that potential requires grappling with thorny legal questions around using copyrighted data for AI training models.
This lawsuit between music publishers and Anthropic cuts to the core of striking the right balance between incentivizing creation and unleashing next-generation innovation. Neither strict copyright lockdown nor unchecked AI data usage provide an ideal path forward.
Through thoughtful, nuanced jurisprudence that weighs the interests of all stakeholders, courts can set influential precedents on this issue. But this likely requires updating policy frameworks that predate modern AI, to ensure copyright law adapts alongside technology.
The full impacts of this lawsuit and others that will follow are difficult to predict. But they provide opportunities to develop balanced legal doctrines around data rights that allow artificial intelligence to keep rapidly improving within ethical bounds.
Both creators and consumers ultimately stand to benefit tremendously from AI breakthroughs. If done judiciously, settling these copyright questions can help usher in that future, and ensure everyone shares in the dividends. Getting there will require cases like Anthropic as stepping stones toward legal clarity.
Frequently Asked Questions

What is the lawsuit about?
A group of music publishers sued AI startup Anthropic, alleging Anthropic’s conversational agent Claude infringes copyright by referencing lyrics when conversing.
Who are the plaintiffs in the lawsuit?
Major music publishers including Universal Music Publishing Group, Concord Music Publishing, and ABKCO Music & Records filed the suit against Anthropic.
What lyrics did Claude reference?
The lawsuit cites Claude reciting brief lyrics from songs like "Roar" by Katy Perry and "I Will Survive" by Gloria Gaynor.
How does Anthropic defend itself?
Anthropic claims its brief lyrical references are protected fair use, as they are transformative and do not adversely impact the market value of songs.
Why are lyrics treated differently under copyright?
Lyrics have unique protections since even short samples can be highly recognizable and valuable pieces of songs.
Does this relate to other AI copyright lawsuits?
Yes, it follows similar image copyright disputes against AI systems like Stable Diffusion. But this directly targets AI’s use of song lyrics.
How could this impact AI development?
If Anthropic loses, it could restrict training data for AI without licensing deals. If Anthropic wins, it may open doors to unfettered usage of copyrighted content in AI training.
What are the key arguments on each side?
Publishers want to protect creators’ rights. Anthropic argues brief lyric usage serves the greater goal of advancing AI capabilities.
Does current copyright law adequately address AI issues?
Likely no, as modern AI capabilities were not envisioned when most copyright laws were written. Updates are likely needed.
Who has the stronger fair use case?
There are arguments on both sides. The transformative nature of Anthropic’s usage helps its case, but the value of lyrics hurts it.
How could this be settled outside of court?
The parties could negotiate a licensing deal for limited lyric usage. But the publishers seem to want a court precedent.
What are the possible precedents this could set?
It could firmly restrict AI’s use of copyrighted content, or greatly expand it if fair use is applied broadly.
Does this matter beyond lyrics?
Very much so. This case could influence AI’s ability to use all types of copyrighted data, including images, videos, articles, etc.
Who is likely to prevail in court?
There are reasonable arguments on both sides, so the outcome is highly uncertain. It may require appeals to be fully settled.
When will a judgment occur?
The case will likely take well over a year to litigate. But expect appeals regardless, so final clarity could take several years.