News  /  October 9, 2025

Ray Seilie Examines Anthropic’s Music Copyright Legal Battle in Law.com

Ray Seilie recently spoke to ALM’s Law.com regarding the copyright infringement claims filed against Anthropic by the music publishers of artists such as The Rolling Stones, Michael Jackson, Beyoncé, and Harry Styles. The initial complaint, filed in October 2023, alleges that Anthropic trained its large language model Claude on copyrighted lyrics extracted from third-party data sets. Following a recent hearing, the claims have been allowed to proceed in federal court.

Music publishers including Concord Music Group, Capitol CMG, Universal Music Corp., and ABKCO Music, Inc. claim that, in training Claude, Anthropic extracted song lyrics using a tool or algorithm called “Newspaper” that removed copyright management information rather than keeping it in place. Additionally, according to an order issued by U.S. District Judge Eumi Lee allowing the case to proceed, Anthropic programmed “guardrails” to flag user prompts that could trigger copyright infringement.

Ray notes that these “guardrails” signaled to the court that Anthropic was aware of potential legal issues, explaining, “What the court basically said is ‘Anthropic, you know that people are trying to infringe copyrights when they use your product. That’s why you’ve implemented some guardrails.’”

While Anthropic moved to dismiss the claims, Judge Lee’s order holds that they are plausible and that the case can continue, a decision that Ray says is in line with other copyright cases against AI companies. Ray explains, “What they’re saying is you can use copyrighted material to train your LLM, but if your LLM allows users to generate copyrighted material, then you’re violating a copyright in a very basic way.”

“It’s consistent with the distinction between the use of copyrighted material for inputs versus the generation of copyrighted material as outputs that the courts have been suggesting is going to be the key sort of question in these AI-related cases,” he continues.

Read the full article in Law.com (subscription required).