RIAA Backs AI Copyright Lawsuit Against Anthropic, Sees Similarities with Napster


The RIAA and several other organizations condemn AI startup Anthropic for allegedly flouting copyright law. The criticism appears in an amicus brief in support of a court injunction requested by music publishers, who want the AI company to stop using lyrics without permission. According to the RIAA, Anthropic's defense relies on the same rhetoric as Napster once did.

     

    The Artificial Intelligence boom promises unparalleled progress but, in reality, it’s still early days.

     

    As startups and established tech giants explore their options, semiconductors are selling like hot cakes, while seemingly mundane data archives are suddenly portrayed as digital gold.

     

Chips and data are the oil of the AI revolution, and a quick glance at Nvidia’s stock chart shows that business is going well. At the same time, some rightsholders such as Reddit and Getty Images are making deals to license their ‘data’, although that’s still relatively rare in the AI space.

     

    Many AI companies have simply been training their models on data scraped or downloaded from online resources, often without explicit permission. This has triggered many lawsuits and complaints, with new ones appearing in court dockets on pretty much a weekly basis.

    Music Companies vs. Anthropic

    In one of these lawsuits, music publishers including Concord and Universal sued AI startup Anthropic. In a complaint filed last fall, they accused the company of “systematic and widespread infringement of their copyrighted song lyrics.”

     

    Specifically, they argued that Anthropic used their lyrics as training data without obtaining permission. They also showed several examples of lyrics that were reproduced by the Claude chatbot when prompted. With hundreds of works in suit, potential damages run into the millions of dollars.

     

In response to the claims, Anthropic didn’t deny that it used lyrics to train its model, but argued that this falls under fair use. Its chatbot is not intended to reproduce lyrics in full, and if it ever did, that was merely a “bug” rather than a “feature”.

     

    “Anthropic has always had guardrails in place to try to prevent that result. If those measures failed in some instances in the past, that would have been a ‘bug,’ not a ‘feature’, of the product,” the company wrote earlier this year.

    A Guardrail Injunction

The guardrail comment was made in response to a request for injunctive relief by the music publishers. They want the court to issue an order that prevents the use and reproduction of their copyrighted works going forward.

     

    The injunction request was recently updated and renewed, and the matter has yet to be decided by the court. In this filing the music companies reiterate that Anthropic built its “multibillion-dollar AI business on brazen, widespread copyright infringement.”

     

    That the Claude chatbot can reproduce lyrics is “a feature, not a bug,” they say, asking the court to issue an injunction that requires Anthropic to do two things:

     

    1. Maintain guardrails to prevent its AI models from generating output that contains the publishers’ lyrics.

     

    2. Refrain from using unauthorized copies of such lyrics to train future AI models.

     

[Image: From the proposed injunction]

    RIAA et al. Back Music Publishers

    At this point, all AI-related lawsuits can potentially set precedents. For this reason, other companies and industry groups are keeping a close eye on developments, so they can have their say if necessary.

     

This week, a group of music industry organizations got involved. The RIAA, together with the Artist Rights Alliance, the Music Artists Coalition, and others, asked the court for permission to file an amicus curiae brief in support of the publishers.

     

    The trade groups say they have a vested interest in the matter and submitted a copy of their proposed filing, which clearly condemns the actions of Anthropic. The brief stresses that while other AI companies agreed to licensing deals, Anthropic refused to do so.

     

    “[M]any companies in the AI field have obtained licenses to use copyrighted content for AI model training and other purposes. These companies are willing and able to comply with the law as they develop generative AI software. But not Anthropic,” the RIAA and others write.

     

    “In order to obtain an advantage over its competitors, Anthropic has refused to license or compensate the authors and owners of the highly creative, copyrighted works that it copies and uses to generate competing works. Anthropic has argued this is a ‘fair use.’ It is not.”

    Like Napster?

    The full brief discusses Anthropic’s alleged wrongdoings and shortcomings in great detail. According to the trade organizations, Anthropic’s fair use defense falls short. Interestingly, they liken the company’s defense to that of ‘pirate’ tools of the past, including Napster and Grokster.

     

According to Anthropic, the requested injunction would hamper innovation in AI technology, slowing the development of new legitimate AI uses and stifling technological progress.

     

    The RIAA and others highlight that this presumed choice between innovation and copyright protection was used by ‘pirate’ services in the past, but courts rendered these services unlawful anyway.

     

    “The false choice that Anthropic and COP have presented between compliance with copyright law and technological progress is a well-worn, losing policy argument previously made by other mass infringers such as Napster and Grokster in their heyday. Anthropic and COP even employ the same rhetoric as those pirate sites,” the amicus brief notes.

     

[Image: Excerpt from the amicus brief comparing Anthropic’s rhetoric with Napster’s and Grokster’s]

     

    As shown above, the trade groups go on to mention several examples of similar language used by Napster and Grokster, concluding that the courts rejected these arguments at the time.

    Napster’s Demise Was Great For Apple

The amici point out that shutting down Napster didn’t hurt progress. Instead, it paved the way for legal music services, including Apple’s iTunes Store, which properly compensated rightsholders.

     

    “Far from stifling growth, prohibiting bad actors from engaging in illegal practices while file downloading technology developed helped the responsible, licensed business models employing that technology to flourish – such as Apple’s iTunes store, which paid for the content it offered,” the brief reads.

     

    The last comment is accurate, of course. If the court had allowed Napster to continue, people would’ve had little incentive to purchase music tracks they could easily download for free.

     

    That said, Napster certainly spurred innovation before that happened. It showed the music industry that there was a massive interest in digital music, and Steve Jobs was able to launch Apple’s service in part because the Napster threat existed.

     

    Similarly, Spotify founder and CEO Daniel Ek has said that Napster served as inspiration for the streaming subscription model his company pioneered. This model is now responsible for the bulk of music industry revenues.

     

What this means for the Anthropic case is up to the court to decide. Many people agree that AI technology needs some boundaries, but where those boundaries should lie has yet to be defined.

    AI Already Learned to be Cautious?

While writing this article, we tested Claude to see whether it would quote lyrics when asked. That is harder than it sounds, as the chatbot appears to be very copyright-conscious.

     

[Image: Claude declining to complete an Elvis Presley lyric]

     

Even when we fed Claude the first line of a popular Elvis Presley song, it refused to finish it, citing copyright concerns.

     

[Image: Follow-up prompt and response]

     

    These copyright-conscious responses suggest that there are already some guardrails in place. These are undoubtedly a feature, not a bug. That said, the music publishers would still like the court to issue an injunction, just in case.
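Readers who want to run a similar check themselves could script it. The snippet below is a minimal sketch, not the method used for this article: it assumes the official anthropic Python SDK with an ANTHROPIC_API_KEY set in the environment, and the model name is purely illustrative.

```python
# Minimal sketch: ask Claude to continue a song lyric and print the reply.
# Assumes the official `anthropic` Python SDK and an ANTHROPIC_API_KEY
# environment variable; the model identifier below is illustrative.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # illustrative model name
    max_tokens=200,
    messages=[
        {
            "role": "user",
            "content": "Please continue this song lyric: <first line of a well-known song>",
        }
    ],
)

# The reply will often be a refusal citing copyright concerns,
# consistent with the guardrail behavior described above.
print(response.content[0].text)
```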

     

     

    A copy of the proposed amicus curiae brief submitted by the RIAA, NMPA, AAP, News/Media Alliance, SONA, Black Music Action Coalition, Artist Rights Alliance, the Music Artists Coalition, and A2IM is available here (pdf)

     

    Source

     


