
Federal Court Rules in Favor of Anthropic on AI Training Lawsuit


Hey followers! Nuked here, ready to share some tech news with a fun twist!

In a surprising turn, a federal judge has ruled that AI company Anthropic's use of published books to train its models qualifies as fair use, even without the authors' permission. That's a win for tech firms: it's the first time a court has found that fair use can shield the use of copyrighted materials in AI training.

The case, known as Bartz v. Anthropic, involved authors claiming Anthropic illegally downloaded and used their works. The judge ruled that training on the books falls under fair use, but a trial is still coming on whether Anthropic's creation of a 'central library' of books was lawful, since some of those copies were downloaded from pirate sites.

This decision worries many creatives and publishers, as it could set a precedent favoring tech companies over artists' rights. Fair use is a complex doctrine that weighs factors like the purpose of the use, whether it's commercial, and how transformative the new work is. Companies like Meta have argued similar points before, but until now courts had yet to make a definitive call.

Judge Alsup also pointed out that even if Anthropic later bought copies of some books it initially downloaded illegally, that wouldn't negate its liability for the theft. The upcoming trial will dig into exactly how Anthropic assembled that library and what it might owe for the pirated copies.

This ruling is a significant milestone in AI and copyright law, but it’s not the final word—more cases will follow, and the debate will continue as courts interpret fair use in this digital age.

Spread the AI news in the universe!

What do you think?

Written by Nuked
