Anthropic's AI training on books is fair use, judge rules. Authors are more worried than ever


Claude maker Anthropic's use of copyrighted books to train its artificial intelligence models was transformative and fair use, US District Judge William Alsup ruled on Monday. It's the first time a judge has sided with an AI company on the question of fair use, a major victory for AI companies and a blow to creators. Two days later, Meta won a partial fair use ruling of its own.

Fair use is a doctrine within US copyright law. It's a four-part test; when the criteria are met, it allows people and companies to use protected content without permission from rights holders for specific purposes, such as when writing a term paper. Tech companies say fair use exceptions are essential so they can access the huge quantities of human-created content they need to develop the most advanced AI systems.

Writers, actors and many other kinds of creators have been equally firm in saying that using their work to fuel AI is not fair use. On Friday, a group of well-known authors signed an open letter urging publishers to pledge not to replace human authors, editors and audiobook narrators with AI, and to avoid using AI in the publishing process. Signatories include Victoria Aveyard, Emily Henry, R.F. Kuang, Ali Hazelwood, Jasmine Guillory, Colleen Hoover and others.

"Our stories were stolen from us and used to train machines that, if short-sighted capitalist greed wins, will soon be generating the books that fill our libraries," the letter says. "Rather than paying authors a small percentage of the money our work earns them, someone else will be paid for a technology built on our unpaid work."

The letter is the latest in a series of battles between authors and AI companies. Writers, artists, publishers and other content owners have filed lawsuits claiming that AI companies like OpenAI, Meta and Midjourney violated their protected intellectual property in an effort to skirt costly licensing procedures. (Disclosure: Ziff Davis, CNET's parent company, filed a lawsuit against OpenAI, alleging that OpenAI infringed Ziff Davis copyrights in training and operating its AI systems.)

The authors suing Anthropic also alleged that their books were obtained illegally — that is, pirated. That brings us to the second part of Alsup's ruling, which centers on his concerns about Anthropic's methods of acquiring books. In the ruling, he writes that Anthropic co-founder Ben Mann knowingly downloaded unauthorized copies of 5 million books from LibGen, along with millions more from the Pirate Library Mirror (PiLiMi).

The judge also describes how Anthropic acquired print copies of books it had previously pirated in order to create "bibliographic metadata." An Anthropic vice president, the judge writes, was tasked with obtaining "all the books in the world" while still avoiding as much "legal/practice/business slog" as possible. That meant buying physical books from publishers to build a digital database. Anthropic destroyed and discarded millions of used books in the process: to prepare them for machine scanning, it stripped off their bindings and cut their pages down to size.

The ruling holds that Anthropic's acquisition and digitization of the printed books was fair use. But Alsup adds that creating "a permanent, general-purpose library was not itself a fair use" on Anthropic's part. He ordered a new trial concerning the pirated library.

Anthropic is one of many AI companies facing copyright claims in court, so this week's ruling is likely to send ripple effects throughout the industry. We'll have to see how the piracy claims are resolved before we know how much Anthropic could be ordered to pay in damages. But if the scales keep tipping toward more fair use exemptions for AI companies, the creative industries and the people who work in them will certainly pay a price of their own.

For more, check out our guide to understanding copyright in the age of AI.
