No authorization needed to train AI
The decision came down on Monday in the United States: an American judge ruled that the company Anthropic could train its artificial intelligence (AI) models with copyright-protected books, without asking permission from the artists. A Canadian lawyer fears that this decision, one of the first rendered in this area, will have repercussions in Canada.
According to a federal judge in San Francisco, the American company Anthropic acted within the doctrine of fair use by training Claude, its AI model, with books, whether purchased or not.
The use of the books at issue for the purpose of training Claude led to spectacular developments and was fair use.
The judge nevertheless indicated that the practice of downloading millions of pirated books to build a permanent digital library was not compatible with fair use.
In addition to downloading pirated books, the company bought books to digitize and keep, according to court documents. Anthropic aimed to assemble a library of all the books in the world in order to train its AI models at leisure, according to the judge, who found that this was a copyright violation, regardless of the objective pursued.
In his decision, the judge also described AI as among the most revolutionary technologies that many of us will see in our lifetimes.
Repercussions in Canada?
It creates a dangerous precedent for all kinds of professions, worries Christian Clavette, professor at the Faculty of Law of the University of Moncton.
A violation is being used to argue that we are creating something so important that we can set aside all copyright.
To be effective, generative AI models such as Claude, Meta AI and ChatGPT must be trained on enormous amounts of data. Even if [Anthropic] copied the texts at issue in the dispute, this violation is permitted, because in developing its AI, it generates a derivative work completely different from the original work, explains Mr. Clavette. This is where I have difficulty. I do not think that this type of use for commercial purposes is truly fair and equitable.

Christian Clavette, professor of law specializing in law and technology at the University of Moncton
Photo : Radio-Canada
Many artists, media outlets and authors have brought legal action against several of these AI companies for having used their works without permission or payment.
In Canada, the author of the popular youth series The Emerald Knights, Anne Robillard, filed an application in March for a class action against Meta, whose AI was trained through this same kind of practice.
According to Christian Clavette, the American judge's decision could be used as an argument in this case, if the class action goes ahead.
The director of legal affairs of the National Association of Book Publishers (ANEL), Stéphanie Hénault, notes that the concept of fair dealing in Canada differs from that of fair use in the United States.
Christian Clavette agrees: Among the criteria the [Canadian] court will have to weigh are whether the use was made for commercial purposes, whether the book was copied in full or only a chapter, and so on. In the United States, these steps do not exist.
This could work in favour of Canadian artists, according to the specialist. Mr. Clavette also raises the issue of moral rights in Canada, which provide for the attribution of the work to its author and the preservation of the work's integrity. Could these protect them? There are arguments from this point of view.
Anthropic is not out of the woods
Anthropic, valued at 61.5 billion US dollars ($84.5 billion Canadian), welcomed the fact that the judge recognized that using works to train large models was a source of innovation. This decision is consistent with the objective of copyright legislation to enable creativity and promote scientific progress, said a spokesperson.
The company is not out of the woods, however: Monday's decision is preliminary. A civil trial will determine whether Anthropic will have to pay damages.

Stéphanie Hénault, director of legal affairs of the National Association of Book Publishers (ANEL)
Photo: Louis-Charles Dumais
Stéphanie Hénault believes that this portion of the decision is good news for the creators named in the lawsuit, since a separate trial will be held to assess damages for their works illegally copied by Anthropic.
Christian Clavette, for his part, is convinced that the case will go to appeal, since the question is too fundamental and too important. If the case goes to the Supreme Court, it could take 5 to 10 years before it is heard, he says.
It is not the courts that will provide the solution to these questions. It will be too late; the damage will already be done.
The solution lies, according to him, in a thorough overhaul of the laws and the political will to make those changes.
I am afraid we will end up with tools that regurgitate human knowledge and expression, and profit from it. They need new works, which are by default protected by copyright and created by humans, he points out.
Lawyers and artists are watching all of this on the edge of their seats, as other decisions on copyright and artificial intelligence are expected in the coming months.
With information from Agence France-Presse