It’s raining lawsuits as Meta now faces the music from French publishers who claim their works are being used without authorization to train its generative AI models. The growing use of unauthorized data by tech giants, including OpenAI, is raising concerns everywhere. This plunder is what we call “AI Loot,” and the makers of AI tools are deep in it.
French publishers have decided to take the issue head-on and have vowed to take Meta to court for using their works to feed its AI models. Several trade groups are involved in the lawsuit, including the National Publishing Union (SNE), which represents book publishers and said in a joint statement that their books, articles, and other material have been fed to AI models without their consent.
According to a TechCrunch report, two more trade groups, the National Union of Authors and Composers and the Société des Gens de Lettres (SGDL), also deemed the lawsuit necessary to stop the AI-looting of their copyrighted work. For them, it is not just their work that is being stolen in the name of technological advancement; it is their cultural heritage. They call the practice unethical and demand that all of their material be completely removed from the data directories Meta has created.
Vincent Montagne, president of the SNE, accused Meta of “noncompliance with copyright and parasitism.”
Beyond being trained on unauthorized data, these AI models can then generate their own books, articles, and videos. Users take these fake books, publish them under their own names, and often sell them. Before long, we may no longer know whose work belongs to whom. It is the height of plagiarism.
CNN reported that similar lawsuits were filed against OpenAI in California federal court in 2023 to curb the same practice. There is already legislation in place: the European Union’s Artificial Intelligence Act requires AI models to comply with copyright law and be transparent about their use of material produced by original authors, producers, and publishers. The question is how it will be enforced.
Numerous protests have been held globally, white papers have been published, and many groups are pushing governments and legislators to stop this plunder. One example is the “silent album” released by more than 1,000 musicians who fear that uncontrolled AI models will destroy their creative flair. The fact is that AI models need to consume all the data they can get to be trained; it is their brain food, so how could they not use it? However, if we want humans to keep producing original work, there has to be legislation stopping the illegal use of people’s data.
AI Loot:
- Unauthorized use of data to train AI models.
- Data plunder by tech companies to produce fake, anonymous content through AI models.
A term coined by TECHi writer Munazza Shaheen.