
AI models can be made to "generate cats from dog prompts": hackers demonstrate Nightshade, a tool for "poisoning" training data sets


CTOnews.com, October 25 — The copyright status of training data for large AI models has long been a headache for the industry. A few days ago, hackers demonstrated a tool called Nightshade that slightly modifies an image without changing its look and feel; if an AI model uses these "poisoned" images during training, its output can be corrupted.

▲ Image source: Arxiv

Nightshade is a poisoning attack that starts from prompts. Its authors claim the method is "simpler than current backdoor attacks that add trigger words to prompts" and requires no access to the model's training or deployment.

Nightshade works by subtly altering an image's content. Once the modified images enter an AI model's training data, the model as a whole can be corrupted. The hackers chose Stability AI's Stable Diffusion V2, SDXL, and DeepFloyd to verify the attack.
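
Conceptually, such an attack perturbs an image so that it still looks like its original subject to a human but sits near a different concept in a model's feature space. The following is a minimal sketch of that idea, not the actual Nightshade implementation; the choice of CLIP encoder, the step count, the perturbation budget `eps`, and the helper names are all illustrative assumptions.

```python
# Minimal sketch of feature-space image poisoning, in the spirit of the
# attack described in the article. NOT the Nightshade code: the CLIP
# model, step count, and perturbation budget `eps` are assumptions.
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").eval()
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
for p in model.parameters():          # the encoder itself stays frozen
    p.requires_grad_(False)

def poison(dog_img, cat_img, eps=8 / 255, steps=200, lr=0.01):
    """Nudge `dog_img` so its embedding drifts toward `cat_img`'s while
    the per-pixel change stays within `eps` (the look-and-feel constraint)."""
    dog = processor(images=dog_img, return_tensors="pt")["pixel_values"]
    cat = processor(images=cat_img, return_tensors="pt")["pixel_values"]
    target = model.get_image_features(pixel_values=cat)
    delta = torch.zeros_like(dog, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        feat = model.get_image_features(pixel_values=dog + delta)
        # Pull the poisoned image's features toward the target concept.
        loss = 1 - torch.cosine_similarity(feat, target).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        delta.data.clamp_(-eps, eps)  # keep the edit visually negligible
    return (dog + delta).detach()
```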

Tests showed that only a small number of "poisoned samples" are needed to disrupt a text-to-image model. Using fewer than 100 modified "dog photos", the hackers polluted the concept of "dog" that the SDXL model had already learned, causing it to produce a picture of a cat when prompted to "generate a picture of a dog".
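
To make the scale concrete, the poisoned subset could be assembled along these lines; this is a hedged sketch reusing the hypothetical `poison` helper above, and `dog_images`, `cat_reference`, and the caption text are invented for illustration.

```python
# Assemble the small poisoned subset the article describes: fewer than
# 100 perturbed "dog" images, each still captioned as a dog so a scraper
# ingests them as ordinary training pairs. `poison`, `dog_images`, and
# `cat_reference` are the hypothetical names from the sketch above.
clean_pairs = []  # stands in for the much larger clean training corpus
poisoned_subset = [
    (poison(img, cat_reference), "a photo of a dog")
    for img in dog_images[:100]
]
# Per the article's tests, this subset alone was enough to flip
# SDXL's "dog" concept toward "cat".
train_pairs = clean_pairs + poisoned_subset
```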

▲ Image source: Arxiv

In addition, a Nightshade attack is not confined to a single "concept": although the hackers only used "dog photos" in an attempt to corrupt the model's notion of "dog", the output of the model as a whole ended up degraded.

▲ Image source: Arxiv

CTOnews.com also found that, according to the hackers, images "poisoned" by Nightshade are difficult to identify, because the tool mainly perturbs the "feature space" of the training data.
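
That detection difficulty can be illustrated by comparing the two spaces directly. The snippet below is a hedged sketch building on the hypothetical `model`, `processor`, and `poison` names introduced above.

```python
# Why the poison is hard to spot: the pixel-space change is tiny by
# construction, while the feature-space shift is large. Reuses the
# hypothetical `model`, `processor`, and `poison` from the sketch above.
clean = processor(images=dog_img, return_tensors="pt")["pixel_values"]
poisoned = poison(dog_img, cat_img)

pixel_shift = (poisoned - clean).abs().max().item()   # bounded by eps
f_clean = model.get_image_features(pixel_values=clean)
f_poisoned = model.get_image_features(pixel_values=poisoned)
feature_sim = torch.cosine_similarity(f_clean, f_poisoned).item()

print(f"max pixel change: {pixel_shift:.4f}")   # small: looks unchanged
print(f"feature cosine:   {feature_sim:.3f}")   # low: a different concept
```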

The hackers position Nightshade as a tool for content creators and rights holders: a powerful weapon against AI practitioners who "do not respect copyright notices" or who "deliberately bypass do-not-scrape/crawl opt-outs".
