How Artists Use Deadly Nightshade On Their Photos To Thwart AI Scrapers

University of Chicago researchers this week released Nightshade 1.0, a tool built to punish makers of machine learning models who train their systems on data without first obtaining permission.

Nightshade, which takes its name from the poisonous plant family Solanaceae, is an offensive data poisoning tool, the counterpart to a defensive style-protection tool called Glaze, which The Register covered in February last year.

Nightshade poisons image files to give indigestion to models that ingest data without permission. It is intended to make those who train image-oriented models respect content creators’ wishes about the use of their work.

“Nightshade is computed as a multi-objective optimization that minimizes visible changes to the original image,” said the team responsible for the project.

“For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather handbag lying in the grass.”
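
The team has not published its method as a recipe here, but the quoted description suggests the general shape of the computation. Below is a minimal sketch of such a multi-objective optimization, not Nightshade’s actual code: the `feature_extractor` stands in for the image encoder of a text-to-image model, and the loss weights, step count, and perturbation bound are illustrative assumptions.

```python
# Minimal sketch (not Nightshade's implementation): perturb an image so a
# model's feature extractor "sees" a target concept, while a second
# objective keeps the pixel-space change visually small.
import torch
import torch.nn.functional as F

def poison_image(image, anchor, feature_extractor,
                 steps=200, lr=0.01, epsilon=0.05, alpha=1.0):
    """Return `image` plus a small perturbation whose features approach
    those of `anchor` (an image of the concept the model should see)."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        anchor_features = feature_extractor(anchor)

    for _ in range(steps):
        poisoned = (image + delta).clamp(0.0, 1.0)
        # Objective 1: pull the poisoned image's features toward the anchor's.
        feature_loss = F.mse_loss(feature_extractor(poisoned), anchor_features)
        # Objective 2: penalize visible change to the original image.
        visibility_loss = delta.pow(2).mean()
        loss = feature_loss + alpha * visibility_loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)  # hard cap on per-pixel change

    return (image + delta).clamp(0.0, 1.0).detach()

# Toy usage with a random encoder standing in for a real image encoder.
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 64 * 64, 128))
cow = torch.rand(1, 3, 64, 64)      # the artist's original image
handbag = torch.rand(1, 3, 64, 64)  # anchor image of the target concept
poisoned_cow = poison_image(cow, handbag, encoder)
```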

Nightshade was developed by University of Chicago doctoral students Shawn Shan, Wenxin Ding, and Josephine Passananti, and professors Heather Zheng and Ben Zhao, some of whom also worked on Glaze.

Described in a research paper in October 2023, Nightshade is a prompt-specific poisoning attack. Poisoning an image involves choosing a label that describes what is actually depicted (e.g. a cat) in order to blur the boundaries of that concept when the image is ingested for model training.
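
To illustrate what a prompt-specific poison set might look like at the data level, here is a hedged sketch; the `perturb_toward` helper is hypothetical (standing in for an optimization like the one sketched above), and the cat/dog pairing echoes the article’s example rather than describing the team’s actual pipeline.

```python
# Hedged sketch: assembling prompt-specific poison samples. The caption
# stays truthful ("cat") while the pixels are nudged toward another
# concept's features, blurring the model's notion of "cat" at train time.
from dataclasses import dataclass
from typing import Callable, List
import torch

@dataclass
class PoisonSample:
    image: torch.Tensor  # subtly shifted toward the target concept
    caption: str         # still honestly describes what is depicted

def build_poison_set(
    cat_images: List[torch.Tensor],
    dog_anchors: List[torch.Tensor],
    perturb_toward: Callable[[torch.Tensor, torch.Tensor], torch.Tensor],
) -> List[PoisonSample]:
    """Pair perturbed 'cat' images with truthful 'cat' captions, so a
    scraper trains on dog-like features filed under the label 'cat'."""
    return [
        PoisonSample(perturb_toward(img, anchor), "a photo of a cat")
        for img, anchor in zip(cat_images, dog_anchors)
    ]
```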

So a user of a model trained on Nightshade-poisoned images might submit a prompt for a cat and receive an image of a dog or a fish instead. Unpredictable responses of this sort make text-to-image models significantly less useful, which gives model makers an incentive to train only on data that has been offered freely.

“Nightshade can provide a powerful tool for content owners to protect their intellectual property against model trainers that disregard or ignore copyright notices, do-not-scrape/crawl directives, and opt-out lists,” the authors state in their paper.

The failure to consider the wishes of artwork creators and owners led to a lawsuit filed last year, part of a broader backlash against the permissionless harvesting of data for the benefit of AI businesses. That infringement claim, brought on behalf of several artists against Stability AI, Deviant Art, and Midjourney, alleges that the defendant firms’ Stable Diffusion models incorporated the artists’ work without permission. The lawsuit was amended in November 2023 to add Runway AI as a new defendant and continues to be litigated.

The authors caution that Nightshade does have some limitations. Specifically, images processed with the software may look subtly different from the original, particularly artwork that uses flat colors and smooth backgrounds. They also observe that techniques for undoing Nightshade may be developed, though they believe the software can be adapted to keep pace with countermeasures.

Matthew Guzdial, assistant professor of computer science at the University of Alberta, said in a social media post: “This is cool and timely work! But I worry it’s being overhyped as a solution. It only works with CLIP-based models and, per the authors, would require 8 million images to be ‘poisoned’ to have a significant impact on generating similar images for LAION models.”

Glaze, which reached 1.0 last June, now has a web version and is currently at its 1.1.1 release. It modifies images so that models trained on those images do not replicate the artist’s visual style.

Style mimicry, available through closed text-to-image services such as Midjourney and through open-source models such as Stable Diffusion, is possible simply by prompting a text-to-image model to produce an image in the style of a specific artist.

The research team believes that artists should have a way to prevent their visual style from being captured and duplicated.

“Style mimicry produces a number of harmful outcomes that may not be obvious at first glance,” the boffins said. “For artists whose styles are intentionally copied, not only do they see losses in commissions and basic income, but low-quality synthetic copies scattered online dilute their brand and reputation. Most importantly, artists associate their styles with their very identity.”

They liken style mimicry to identity theft and argue that it discourages aspiring artists from creating new work.

The team recommends artists use both Nightshade and Glaze. Currently, the two tools must be downloaded and installed separately, but an integrated version is being developed. ®