New Nightshade Tool Makes Art into ‘Poison’ for Generative AI Image Engines

Summary:

Computer scientists at the University of Chicago have developed Nightshade, a software tool that subtly alters digitized artwork to ‘poison’ generative AI models that train on it. Nightshade changes how an AI model perceives an image without visibly altering it, addressing concerns about data scraping and the unauthorized use of artists’ works in AI model training.

Introduction:

Nightshade, developed by computer scientists at the University of Chicago, is designed to subtly alter digitized artwork so that it ‘poisons’ generative AI models that use the images for training. Built on the PyTorch framework, Nightshade modifies images in ways that drastically change how an AI model perceives them while remaining invisible to the human eye. The tool enters the ongoing debate over data scraping and the unauthorized use of artists’ works in AI model training.

Main Points:

  • Nightshade uses the PyTorch framework to subtly alter images, making them appear largely unchanged to human eyes but drastically different to AI models (a rough sketch of the idea follows this list).
  • The software is resilient to common image transformations, maintaining its protection even when images are cropped, compressed, or otherwise altered.
  • Nightshade aims to prevent unauthorized use of artists’ works in AI model training and encourages AI developers to seek licensing agreements with artists instead.
  • This tool is part of ongoing efforts to address concerns about data scraping and the resulting harm to artists’ livelihoods.
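For readers curious about how such poisoning works in principle, here is a minimal sketch of feature-space poisoning in PyTorch: optimize a small, bounded perturbation so that an image’s features match those of a different target concept, while the pixels barely change. This is an illustrative sketch under stated assumptions, not Nightshade’s actual code: the stand-in ResNet-18 encoder, the `poison` function, the epsilon budget, and the dog/cat example are all hypothetical (Nightshade targets the feature extractors of text-to-image models and applies additional perceptual constraints).

```python
# Hypothetical sketch of feature-space poisoning -- NOT Nightshade's code.
# Idea: optimize a tiny perturbation so a feature extractor "sees" a
# different concept, while pixel changes stay within an invisible budget.

import torch
import torch.nn.functional as F
from torchvision import models

# Stand-in feature extractor (assumption: Nightshade actually targets
# the image encoders used by text-to-image models).
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # keep penultimate-layer features
encoder.eval()

def poison(image, target_image, epsilon=8 / 255, steps=100, lr=0.01):
    """Nudge `image` so its features match `target_image`'s features,
    while keeping per-pixel changes within an imperceptible budget."""
    with torch.no_grad():
        target_feat = encoder(target_image)

    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        poisoned = (image + delta).clamp(0, 1)
        # Pull the poisoned image's features toward the target concept.
        loss = F.mse_loss(encoder(poisoned), target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Project back into the epsilon ball so changes stay subtle.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)

    return (image + delta).clamp(0, 1).detach()

# Example: make a "dog" image read as the "cat" target in feature space.
dog = torch.rand(1, 3, 224, 224)   # placeholder artwork tensor
cat = torch.rand(1, 3, 224, 224)   # placeholder target-concept tensor
poisoned_dog = poison(dog, cat)
```

The two moving parts are in tension: the feature-matching loss drags the model’s perception toward the target concept, while the epsilon projection keeps the pixel changes too small for a human to notice.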

Conclusion:

Nightshade allows subtle alterations to digitized artwork that ‘poison’ generative AI models trained on it. By making scraped images unreliable as training data, the tool aims to deter unauthorized use of artists’ works and to push AI developers toward licensing agreements with artists. It is part of a broader effort to address the ethical questions surrounding data scraping and the infringement of artists’ rights.


By Steven Miller

