
News Summary
- Nightshade is a data poisoning technique aimed at disrupting the training process for AI models.
- The goal is to help visual artists and publishers protect their work from being used to train generative AI image synthesis models such as Midjourney, DALL-E, and Stable Diffusion.
- The open source "poison pill" tool, as the University of Chicago's press department calls it, alters images in ways invisible to the human eye that can corrupt an AI model's training process.
- It tricks AI models into misidentifying objects within the images.
- In tests, researchers used the tool to alter images of dogs in a way that led an AI model to generate a cat when prompted to produce a dog (a conceptual sketch of this style of attack follows the summary).
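
To make the mechanism concrete, here is a minimal sketch of this general style of attack: optimizing an imperceptible pixel perturbation so an image's features drift toward a different concept. This assumes PyTorch, a ResNet-18 surrogate feature encoder, and an L-infinity pixel budget; the actual Nightshade optimization, encoder, and constraints are described in the paper and will differ.

```python
# Conceptual sketch only: this is NOT the released Nightshade tool or the
# paper's algorithm. It illustrates feature-space data poisoning using
# assumed components (PyTorch, a ResNet-18 surrogate encoder, an
# L-infinity pixel budget).
import torch
import torchvision.models as models

def poison(image, target_image, encoder, eps=8 / 255, steps=200, lr=0.01):
    """Nudge `image` (e.g. a dog photo) so its features under `encoder`
    resemble those of `target_image` (e.g. a cat photo), while clamping
    the pixel perturbation to +/- eps so the change stays hard to see."""
    delta = torch.zeros_like(image, requires_grad=True)
    with torch.no_grad():
        target_feat = encoder(target_image.unsqueeze(0))
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        poisoned = (image + delta).clamp(0, 1)
        # Pull the poisoned image toward the target concept in feature space.
        loss = torch.nn.functional.mse_loss(
            encoder(poisoned.unsqueeze(0)), target_feat
        )
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # enforce the imperceptibility budget
    return (image + delta).detach().clamp(0, 1)

# Hypothetical usage: a ResNet-18 backbone stands in for whatever feature
# extractor a generative model's training pipeline might rely on.
resnet = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder = torch.nn.Sequential(
    *list(resnet.children())[:-1], torch.nn.Flatten()
).eval()
for p in encoder.parameters():
    p.requires_grad_(False)
# dog_img and cat_img are assumed to be 3xHxW float tensors in [0, 1].
# poisoned_dog = poison(dog_img, cat_img, encoder)
```

The intuition, per the researchers' tests: if enough such images labeled "dog" but feature-matched to "cat" end up in a training set, the model's learned association for "dog" shifts, and prompts for a dog start producing cats.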
On Friday, a team of researchers at the University of Chicago released a research paper outlining Nightshade, a data poisoning technique aimed at disrupting the training process for AI models. [+3970 chars]