AI-generated images often exploit artists' work without consent or compensation. It's been an ongoing battle ever since the likes of DALL·E (launched in 2021) and Midjourney (2022) took off, but creators might have found an unlikely ally in Nightshade.
This new tool gives creators the power to add invisible alterations to their artwork that will cause chaotic and unreliable results from any AI model trained on it. Developed by a team led by Ben Zhao at the University of Chicago, Nightshade is poised to tip the balance of power back in favor of artists and creators.
AI image generators have caused nothing but controversy since they burst onto the scene. Once upon a time, a photo was undeniably a photo – taken with a camera and composed by a human – but AI can now output images so realistic that it's hard to differentiate between computer-generated and organically created content.
One of the big problems with AI image generation is that AI models are trained using images found on the internet – many of which have been created by artists who have not given their permission for their work to be used. Nightshade's main purpose, as reported by VentureBeat, is to tackle that very issue.
By intentionally "poisoning" the training data, Nightshade can disrupt the functionality of future iterations of image-generating AI models, producing peculiar outcomes: dogs could become cats, cats could become mice, and mice could appear as men. OpenAI, Meta and Stability AI are among the companies whose models could be affected by the new tool – and all have previously faced lawsuits from artists alleging copyright infringement for the use of intellectual property without permission.
Nightshade capitalizes on a security vulnerability in generative AI models, which rely on vast amounts of data scraped from the internet. The tool subtly manipulates the pixels of an image so that the changes are invisible to the human eye but look entirely different to a machine-learning model, disrupting the model's understanding and causing it to interpret the image in erratic ways.
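Nightshade's actual optimization isn't detailed here, but the general class of technique – a tightly bounded adversarial perturbation – can be sketched in a few lines of PyTorch. Everything in this sketch is an assumption for illustration: the feature_extractor, the target_features and the epsilon bound are hypothetical stand-ins, not Nightshade's real implementation.

```python
# Illustrative sketch only – not Nightshade's actual code. It shows the
# general idea of a bounded adversarial perturbation: every pixel moves
# by at most `epsilon` (too little for a human to notice) while the
# image's features drift toward a different concept's features.
import torch
import torch.nn.functional as F

def poison_image(image, feature_extractor, target_features,
                 epsilon=8 / 255, steps=100, lr=0.01):
    """Return a copy of `image` whose features (under the assumed
    `feature_extractor`) approximate `target_features`, while every
    pixel stays within `epsilon` of the original."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        poisoned = (image + delta).clamp(0.0, 1.0)
        loss = F.mse_loss(feature_extractor(poisoned), target_features)
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)  # keep the change invisible
    return (image + delta).clamp(0.0, 1.0).detach()
```

The key design point is the clamp on delta: because the perturbation never exceeds a tiny per-pixel budget, the poisoned image looks identical to a person, yet a model trained on many such images learns a skewed association.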
By now, I'm guessing most have already seen the news on our new project, Nightshade. Lots of artists sharing it, but here's the article from MIT Technology Review (thank you to the wonderful @Melissahei), and a thread explaining its goals and design. https://t.co/N01ThDT5r7 – October 24, 2023
The creators of Nightshade have committed to making their tool open source, which means users will be able to customize it and develop their own versions, to help strengthen the tool. Given the vast size of data sets used by large AI models, the more poisoned images that infiltrate the data, the greater the disruption the technique can cause.
Once poisoned samples infiltrate an AI model's data set, they can cause lasting damage. Removing the corrupted samples is a laborious process, made even more complex because the poison influences not only the targeted concept but also similar concepts and tangentially related imagery (for example, "dog", "woof" and "puppy" would all be affected).
While Nightshade presents a powerful tool for creators, there is concern that data poisoning techniques could be used maliciously. That said, because powerful AI models are trained on billions of data samples, such an attack would require thousands of poisoned samples to inflict serious damage. The researchers emphasize the importance of developing defenses against these attacks, but robust solutions have yet to emerge.
There are plans to integrate Nightshade with Glaze – another tool created by Zhao and his team, which masks an artist's style and feeds AI models inaccurate data. Together, these tools represent a remarkable advance in the protection of artists' rights, giving creators the means to safeguard their work from unauthorized use by AI companies.
The open-source nature of Nightshade promises to grow its impact and empower more creators to protect their work. But as this kind of technology evolves, so do the challenges of ensuring it is used responsibly and ethically. You can read more about Nightshade here.
Check out the best photo editing software – now with lots of helpful AI tools to make editing easier.