Artists Take a ‘Poisonous’ Approach in Their Fight Against Generative AI

Artists are fighting back against generative artificial intelligence with a new “data poisoning” tool called Nightshade. The program lets artists add invisible changes to the pixels of their artwork before uploading it, according to the MIT Technology Review. From there, Nightshade tricks image-generating AI models like Midjourney, DALL-E, and Stable Diffusion: models trained on enough poisoned images learn corrupted associations and begin producing skewed outputs. For […]
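The core idea behind the tool is that pixel-level changes too small to notice can still alter what a model learns from an image. A minimal sketch of that idea is below; it uses random low-amplitude noise purely for illustration (Nightshade itself computes optimized adversarial perturbations, not random ones, and `poison_image` and its `strength` budget are hypothetical names for this example):

```python
import numpy as np

def poison_image(pixels: np.ndarray, strength: int = 2) -> np.ndarray:
    """Add an imperceptible per-pixel perturbation within +/- `strength`.

    Illustrative only: real data-poisoning tools like Nightshade use
    carefully optimized perturbations, not random noise.
    """
    rng = np.random.default_rng(0)
    noise = rng.integers(-strength, strength + 1, size=pixels.shape)
    return np.clip(pixels.astype(int) + noise, 0, 255).astype(np.uint8)

# Toy 4x4 gray image standing in for an artwork.
image = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = poison_image(image)

# The change stays within the invisible budget per channel.
max_delta = int(np.abs(poisoned.astype(int) - image.astype(int)).max())
print(max_delta)
```

To a human viewer the poisoned copy looks identical to the original, but a model that scrapes many such images for training ingests systematically shifted pixel data.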

This is a companion discussion topic for the original entry at