Data Poisoning Tool for artists to fight back against generative AI? (MIT Tech Review)

Here we go: Ben Zhao, a professor at the University of Chicago, led a team that created Nightshade, a tool that lets artists subtly alter the pixels in their images so that, if the images are ingested into an AI training set, they cause the resulting model to “break.”
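The article does not spell out Nightshade's actual algorithm, which optimizes perturbations to mislead specific models. As a loose, purely illustrative sketch of the general idea — nudging each pixel by a small, bounded amount that a human viewer would barely notice — consider:

```python
import random

def perturb_pixels(pixels, epsilon=4):
    """Toy illustration only: shift each 8-bit pixel value by a random
    amount in [-epsilon, +epsilon], clamped to the valid 0-255 range.
    (Nightshade itself computes targeted perturbations, not random noise.)"""
    return [max(0, min(255, p + random.randint(-epsilon, epsilon)))
            for p in pixels]

original = [0, 64, 128, 255]       # hypothetical grayscale pixel values
poisoned = perturb_pixels(original)
```

To the eye, a shift of a few intensity levels per pixel is nearly invisible, but a targeted version of such changes can systematically corrupt what a model learns from the image.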

From MIT Tech Review

https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai

This is very much like the antibiotic / antibiotic-resistance arms race that humans wage with bacteria:

  • Growth of bacterial infections and deadly infectious diseases
  • Here’s a new antibiotic developed by humans
  • Successful treatment of bacterial infections
  • Bacteria evolve resistance to that antibiotic
  • New bacterial infections by resistant bacteria
  • New antibiotic developed by humans
  • etc

I will be very interested to see where this journey leads.

Author: CT Lin

CMIO, UCHealth (Colorado); Professor, University of Colorado School of Medicine
