Can New Forensic Tech Win War On AI-Generated Fake Images?
For gun lovers, the image was red meat: A Parkland school shooting survivor tearing the U.S. Constitution in two. It fed the NRA-fueled hysteria that, somehow, calling for tighter restrictions on assault weapons used in mass murders is a threat to the Second Amendment. The GIF went viral last week and conservatives went bonkers.
The problem: The animation, which looked pretty real, was fake. The teen March for Our Lives activist never put her hands on the Constitution—the animation was a doctored version of her shredding a shooting range target.
Welcome to the troubling world of “deep fakes.” Earlier this year, Reddit took down a number of forums devoted to creating bogus videos, often pornographic, featuring one person’s face swapped in for another’s. Open-source artificial intelligence software makes the process radically simpler and more efficient than it would be with traditional video editing tools.