Forensic Audio Image Video Experts | Audio-Visual Forensics | UK | England

Latest Forensics News

The latest audio forensics, image forensics and video forensics news from around the world. 


There Is No Tech Solution to Deepfakes

Every day, Google Alerts sends me an email rounding up all the recent articles that mention the keyword "deepfake." The stories oscillate between suggesting deepfakes could trigger war and covering Hollywood’s latest quirky use of face-swapping technology. It’s a media whiplash that fits right in with the rest of 2018, but this coverage frequently misses what we should actually fear most: a culture where people are fooled en masse into believing something that isn’t real, reinforced by a video of something that never happened.

In the nine months since Motherboard found a user going by the username “deepfakes” posting face-swapped, algorithmically generated porn on Reddit, the rest of the world rushed straight for the literal nuclear option: if people on the internet can create fake videos of Gal Gadot having sex, then they can also create fake videos of Barack Obama, Donald Trump, and Kim Jong Un that will somehow spark an international incident leading to nuclear war. The political implications of fake videos are considered so potentially dangerous that the US government is funding research to detect them automatically.


Source: Motherboard
