Fake America great again

Guess what? I just got hold of some embarrassing video footage of Texas senator Ted Cruz singing and gyrating to Tina Turner. His political enemies will have great fun showing it during the midterms. Donald Trump will call him “Dancin’ Ted.”

Okay, I’ll admit it—I created the video myself. But here’s the troubling thing: making it required very little video-editing skill. I downloaded and configured software that uses machine learning to perform a convincing digital face-swap. The resulting video, known as a deepfake, shows Cruz’s distinctively droopy eyes stitched onto the features of actor Paul Rudd doing lip-sync karaoke. It isn’t perfect—there’s something a little off—but it might fool some people.
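The face-swap software described above typically trains a neural network (an autoencoder) to generate one person's face in another's pose, then composites the generated face back into the original frame. As a toy illustration of just that final compositing step (the function name and array shapes here are my own for illustration, not taken from any particular deepfake tool), a mask-weighted alpha blend might look like:

```python
import numpy as np

def composite_face(target, source_face, mask):
    """Alpha-blend a generated face region onto a target video frame.

    target, source_face: H x W x 3 float arrays with values in [0, 1]
    mask: H x W float array in [0, 1]; 1.0 means take the source pixel,
    0.0 means keep the target pixel, values in between blend the two.
    """
    alpha = mask[..., None]  # add a channel axis so the mask broadcasts over RGB
    return alpha * source_face + (1.0 - alpha) * target

# Tiny 2x2 example: swap in the source only where the mask is 1.
target = np.zeros((2, 2, 3))            # all-black frame
source = np.ones((2, 2, 3))             # all-white generated face
mask = np.array([[1.0, 0.0],
                 [0.0, 1.0]])
blended = composite_face(target, source, mask)
```

A hard 0/1 mask like this leaves visible seams, which is one reason amateur deepfakes look "a little off"; real tools soften the mask edges or use gradient-domain blending to hide the boundary.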

Source: Technology Review

There Is No Tech Solution to Deepfakes

Every day, Google Alerts sends me an email rounding up all the recent articles that mention the keyword "deepfake." The stories oscillate between suggesting deepfakes could trigger war and covering Hollywood’s latest quirky use of face-swapping technology. It’s a media whiplash that fits right in with the rest of 2018, but this coverage frequently misses what we should actually fear most: a culture where people are fooled en masse into believing something that isn’t real, reinforced by a video of something that never happened.

In the nine months since Motherboard found a guy going by the username “deepfakes” posting face-swapped, algorithmically generated porn on Reddit, the rest of the world has rushed straight for the literal nuclear option: if nerds on the internet can create fake videos of Gal Gadot having sex, then they can also create fake videos of Barack Obama, Donald Trump, and Kim Jong Un that will somehow start an international incident that leads to nuclear war. The political implications of fake videos are so potentially dangerous that the US government is funding research to automatically detect them.

Source: Motherboard

Have Your Say In The House Of Lords’ Select Committee On Science And Technology

Controversy has been raging around ISO 17025 ever since the standard was adopted for digital forensics back in October 2017. Although many people who work in the industry agree that standardisation is advisable and probably necessary if we are to keep moving forward, there have been many criticisms of ISO 17025 and its effectiveness when it comes to digital forensics.

The root of the problem seems to be that ISO 17025 was not specifically designed for digital forensics; instead, it takes the standards of ‘wet’ or traditional forensics and applies them to computing devices. This approach raises a number of issues, not least because technology advances constantly: in a field where most large apps are updated at least a couple of times per month, it becomes very difficult to properly standardise tools and methodologies.

Another concern for many people is the cost associated with accrediting a lab and keeping up with ISO 17025. Reports of accreditation costing in excess of £50,000 have made some practitioners nervous about applying.

Source: Forensic Focus Magazine