WHO scientists used Photoshop to doctor images in cancer research

Researchers affiliated with WHO’s International Agency for Research on Cancer (IARC) have been exposed for manipulating images in multiple research papers they have published over the years.

Between 2005 and 2014, Massimo Tommasino, head of the Infections and Cancer Biology Group at IARC in Lyon, France, and his former colleague there, Uzma Hasan, published multiple papers containing doctored images.

Manipulated images in studies can be anything from microscopic views of cells or tissues to glowing gels indicating chemical concentrations, or even graphical representations of data. Tweaking an image in a research paper can have dire consequences.


Source: The Next Web

SCARILY REALISTIC FAKE VIDEOS OF TRUMP & PUTIN MAY SPARK WWIII, PROFESSOR WARNS

User-friendly face-swapping video-editing software that makes it possible to create fake videos appearing to feature real politicians is reaching the point of becoming a global security threat, politicians and academics warn.

‘Deepfake’, a portmanteau of ‘deep learning’ and ‘fake’, is an artificial intelligence-based image synthesis technique that superimposes source images or videos onto other images or videos to create an ultra-realistic end result.
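At a high level, this kind of face swap is commonly built on an autoencoder with a shared encoder and one decoder per identity: a frame of person A is encoded into a latent code capturing pose and expression, then decoded with person B's decoder to produce B's appearance in A's pose. The sketch below is purely illustrative and hypothetical, using plain NumPy linear maps in place of the trained neural networks a real tool would use; all names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a flattened 8x8 grayscale "face" -> 16-dim latent code.
FACE_DIM, LATENT_DIM = 64, 16

# Shared encoder: in a real system this learns identity-independent structure
# (pose, expression, lighting). Here it is just a random linear map.
W_enc = rng.normal(size=(LATENT_DIM, FACE_DIM))

# One decoder per identity: reconstructs that person's appearance from a code.
W_dec_a = rng.normal(size=(FACE_DIM, LATENT_DIM))
W_dec_b = rng.normal(size=(FACE_DIM, LATENT_DIM))

def encode(face):
    """Map a face image (flattened vector) to its shared latent code."""
    return W_enc @ face

def decode(code, W_dec):
    """Reconstruct a face from a latent code using one identity's decoder."""
    return W_dec @ code

# The swap itself: encode a frame of person A, decode with person B's decoder,
# yielding (conceptually) B's face with A's pose and expression.
face_a = rng.normal(size=FACE_DIM)
swapped = decode(encode(face_a), W_dec_b)
print(swapped.shape)  # (64,)
```

In actual deepfake software the encoder and decoders are deep convolutional networks trained on many frames of each person; the swap step, however, is exactly this encode-with-shared-weights, decode-with-the-other-identity trick.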

Getting its start in online pornography, the technology has since spread to the political realm, with easy-to-use software making it possible for anyone, anywhere with a computer and an internet connection to create hoaxes and increasingly convincing fake news.

‘Derpfakes’, a comedy YouTube channel featuring fake videos of politicians and celebrities, demonstrates just how far the technology has come, showing side-by-side comparisons of Donald Trump, Vladimir Putin and Hillary Clinton before and after they are altered using the image synthesis technique.


Source: Info Wars

People Are Using AI to Create Fake Porn of Their Friends and Classmates

Earlier this week, we reported on a subreddit called "deepfakes," a growing community of redditors who create fake porn videos of celebrities using existing video footage and a machine learning algorithm. This algorithm is able to take the face of a celebrity from a publicly available video and seamlessly paste it onto the body of a porn performer. Often, the resulting videos are nearly indistinguishable from reality. It's done through a free, user-friendly app called FakeApp.
