Your colleague is not the type to dance to “Gangnam Style”? Do not rely on appearances: you may soon stumble upon a video in which he or she sways even better than the Korean singer Psy. The American company Nvidia, initially specialized in computing power, keeps multiplying these simulation experiments, as it has just shown in a research paper published on arXiv, Cornell University’s preprint server, on December 12. Its secret weapon? “Deepfakes,” a video-manipulation technology that relies on deep learning, the same approach used for special effects, among other things. Its method? Generative adversarial networks (GANs), that is, two algorithms run simultaneously and pitted against each other.
In reality, it is quite simple. The first algorithm is a generator that creates a copy of the style of a video, such as one of the dancer Psy, adding, for example, your colleague’s face. The second program examines the result and decides whether or not to replace some of the frames with other, more credible ones. This method has made it possible to produce a lifelike Barack Obama declaring, in his own voice, that “Donald Trump is a big m…”, as you can see in the video below, produced with the University of Washington, which speaks… for itself.
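The adversarial principle described above can be sketched in a few lines of code. The toy below is illustrative only and has nothing to do with Nvidia’s actual system: a one-parameter linear “generator” learns to imitate a one-dimensional Gaussian (standing in for real data) by fooling a logistic-regression “discriminator”; all variable names and hyperparameters here are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data the generator must learn to imitate: N(mean=4.0, std=0.5).
def sample_real(n):
    return rng.normal(4.0, 0.5, n)

# Generator: a linear map from noise z ~ N(0, 1), parameters (m, s).
m, s = 0.0, 1.0
# Discriminator: logistic regression D(x) = sigmoid(w*x + b).
w, b = 0.0, 0.0

lr, batch = 0.05, 64
for step in range(2000):
    # --- Discriminator step: push D(real) toward 1 and D(fake) toward 0 ---
    x_real = sample_real(batch)
    z = rng.normal(size=batch)
    x_fake = m + s * z
    d_real = sigmoid(w * x_real + b)
    d_fake = sigmoid(w * x_fake + b)
    grad_w = np.mean(-(1 - d_real) * x_real) + np.mean(d_fake * x_fake)
    grad_b = np.mean(-(1 - d_real)) + np.mean(d_fake)
    w -= lr * grad_w
    b -= lr * grad_b

    # --- Generator step: push D(fake) toward 1 (non-saturating loss) ---
    z = rng.normal(size=batch)
    x_fake = m + s * z
    d_fake = sigmoid(w * x_fake + b)
    g = -(1 - d_fake) * w  # d(-log D(x_fake)) / d(x_fake)
    m -= lr * np.mean(g)
    s -= lr * np.mean(g * z)

print(f"generator mean after training: {m:.2f} (target 4.0)")
```

After a few thousand alternating updates, the generator’s output mean drifts toward the real data’s mean of 4.0: the discriminator can no longer tell the two apart, which is exactly the equilibrium the article’s “confrontation” refers to.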
Among the techniques in use is face swap, an algorithm that simply replaces one face with another. Facial reenactment transfers facial expressions from one person to another; one can also choose to transfer only the lips, via lip sync. More ambitious still, motion transfer duplicates the movements of one body onto another. Researchers thus have an increasingly sophisticated toolbox for producing ever more lifelike fake videos, as seen in the video below.
No public figure, therefore, is spared. Thus we see the host Jimmy Fallon become John Oliver, and vice versa; the hardest part is playing on facial expressions. Likewise, one video shows a simulation of the actor Nicolas Cage appearing in a host of leading roles… that he has, of course, never played.
“A national security problem”
Funny as they are, these videos can also have a devastating effect on a celebrity’s reputation (such as the fake appearance of pop star Taylor Swift in an X-rated film) and fuel the trend of revenge porn, which could target anyone. “We knew we could not believe everything we read. Now we can no longer believe everything we hear, or even what we see with our own eyes,” says Catalina Briceño, a professor at UQAM in Montreal, who continues: “We are in an era in which the manipulated image has become the norm, and in which the technology is within everyone’s reach.” Indeed, beyond altering video, voices can be faked: more and more tools make it possible to clone them, as offered, for example, by the start-up Lyrebird. We have entered an era in which creating an easily customizable avatar is within everyone’s reach, according to a research paper from the University of Heidelberg.
Of course, several actors, such as the reverse image search engine TinEye and the Wall Street Journal, aided by experts from Cornell Tech, are working to counter these fakes, notably by examining the source code of the images. Recently, US senator Marco Rubio warned that this large-scale manipulation technology poses a national security problem: with it, one could make video-surveillance cameras say anything…
The Council on Foreign Relations, an American think tank, is concerned for its part that this technology could influence election campaigns, encourage uprisings and, more generally, exacerbate divisions in society. DARPA, the US public defense research agency, has decided to invest $68 million to defeat these new technologies. But it must act quickly: researchers at Carnegie Mellon University dream of transferring one person’s way of behaving to another, Obama’s to Trump, for example. The disinformation race is on. It could turn into a nightmare.