Deepfaking

NOTE: This article contains links to sites that are NSFW.

What do you do when even video evidence is fake?

For many years now, we have been used to image manipulation, and I am not just talking about Photoshop and computers. Retouching pictures has been common practice since the very early days of photography. Using brushes and other tools, everything from blemishes on queens to people Stalin did not like has been removed from pictures. Then computers came along and revolutionised the process. The same thing is happening again, but to video.

Deepfaking, defined as faking something by the use of deep learning (a part of the machine learning family), has been taking the internet by storm since late 2017. In particular, it has become common practice to replace the faces of porn stars with the faces of celebrities. To be fair, it is not like this has never happened before; it is just that deepfaking has made it so much easier, and even with little effort the result is astonishing compared to older methods such as rotoscoping and hoping for the best.

If you are at this point totally lost and do not know what I am talking about, you should watch CGP Grey’s video on machine learning before reading any further.

The tech

Using machine learning to generate images is old news. Apart from those living under rocks, we have all seen the trippy DeepDream pictures. You might also have seen how textures have been generated with the help of neural networks, and how neural networks can try clothes on for you.

The technology behind deepfaking is very similar. It uses neural networks and deep learning, and takes a series of images of the face to be attached, ideally as many as possible and from different angles, as well as the video to be manipulated. The manipulation is then done in three stages (sketched in code after the list).

  1. Produce training data for the model (in the case of most deepfakes, a celebrity).
  2. Train a neural network to emulate the face of the model.
  3. Use the trained network to project the model’s face onto the original face in the video.
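
For the technically curious, here is a minimal sketch of the core idea: one shared encoder paired with one decoder per face. Encode a frame with the shared encoder, decode it with the other face’s decoder, and you keep the pose and expression while swapping the identity. This assumes Keras/TensorFlow; the crop size, layer sizes and every name here are illustrative assumptions, not the actual deepfakes code.

```python
# Sketch of the shared-encoder / dual-decoder autoencoder idea behind
# deepfaking. NOT the actual deepfakes code: resolution, layer sizes
# and names are illustrative assumptions.
from tensorflow.keras import layers, Model, Input

IMG = 64  # assumed size of the aligned face crops (64x64 RGB)

def build_encoder():
    inp = Input(shape=(IMG, IMG, 3))
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(256, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    x = layers.Dense(512, activation="relu")(x)  # shared "face code"
    return Model(inp, x, name="encoder")

def build_decoder(name):
    inp = Input(shape=(512,))
    x = layers.Dense(16 * 16 * 256, activation="relu")(inp)
    x = layers.Reshape((16, 16, 256))(x)
    x = layers.Conv2DTranspose(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(inp, x, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_video_face")  # face in the original video
decoder_b = build_decoder("decoder_model_face")  # the model's (celebrity's) face

inp = Input(shape=(IMG, IMG, 3))
autoencoder_a = Model(inp, decoder_a(encoder(inp)))
autoencoder_b = Model(inp, decoder_b(encoder(inp)))
for ae in (autoencoder_a, autoencoder_b):
    ae.compile(optimizer="adam", loss="mae")

# Stage 2 (training): each autoencoder learns to reconstruct its own
# set of aligned face crops (pixel values scaled to [0, 1]):
#   autoencoder_a.fit(faces_a, faces_a, epochs=..., batch_size=...)
#   autoencoder_b.fit(faces_b, faces_b, epochs=..., batch_size=...)

# Stage 3 (the swap): encode a frame's face with the SHARED encoder,
# then decode with the OTHER decoder, keeping pose and expression
# while changing identity:
#   swapped = decoder_b.predict(encoder.predict(frame_face))
```

The trick is that the encoder is forced to learn a representation that serves both decoders, so it ends up encoding pose and expression rather than identity.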

In the end, you will have something like the following.

There is also a fairly well-done one of Carrie Fisher (RIP) as a young Princess Leia. The top one was made by ILM/Lucasfilm; the bottom one uses deepfaking.

These gifs are not pornography, which makes them the exception rather than the rule.

The subreddit for this, called deepfakes and created by a user of the same name, contains many more examples that you can explore yourself.

You can easily access the source code, as it is open source and available on GitHub. There is the original version and another one from the community.

Think about this: this is only the beginning. It is picking up speed every day, with a growing community and a code repository that keeps expanding and forking.

The ethics

In criminal cases, video evidence has often been the go-to format, alongside DNA and other forensic methods, purely because it has been hard to manipulate in a way that could fool a court, although I am not sure whether that has ever actually happened.

With a field as fast-paced as machine learning, it is not unimaginable that in a few years video evidence will be not only useless, but maybe even frowned upon when brought to court. This is all hypothetical, but we should be ready to face these changes. In many ways it is the same discussion as with artificial intelligence: we need legislation in place before it becomes a problem.

A user by the name of Gravity_Horse wrote the following on the deepfakes subreddit.

To those who condemn the practices of this community, we sympathize with you. What we do here isn’t wholesome or honorable, it’s derogatory, vulgar, and blindsiding to the women that deepfakes works on.

(…)

This technology is very new. And new scares people. A lot. But while the circumstances may be different, and the applications to this technology far exceed that of, say, Photoshop, the end result is the same. Faked images.

(…)

No matter what happens, this technology was going to become a reality. Nothing could stop that. And ironically enough, the safest hands for it to be in might just be the general public with the power to desensitize it, rather than an exclusive few, with the power to exploit it.

Edited for brevity. Full text can be found on the subreddit.

This Reddit user really gets it. He acknowledges that it is indeed derogatory towards those affected by the activities on the subreddit. He is also right that it scares people. Hell, it scares me. I cannot count how many times I have seen people fake screenshots to lie about someone, so let us just say I do not think fear alone will work against this “culture”.

We would, however, be tricking ourselves if we simply ignored the potentially massive implications this could have. We should start focusing on how to handle such cases and how to detect fakes in forensics. We cannot simply try to ban the technology, as some politicians think they can do with encryption.
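
As a thought experiment on the forensics side, detection could itself be framed as machine learning: train a classifier on real versus generated face crops. Below is a minimal sketch, again assuming Keras/TensorFlow; it is purely hypothetical, every name and size is an assumption, and real forensic detectors would be far more sophisticated.

```python
# Illustrative sketch of fake detection as a binary classifier over
# face crops. A hypothetical example, not an established forensic
# method: architecture, sizes and names are assumptions.
from tensorflow.keras import layers, models

def build_detector(img=64):
    return models.Sequential([
        layers.Input(shape=(img, img, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # output: P(crop is fake)
    ])

detector = build_detector()
detector.compile(optimizer="adam", loss="binary_crossentropy",
                 metrics=["accuracy"])
# detector.fit(face_crops, labels, ...)  # labels: 0 = real, 1 = fake
```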

How should we tackle this? Maybe by making more deepfakes with Nicolas Cage. That stuff really brightened up my day.

Every technology can be used with bad motivations, and it’s impossible to stop that.

– Deepfake to The Verge