Deepfakes: How your very face is at risk.

By: Elijah Pinkert

This is Gary Hapsmond. By all accounts, a regular man. Around his 60s, he lived a normal life as an engineer at a local power plant, raising three successful children of his own.

…except that nothing I wrote above is true. In fact, this isn’t even a real photo of anybody. The image was generated by a GAN (generative adversarial network), a type of neural network. Neural networks are chains of simple mathematical operations, inputs, and outputs strung together and trained on thousands of examples — in this case, thousands of human faces — until the system learns to produce convincing human faces of its own.
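To make the idea concrete, here is a minimal sketch of the adversarial training loop behind a GAN. This is an illustrative assumption, not the code behind any particular face generator: it uses PyTorch, tiny fully connected networks, and random toy vectors standing in for real face images, where a real system would use convolutional networks and a large face dataset.

```python
# Minimal GAN training loop (PyTorch). Toy random vectors stand in for
# real face images; the structure, not the data, is the point.
import torch
import torch.nn as nn

LATENT, DATA = 16, 64  # size of the random noise input and of each "sample"

generator = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(), nn.Linear(128, DATA))
discriminator = nn.Sequential(nn.Linear(DATA, 128), nn.ReLU(), nn.Linear(128, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):
    real = torch.randn(32, DATA)      # stand-in for a batch of real faces
    noise = torch.randn(32, LATENT)
    fake = generator(noise)           # generator invents a batch of "faces"

    # Discriminator learns to label real samples 1 and generated samples 0.
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator learns to make the discriminator call its fakes real (label 1).
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

The two networks compete: as the discriminator gets better at spotting fakes, the generator is pushed to produce ever more realistic ones, which is why the results can become hard to distinguish from real photos.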

Neural networks are advancing rapidly, and while these advances can be used for amazing purposes, they also have a darker, more sinister underbelly. These programs are getting so advanced that they can be trained to recognize a person’s voice, face, body, speech patterns, and movements, and then create accurate videos replicating them. Just give the program a few lines to say or an action to perform, and let it run. This is known as deepfaking, and it makes the most convincing fake videos ever created a reality.

If the implications of this technology horrify you, don’t worry. Deepfakes are not perfect… yet. Current generations still have telltale flaws. However, that may not be the case in the future. The mere idea that anybody could be “recorded” and placed into any situation has many horrified, which is why California has already banned pornographic deepfakes (https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602) and is pushing to ban deepfakes around elections.

However, this ban still has detractors. Many point to the dawn of Photoshop, when critics claimed the software would bring the end of photographic evidence. And while photoshopped evidence is a problem in modern life, it has by no means “ruined evidence as we know it.” Others raise free speech concerns, fearing that restrictions could sweep up neural network technology as a whole.

This issue touches on many others, from the morality of pornography all the way to the concept of ownership, and whether people even have a right to their own voice and likeness. The debate has arrived, and the time has come for state and federal law to grapple with the fast-approaching technologies of the modern world. Handled without care, deepfakes could become so convincing that any video could be faked, reducing all photo, audio, and video evidence to the level of testimony and ending empirical evidence as we know it.