Seeing is no longer believing

Disinformation and distrust online are set to take a turn for the worse. Rapid advances in creating fake audio and video have made possible the production of “deep fakes.”

Lexie Ehlers
For The Record

Deep fakes are highly realistic and difficult-to-detect depictions of real people doing or saying things they never said or did. As this technology spreads, the ability to produce false yet believable video and audio content will come within reach of governments, organizations, and individuals alike. As a result, the ability to advance lies with hyper-realistic fake evidence is certain to have a profound social impact.

For decades, it has been possible to edit and manipulate video footage, but doing so took time, highly skilled artists, and a great deal of money. Deep fake technology could completely change the game. As it develops and spreads, anyone could gain the ability to make a convincing fake video, including people who might seek to weaponize it for political or other malicious purposes.

Deep fake technology is now on the United States government’s radar. The Pentagon, through the Defense Advanced Research Projects Agency (DARPA), is working with several of the country’s most prominent research institutions to get ahead of deep fakes. However, in order to spot deep fakes, you first have to make them. That work takes place at the University of Colorado Denver, where researchers on the DARPA program are trying to create convincing deep fake videos. The videos they create will later be used by other researchers who are developing technology to tell real footage from fake.

A thousand miles west of Denver, a team in Menlo Park, California, is developing the second component of DARPA’s program: technology to spot deep fakes. By feeding computers examples of real videos alongside fakes, these researchers are training them to detect deep fake videos. Training computers to recognize visual inconsistencies is one way researchers are working to detect deep fakes, but distinguishing real audio from fake is more challenging.

For more than a century, audio and video have functioned as a bedrock of truth. Not only have sound and images recorded our history, they have also informed and shaped our perception of reality. Some people already question events that unquestionably happened, such as the moon landing and 9/11, despite video proof. If deep fakes make people believe they cannot trust video, the problems of misinformation and conspiracy theories could worsen. While experts say deep fake technology is not yet sophisticated enough to fake large-scale historical events, they worry that it could erode our trust in audio and video in the future. For years, technology has made it possible to make it look as if someone said or did something they never did, but now it is easier than ever. What happens when we can no longer trust our eyes or our ears?