Report

Deepfakes and cheap fakes

The manipulation of audio and visual evidence
Topics: Audio-visual materials; Digital media
Resources
Deepfakes and cheap fakes (report) — attachment, 3.2 MB
Description

Do deepfakes signal an information apocalypse? Are they the end of evidence as we know it? The answers to these questions require us to understand what is truly new about contemporary AV manipulation and what is simply an old struggle for power in a new guise.

The first widely known examples of amateur, AI-manipulated face-swap videos appeared in November 2017. Since then, the news media, and therefore the general public, have begun to use the term “deepfakes” to refer to this larger genre of videos—videos that use some form of deep or machine learning to hybridize or generate human bodies and faces. News coverage claims that deepfakes are poised to assault commonly held standards of evidence, that they are the harbingers of a coming “information apocalypse.” But what coverage of the deepfake phenomenon often misses is that the “truth” of audiovisual content has never been stable—truth is socially, politically, and culturally determined.

Deepfakes, which rely on experimental machine learning, represent one end of a spectrum of audiovisual (AV) manipulation. The deepfake process is both the most computationally reliant and the least publicly accessible means of creating deceptive media. Other forms of AV manipulation, “cheap fakes,” rely on cheap, accessible software, or no software at all. Both deepfakes and cheap fakes are capable of blurring the line between expression and evidence. Both can be used to influence the politics of evidence: how evidence changes and is changed by its existence in cultural, social, and political structures.

Locating deepfakes and cheap fakes in the longer history of the politics of evidence allows us to see that:

  • decisions over what counts as “evidence” have historically been a crucial tool in defending the privilege of the already powerful;
  • the violence of AV manipulation can only be addressed by a combination of technical and social solutions;
  • public agency over these technologies cannot be realized without addressing structural inequality; and
  • the violence of AV manipulation will not be curtailed unless those groups most vulnerable to that violence are able to influence public media systems.
Publication Details
Publication Year:
2019