Joshua Habgood-Coote critically examines the common perception that deepfakes represent a unique and unprecedented threat to our epistemic landscape. He argues that this viewpoint is misguided and that deepfakes should be understood as a social problem rather than a purely technological one. The author offers three main lines of criticism to counter the narrative of deepfakes as harbingers of an epistemic apocalypse. First, he proposes that the knowledge we gain from recordings is a special case of knowledge from instruments, which relies on social practices around the design, operation, and maintenance of recording technology. Second, he presents historical examples of manipulated recordings to demonstrate that deepfakes are not a novel phenomenon, and that social practices have been employed in the past to address similar issues. Third, he contends that technochauvinism and the post-truth narrative have obscured potential social measures for addressing deepfakes.
The author argues that deepfakes are embedded in a techno-social context and should be treated as part of the broader social practices involved in the production of knowledge and ignorance. He suggests that examining historical episodes of deceptive recordings can provide valuable insights into how social norms and community policing could be used to address the challenges posed by deepfakes. Moreover, the author emphasizes that the most serious harms associated with deepfake videos are likely to be consequences of established ignorance-producing social practices affecting minority and marginalized groups.
By reframing deepfakes as a social problem, the paper challenges the notion that the technology itself is inherently dangerous and urges us to consider how our social practices contribute to the production and dissemination of manipulated recordings. This approach highlights the interdependence between technology and society, and offers a more nuanced understanding of the ethical, political, and epistemic implications of deepfakes.
In the broader philosophical context, this paper raises important questions about the nature of knowledge, the role of trust in our epistemic practices, and the relationship between technology and the social dynamics of knowledge production. It also contributes to ongoing debates in social epistemology, emphasizing the collective nature of knowledge and the responsibility that society bears in shaping our epistemic landscape.
Future research could explore other historical episodes of manipulated recordings and the social responses that emerged to address them, further informing our understanding of how to manage the challenges posed by deepfakes. Additionally, scholars could investigate the role of institutional actors, such as governments and media organizations, in shaping and reinforcing norms and practices around the production and dissemination of recordings. This line of inquiry could lead to a more comprehensive understanding of the techno-social context in which deepfakes operate and inform policy recommendations for mitigating their potential harms.
Deepfakes and the epistemic apocalypse
Abstract
It is widely thought that deepfake videos are a significant and unprecedented threat to our epistemic practices. In some writing about deepfakes, manipulated videos appear as the harbingers of an unprecedented epistemic apocalypse. In this paper I want to take a critical look at some of the more catastrophic predictions about deepfake videos. I will argue for three claims: (1) that once we recognise the role of social norms in the epistemology of recordings, deepfakes are much less concerning, (2) that the history of photographic manipulation reveals some important precedents, correcting claims about the novelty of deepfakes, and (3) that proposed solutions to deepfakes have been overly focused on technological interventions. My overall goal is not so much to argue that deepfakes are not a problem, but to argue that behind concerns around deepfakes lies a more general class of social problems about the organisation of our epistemic practices.

