What Are Deepfakes?
Deepfakes are synthetic media in which the likeness of a person in an existing image or video is replaced with someone else’s using AI techniques, typically deep learning (the ‘deep’ in the name). This represents AI’s first genuine confrontation with humanity, disrupting the traditional norms of trust and veracity in digital media.

The Dark Side: Examples that Haunt Us
Deepfakes have been weaponised in various damaging ways: child sexual abuse material (CSAM), non-consensual celebrity pornography, fake news, and financial fraud. The technology has become so advanced and accessible that it has affected the lives of people like Kate Isaacs, who in 2020 led a campaign to have millions of videos deleted from Pornhub. Unfortunately, her media presence made her a target: fake videos were created using her likeness, leading to incessant bullying.

In another instance, a deepfake video of Ukrainian President Volodymyr Zelensky, purportedly instructing soldiers to lay down their arms, appeared on a hacked Ukrainian news website in March 2022. The video was swiftly debunked and taken down by Western networks but was amplified on Russian social media platforms.

The Tools of Creation
Creating a deepfake is alarmingly simple these days. Consumer-level apps are widely available for smart devices, allowing virtually anyone to create these videos. For those with a bit more technical know-how, more advanced open-source tools are available on GitHub that can produce sophisticated, harder-to-detect deepfakes.

Measuring Veracity: A Technical Overview

Tools for detecting deepfakes can be broadly categorised into two types:

Inauthenticity Detectors: These look for the artefacts that manipulation leaves behind. Open-source tools like ‘Deepware’ fall into this category. Such tools are often slow and can take hours to process a single piece of content.

Authenticity Detectors: These look for signals that should be present in genuine footage. Intel’s ‘FakeCatcher’, released in November 2022, adopts this approach. It looks for what makes us human, down to examining subtle facial blood flow. Intel claims a 96% accuracy rate, with results returned in milliseconds.
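To make the authenticity-detection idea concrete, here is a minimal, hypothetical sketch in Python. Genuine video of a face carries a faint periodic ‘pulse’ in skin colour, the kind of physiological signal FakeCatcher-style systems extract via remote photoplethysmography. The sketch checks whether a sequence of per-frame colour averages shows a periodic peak at a plausible heart rate. All names, thresholds, and the frame-rate assumption are illustrative; this is nothing like a production detector.

```python
import math

def autocorrelation(signal, lag):
    """Normalised autocorrelation of a signal at a given lag."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [x - mean for x in signal]
    var = sum(x * x for x in centred)
    if var == 0:
        return 0.0
    cov = sum(centred[i] * centred[i + lag] for i in range(n - lag))
    return cov / var

def looks_authentic(green_means, fps=30, min_bpm=45, max_bpm=180, threshold=0.5):
    """Hypothetical authenticity check: scan plausible heart rates and
    report whether the per-frame green-channel means show a periodic
    pulse at any of them."""
    best = 0.0
    for bpm in range(min_bpm, max_bpm + 1, 5):
        lag = round(fps * 60 / bpm)
        if 1 <= lag < len(green_means):
            best = max(best, autocorrelation(green_means, lag))
    return best >= threshold

# Synthetic demo: a faint 60 bpm pulse (period 30 frames at 30 fps)
# versus a perfectly flat signal with no physiological variation.
pulsed = [0.5 + 0.05 * math.sin(2 * math.pi * t / 30) for t in range(300)]
flat = [0.5] * 300
```

The point of the sketch is the asymmetry between the two detector families: this code looks for evidence of life rather than evidence of tampering, which is why such systems can run quickly and resist artefact-hiding tricks.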

Measuring Veracity: The Human Factor
At the human level, some basic techniques include looking for facial discolouration, unnatural lighting, badly synced audio and video, and blurriness where the face meets the neck and hair. According to Henry-Rose Lee, younger people are more likely to be victims of online scams and fraud, which raises the question: whose job is it to educate children in the critical consumption of synthetic media?
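The blurriness cue can even be quantified crudely in code. The sketch below, purely illustrative (the function names, the 4-neighbour Laplacian, and the 0.25 ratio are my own assumptions, not a published detector), scores a region’s sharpness by the variance of a Laplacian filter: a blend seam that is much blurrier than the surrounding frame is a classic deepfake artefact.

```python
def laplacian_variance(gray):
    """Variance of a 4-neighbour Laplacian over a 2-D grayscale image
    (a list of rows of pixel intensities). Sharp regions score high;
    blurred regions score low."""
    h, w = len(gray), len(gray[0])
    values = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (gray[y - 1][x] + gray[y + 1][x]
                   + gray[y][x - 1] + gray[y][x + 1]
                   - 4 * gray[y][x])
            values.append(lap)
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def seam_suspicious(seam_region, reference_region, ratio=0.25):
    """Flag when the face/neck blend seam is far blurrier than a
    reference patch elsewhere in the same frame."""
    ref = laplacian_variance(reference_region)
    if ref == 0:
        return False
    return laplacian_variance(seam_region) / ref < ratio

# Demo patches: a checkerboard is maximally sharp; a constant patch
# mimics an over-smoothed blend seam.
sharp = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
smooth = [[128 for _ in range(8)] for _ in range(8)]
```

Comparing the seam against the rest of the same frame, rather than against an absolute threshold, keeps the check robust to overall video quality.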

The Evolving Technology
The term ‘simplexity’ is becoming increasingly relevant. It refers to wrapping complex systems in user-friendly interfaces, putting higher-quality, harder-to-detect deepfake media within anyone’s reach. In essence, the lines between what’s real and what’s fake are becoming increasingly blurred.

How Do We Respond?
Governments and organisations are scrambling to find effective ways to regulate deepfakes. China’s ‘Cyberspace Administration’ has mandated that deepfakes be labelled as such, whereas the EU is expanding its Digital Services Act to tackle deepfakes through a voluntary co-regulatory code. In the UK, there is ongoing debate in parliament about criminalising deepfake videos.

The Silver Lining
The flipside of the accessibility of video creation tools is the decentralisation of filmmaking. This can serve as a massive platform for the unleashing of creativity and raw talent.

Conclusion
In the final analysis, the rise of deepfake technology presents an existential challenge not just to the field of digital media, but to society at large. While advances in detection and regulation offer some hope, they are merely keeping pace with the ever-advancing technologies used to create deepfakes. Given the potential for misuse, it is imperative that governments, technology companies, and individuals remain vigilant in identifying and combating these manipulated pieces of media. Our collective trust in what we see and hear is at stake, and the time to act is now.

Addendum: If you found this article helpful, you may be interested in the ‘Deep – Fake – Future’ debate held by London Futurists on 20th December 2022, which I co-facilitated.