If you’ve seen the video of Prime Minister Imran Khan and PML-N chief Nawaz Sharif singing the 80s hit ‘Video Killed the Radio Star’, then you know what deepfakes are. For everyone else: deepfakes are audio or video content that has been manipulated to show something that never happened, while looking just as realistic as the original. They are a powerful tool that has made it easier than ever to blur the line between fact and fiction. And now, this technology is slowly moving away from entertaining videos towards more insidious ends, like pornography and political manipulation.
WHO MADE THIS HAHAHAHAHAHAHAHA pic.twitter.com/8vPX63nhtH
— H U Khan (@Huk06) October 21, 2020
Video manipulation is not new; we have been able to edit and alter videos for decades. The remastering and colourisation of old black-and-white films is a video editing technique, and so is the late Carrie Fisher’s digital cameo in Star Wars. But this already extensive range of techniques has entered a new frontier with deepfake technology. Deepfakes are not prohibitively expensive to produce, which allows individuals of all kinds to use them for any purpose, nefarious or otherwise. For as little as $30 (Rs. 4,753), anyone can make their own deepfakes using portals available on the web. So far, deepfake technology has been used primarily to make fake porn: the faces of popular celebrities are morphed onto the actors in porn films to create hyper-realistic videos. The low barrier to entry has also resulted in a great deal of revenge porn. According to a 2019 report by Deeptrace, an Amsterdam-based security company, 96% of all deepfake pornography was made without the consent of the person depicted in the video.
In the era of fake news, where conspiracy theories and misinformation are rife, it stands to reason that this technology will be used for more than just creating porn
In fact, we have already begun to see deepfakes being used in politics. In February 2020, just one day before the Legislative Assembly elections in Delhi, two videos of Delhi Bharatiya Janata Party (BJP) President Manoj Tiwari criticising the incumbent Delhi government of Arvind Kejriwal went viral on WhatsApp. In one video, Tiwari was seen speaking in Haryanvi (a Hindi dialect); in the other, he was speaking in English. On its own, this would not have been cause for alarm. However, the videos were identical except for the language being spoken: everything from his clothes to the background was the same, and even the message was the same, just in two different languages. Stranger still, Tiwari himself had never been seen speaking Haryanvi before. It turned out that the BJP’s media cell had partnered with a communications firm, The Ideaz Factory, to create deepfakes that could reach out to various voter bases, each speaking a different language or dialect.
The deepfake videos of BJP candidate Tiwari were circulated in 5,800 WhatsApp groups and are estimated to have reached roughly 15 million people. While this particular use of deepfakes was not malicious, it did allow the BJP to misrepresent reality: Tiwari does not actually speak Haryanvi, and fooling voters into believing he does is a distortion of the truth. The ease with which reality can be altered via deepfakes is what makes them dangerous. In a country like Pakistan, already plagued by widespread belief in conspiracy theories, who’s to say that this technology won’t be used by both state and non-state actors to distort the truth?
A narrative of 5th generation warfare has already been created in Pakistan. The establishment has tried to sell the story that Pakistan is engaged not just in traditional warfare but also in a war of perceptions and information. This was obviously done to reiterate its importance in the eyes of the public, so that it may tighten its grip on this country and its finances. But many segments of society seem to have fully bought into this narrative. In such a polarised climate, where allegations of treason are tossed around willy-nilly, the possibility of deepfakes being used to further the establishment’s agenda is very high.
All it would take to further ruin the image and reputation of the already not-so-angelic civilian politicians is one deepfake video showing them in conversation with some alleged Indian agent or the other. This could seriously weaken the fledgling democratic movement and sentiment in Pakistan. Furthermore, such misinformation can be incendiary, causing strife and violence.
Pakistan is a very reactionary country; the response to President Macron’s Islamophobic remarks has been intense. The recent Faizabad protest was, ostensibly, a response to the blasphemous caricatures published in France, and it quickly turned violent too, with dozens of police officers and protestors injured. So it is not hard to imagine some bigot using deepfakes to promote hate and violence against specific people or communities. Imagine someone makes a deepfake video of a Christian committing blasphemy: a mob would immediately be out to lynch that person, with no thought of whether the video was real or fake. Things of this nature have already happened; people are killed on the mere allegation of blasphemy. Take the recent case of a guard who killed a bank manager, apparently over blasphemy; the police later revealed that the killing had occurred over personal grievances, and that the guard had claimed blasphemy in an effort to evade justice.
The unregulated use of this technology has the potential to be incredibly dangerous. In a world where fake news like Pizzagate helped swing an election and install a demagogue at the helm of a world superpower, the provocative potential of deepfake technology cannot be ignored.
Some efforts are already being made to counter the spread of misinformation resulting from deepfakes. Microsoft’s deepfake detection tool, for example, analyses audiovisual content and displays the probability that it has been doctored. But more needs to be done at the governmental level to control not just deepfakes but misinformation as a whole.