Deep Fakes and Election Rigging
Nic Cheeseman and Brian Klaas
Election rigging doesn’t stand still. The strategies used to manipulate the polls continue to evolve, so what does the future have in store?
There’s more bad news, unfortunately. While candidates from Brazil to Nigeria have figured out how to weaponize disinformation as a tactic to manipulate elections, we’re about to enter a much more dangerous period: the era of ‘deep fakes’.
Deep fakes are made-up but convincing videos that are created with the help of artificial intelligence. Due to advances in technology, it is becoming easier and cheaper to create bogus videos. Soon, those with even a rudimentary knowledge of the technology will be able to create videos that are so true-to-life that it becomes difficult, if not impossible, to determine whether they are real.
Deep fakes are created by something called a ‘Generative Adversarial Network’, or GAN. GANs are complex, but the principle behind them is simple. There are two automated rivals in the system: a forger and a detective. The forger tries to create fake content while the detective tries to figure out what is authentic and what is forged. With each iteration, the forger learns from its mistakes. Eventually, the forger gets so good that it’s difficult to tell the difference between fake and real content. At that point, voters will be fooled, too.
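The forger-versus-detective loop can be sketched in a few lines of code. This is a toy illustration only, not a real GAN: the ‘authentic content’ is reduced to a single hypothetical number, the detective to a shrinking tolerance band, and the forger’s learning step to a simple correction in the direction the detective’s feedback points. All of these simplifications are assumptions for illustration.

```python
# Toy sketch of the adversarial principle behind a GAN (illustrative only).
# The "real" statistic, the tolerance, and the learning rates are all
# hypothetical round numbers chosen for the example.

REAL = 7.0  # stands in for the statistics of genuine content


def run(rounds=100):
    guess = 0.0       # the forger's current forgery
    tolerance = 10.0  # how sloppy a forgery the detective still accepts
    for _ in range(rounds):
        if abs(guess - REAL) <= tolerance:
            # The forgery passed: the detective tightens its standards.
            tolerance *= 0.9
        else:
            # The forgery was caught: the forger nudges its output toward
            # what passes. In a real GAN this correction arrives as a
            # gradient flowing back through the detective network.
            guess += 0.2 * (REAL - guess)
    return guess, tolerance


final_guess, final_tolerance = run()
print(final_guess, final_tolerance)
```

After a hundred rounds both numbers are tiny: the forger’s output sits almost exactly on the real value, and the detective’s tolerance has shrunk to nearly nothing. In an actual GAN both rivals are neural networks and the content is an image or video rather than one number, but the escalating contest is the same.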
Hany Farid, an expert on artificial intelligence and deep fakes at Dartmouth College, recently told us, ‘I would say within another 18 to 24 months, that technology is going to get to a point where the human brain may not be able to decipher it.’ In other words, it’s right around the corner. And we’re not ready.
Deep fakes pose two significant threats to democracy. First, they are a dangerous addition to the old-school arsenal of disinformation precisely because they are so convincing. It’s one thing to disregard a bogus headline; it’s quite another to refuse to believe your own eyes when you see a candidate saying something terrible. And that means that a larger number of people are going to be fooled by bogus scandals and provocative statements that were never made. It’s not hard to imagine a fake concession video after a contested election, or a candidate appearing to admit they attacked rival supporters when in reality they did nothing of the sort.
But perhaps the more disturbing aspect of deep fakes is that they will undermine our grip on a common democratic reality. In highly authoritarian states, governments will be able to create news stories to justify locking up their opponents – and voters will not have enough sources of independent news to know any better. In countries where democracy is weak, but information is less tightly controlled, deep fakes will be believed by some and rejected by others, further fragmenting the electorate into competing ‘realities’. This risks a future in which different groups not only disagree on what would be the best policy, or who would be the best leader, but on what is true and what is false. Or as Farid put it: ‘If it is in fact the case that almost anything can be faked well, then nothing is real.’
Unfortunately, there is no easy way to detect deep fakes. Even if tech companies can create a detection algorithm in a bid to filter them out, it is unlikely to prevent the circulation of such videos. Given the scale of the internet, a verification system that detects and deletes 99 per cent of deep fakes would still allow many of them to bubble up, spreading to millions of people with the click of a button. And we are still at a stage where we cannot even prevent the circulation of clearly fake news stories and badly photoshopped pictures.
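The arithmetic behind that 99 per cent figure is worth making concrete. The upload volume below is a hypothetical round number, not a real statistic, but it shows how a very high detection rate still translates into a large absolute number of fakes slipping through.

```python
# Back-of-envelope sketch of the filtering arithmetic. The daily upload
# figure is a hypothetical round number for illustration only.

DAILY_FAKE_UPLOADS = 1_000_000  # hypothetical volume of deep-fake uploads
DETECTION_RATE = 0.99           # the "99 per cent" filter from the text

slipped_through = DAILY_FAKE_UPLOADS * (1 - DETECTION_RATE)
print(round(slipped_through))   # fakes per day that survive the filter
```

Even at these modest assumed volumes, ten thousand fakes a day would escape a near-perfect filter, each one a click away from millions of viewers.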
From How to Rig an Election by Nic Cheeseman and Brian Klaas. Published by Yale University Press in 2019. Reproduced with permission.
Nic Cheeseman is professor of democracy at the University of Birmingham and founding editor of the Oxford Encyclopedia of African Politics. Brian Klaas is assistant professor of global politics at University College London and a weekly columnist for the Washington Post.