Deepfake AI and the Coming Elections

This is a year of elections, not least in the UK and the US. But how free will they be from controversy, especially with the increased use of artificial intelligence?

There's a story in today's news that should make us sit up and pay attention. There were near-riots on and around Remembrance Day thanks to some clever AI fakery. A deepfake audio clip of Sadiq Khan, the Mayor of London, supposedly making inflammatory remarks before Armistice Day, almost caused ‘serious disorder’.

The words put into Khan’s mouth had him commending the pro-Palestinian march and saying it should take precedence over any Remembrance Day parade. Because Khan comes from a Muslim background, that was enough for some to believe this supposed ‘recording’, claimed to have been made at a private event.

The clip spread rapidly and triggered a spike in hateful comments against the mayor on social media.

It’s not true, of course. But the fake was so convincing that it was seized on by those on the extreme right of politics, looking for a fight. There was trouble that weekend, mainly from the far right, but it could have been so much worse.

This time, the fakery didn’t work, and the man who engineered it has been tracked down by the BBC. His defence was ‘it’s what Sadiq thinks’. It’s not what Sadiq Khan thinks, and that fake recording could easily have resulted in the loss of life.

We need robust legislation on this stuff. It’s already affecting political debate in America, with a fake AI video of the President: existing footage of Joe Biden with his granddaughter was edited to make it appear as though he was touching her inappropriately.

More subtle, and more damaging in election terms, was a recently manufactured phone message, its voice edited to sound like Biden urging voters in New Hampshire not to cast their ballots in the Democratic primary.

This is just the start of the AI political wars. We’ve had the issue of fake news for a few years now, but this could be so much worse.
