You can no longer believe your eyes.

The power of artificial intelligence (AI) to create uncannily convincing “deepfake” video has advanced to the point where it’s now available as an iPhone app.

But with the US election just around the corner, there are growing concerns that deepfake technology could be used to swing the vote by creating compromising footage of one of the contenders.

There’s a real danger that voters could be convinced by video footage created by an unscrupulous campaigner or foreign power.

Microsoft is attempting to provide a defence against this threat with a new tool called Video Authenticator.

The tool looks for tell-tale artefacts around the eyes or mouth

Video Authenticator analyses videos and gives them a “confidence score” measuring how likely they are to have been manipulated or altered.

“In the case of a video, it can provide this percentage in real-time on each frame as the video plays,” Microsoft says. “It works by detecting the blending boundary of the deepfake and subtle fading or greyscale elements that might not be detectable by the human eye.”
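Microsoft has not published Video Authenticator's internals, but the idea of a per-frame manipulation score can be illustrated with a toy sketch. The code below is purely hypothetical: it treats each "frame" as a grid of greyscale pixel values and flags abrupt intensity jumps as a crude stand-in for the blending-boundary artefacts the real tool looks for.

```python
# Illustrative sketch only -- not Microsoft's actual algorithm.
# A frame is a 2-D list of greyscale pixel values (0-255). We score
# each frame by the sharpest horizontal intensity jump it contains:
# a pasted-in face region tends to leave a seam of abrupt changes.

def frame_score(frame):
    """Return a 0-100 score: higher means a stronger seam-like artefact."""
    max_jump = 0
    for row in frame:
        for x in range(1, len(row)):
            jump = abs(row[x] - row[x - 1])
            max_jump = max(max_jump, jump)
    return min(100, round(100 * max_jump / 255))

def score_video(frames):
    """Score every frame, mimicking a real-time per-frame readout."""
    return [frame_score(f) for f in frames]

# A smoothly shaded frame versus one with an abrupt pasted-in region.
smooth = [[10, 12, 14, 16]] * 3
seamed = [[10, 12, 200, 202]] * 3
print(score_video([smooth, seamed]))  # prints [1, 74]
```

A production detector would of course use a trained neural network over full-resolution frames rather than a single hand-written heuristic, but the output shape is the same: one confidence number per frame as the video plays.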

Microsoft accepts that while Video Authenticator might be able to prevent US voters from believing that Joe Biden has been caught on CCTV throwing Molotov cocktails, or that Donald Trump has been caught kissing Russian leader Vladimir Putin, the tool can provide only one small victory in an ongoing war against the advance of AI fakery.

It's not foolproof, but it's certainly an improvement on believing everything you see

Because AI is always learning, the company says, sooner or later a better weapon against deepfakes will be needed.

“We expect that methods for generating synthetic media will continue to grow in sophistication,” an official Microsoft blog post says. “As all AI detection methods have rates of failure, we have to understand and be ready to respond to deepfakes that slip through detection methods.

“However,” the post continued, “in the short run, such as the upcoming U.S. election, advanced detection technologies can be a useful tool to help discerning users identify deepfakes.”

We're used to spotting faked still images, but moving video makes trickery harder to spot

Microsoft says it will provide access to Video Authenticator to both media outlets and political campaigns in an effort to keep the November election free from AI trickery.

But in the longer term, the company says, “we must seek stronger methods for maintaining and certifying the authenticity of news articles and other media.”