What is a deepfake?
Deepfake is the technique of using artificial intelligence for human image synthesis. An easy way to understand it is as pasting one person's face onto another's in a video. For example, a newsreader's face could be doctored onto someone reading out a fabricated story – you can see the negative implications.
Last month The Guardian reported on deepfakes and showed a video from an interview in which the interviewee Bill Hader's face subtly morphs into Tom Cruise's whilst he does an impression of him. The clip's creator has made a number of similar deepfake videos to raise awareness of how sophisticated fake news could become.
This new threat is already becoming available to anyone with access to the Internet. A new deepfake app created in China lets users place their own face onto actors in famous film and TV scenes.
So what’s this got to do with business?
Well, take a simpler version of this technology – the ability to mimic someone's voice over the phone – and the relevance to business becomes clear.
Say you receive a call from your 'Finance Director' asking you to authorise a payment to a person or company. Not only may the caller already know information about your company, but they now sound like your Finance Director too. Do you question it? There is a great deal of research on obedience and authority, the most famous being Milgram's obedience study. The general finding is that people will ignore their doubts, even moral ones, and blindly obey orders from authority figures. Milgram's study used authority figures the participants did not know; when the order comes from someone you know to be your actual authority, the threat is even stronger.
Who would fall for this?
You may be thinking that surely no one would fall for this – you would know to be cautious about processing a large payment for someone on the other end of the phone. Well, you couldn't be more wrong. Just yesterday (5th September 2019) Sophos, our security partner, published a story about a company that had fallen victim to a deepfake audio scam. In short, an AI-generated imitation of a CEO's voice led an employee to transfer $243,000 (roughly £197,186) to a scammer.
So what can you do?
With the example given above, the easiest way to check legitimacy is to call your Finance Director back on a number you already know (NB: not the number that just rang you!). It only takes a few moments and could prevent disastrous consequences. For some businesses it may even be necessary to accept payment requests only in person. Thankfully you can't change the face of a real person – well, not yet anyway!
Further threat
The technology is already advancing into live video streams, which could pose a threat to Skype video calls. Take the example above, but this time you are actually looking at your 'Finance Director'. The key message is always to err on the side of caution, especially when dealing with money or personal information. If there is any doubt in your mind about the legitimacy of a request, double check. For the sake of a 30-minute delay, you could prevent the disastrous financial consequences of simply obeying authority.
Summary
At the moment this threat is still relatively new. If you watch an online video or TV news segment, it's unlikely you'd question whether it was real. This is when a threat like this is most dangerous – when people aren't aware of it. No doubt in a few years everyone will know about deepfakes and be more cautious. But for now, remain vigilant, and if something doesn't seem quite right – there's a good chance it isn't!