You Robot? How to Defeat a Deepfake

Another tale from the “We Are Officially in the Future” files: You may have seen the story a few weeks back about an AI deepfake that cost a financial services firm $25 million. You probably didn’t see any tips on how to keep that from happening to your bank.  We’ve got some of those below. But first, a brief recap for those who are unfamiliar. 

The place: Hong Kong 

The time: The present (although ... it sure does feel like the future sometimes).  

The setting: A video conference at a multinational firm. On the call are a finance staff member at the firm, the firm's CFO, and several of the finance worker's colleagues. Only the CFO and the colleagues are not who, or what, they appear to be: They are AI deepfakes that, as deepfakes do, look and sound exactly like the people they are impersonating.

You see where this is going: The finance worker was asked to carry out a transaction that wound up enriching the fraudsters behind the scheme.  

From CNN: “Believing everyone else on the call was real—(INSIDER INTERJECTION: Such a 2019-era assumption)—the worker agreed to remit a total of $200 million Hong Kong dollars—about $25.6 million.” 

How to Detect a Deepfake 

Ars Technica, in covering the exploit, suggests the following:   

  • Use public key cryptography. If employees sign public keys at in-person meetings, "those signed keys can be used to authenticate parties" in video calls. (See the sketch after this list for what that could look like in practice.)
  • Train workers. Employees can expose deepfakes by asking validating questions of suspected impostors, or by simply asking them to turn their head to the side. Fun fact: A weakness of some deepfakes, so far, is that faces look distorted when turned in profile. (For more on that, see examples in this piece from The Register.)  
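
What might that public-key approach look like in practice? Below is a minimal sketch in Python, using the open-source cryptography library and a simple challenge-response flow. The key names and the flow itself are our illustration of the idea, not a description of any firm's (or Ars Technica's) actual setup. The premise: a public key recorded at an in-person meeting lets a worker confirm that the person on screen controls the matching private key, something a deepfake of a face and voice cannot imitate.

  # Minimal sketch of challenge-response authentication with keys exchanged
  # in person. Requires the "cryptography" package (pip install cryptography).
  import os
  from cryptography.exceptions import InvalidSignature
  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

  # At the in-person meeting: the CFO generates a key pair, and colleagues
  # record (and sign) the public half. Both sides live in one script here
  # purely for illustration.
  cfo_private_key = Ed25519PrivateKey.generate()
  cfo_public_key_recorded_in_person = cfo_private_key.public_key()

  # On the video call: the worker sends a fresh random challenge and asks
  # the person claiming to be the CFO to sign it on their own device.
  challenge = os.urandom(32)
  signature = cfo_private_key.sign(challenge)

  # The worker verifies the signature against the key recorded in person.
  # A deepfake can mimic a face and a voice, but not a private key it never had.
  try:
      cfo_public_key_recorded_in_person.verify(signature, challenge)
      print("Signature checks out: this party controls the CFO's key.")
  except InvalidSignature:
      print("Signature invalid: treat the request as fraudulent.")

In a real deployment, the signing would happen on the CFO's own device and the verification on the worker's, with the public keys distributed ahead of time through whatever the organization trusts: printed fingerprints signed at the meeting, or an internal key directory.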

Beyond that, the University of Miami’s Department of Information Technology says that other clues could include: 

  • Skin that appears too smooth or wrinkly.  
  • Shadows or glare that seem unusual. (“Deepfakes may fail to fully represent the natural physics of lighting.”) 
  • Video that is not in sync with audio.  
  • Facial hair that looks unnatural. 

And be careful. As ING cautions in an article on its website, "the faces you trust could betray you."