How to Spot a Deepfake: It's Easy

With criminals beginning to use deepfake video technology[1] to spoof an identity in live online job interviews, security researchers have highlighted one simple way to spot a deepfake: ask the person to turn their face sideways.  The reason this works as a handy authentication check is that deepfake AI models, while good at recreating front-on views of a person's face, are not adequate at rendering side-on or profile views like those seen in a mug shot.

Camera apps have become increasingly sophisticated.  Users can elongate legs, remove pimples, add animal ears, and now even create false videos that look very real.  The technology used to create such digital content has quickly become available to the public, and the results are called "deepfakes."  Deepfakes are manipulated videos, or other digital representations produced by sophisticated artificial intelligence, that yield fabricated images and sounds that appear to be real.[2]

Metaphysic.ai highlights the instability of recreating full 90° profile views in live deepfake videos, making the side-profile check a simple and effective authentication procedure for companies conducting video-based online job interviews.  Deepfakes, or synthetic AI-enabled recreations of audio, image, and video content of humans, have been on the radar as a potential identity threat for several years.

In June 2022, the Federal Bureau of Investigation (FBI) warned it had seen an uptick in scammers using deepfake audio and video when participating in online job interviews, which became far more common during the pandemic.  The FBI noted that deepfake candidates targeted tech vacancies because the roles would give the attacker access to corporate IT databases, private customer data, and proprietary information.  It also noted that video participants can spot a deepfake when coughing, sneezing, or other sounds do not align with what appears on screen.  The side-profile test could be a quick, easy-to-follow check for humans before beginning an online video meeting.

Remember when we examined photos that may have been faked?  We inspected sizes, colors, and shadows for signs the presented image had been manipulated.  Most deepfakes fail when the head reaches 90° and reveals elements of the person's actual side profile.  The profile-view recreation fails because of a lack of good-quality training data for profiles, which forces the deepfake model to invent, or "repaint," what is missing.

Part of the problem is that deepfake software needs to detect landmarks on a person's face in order to recreate it.  When the face is turned side-on, the algorithms have only half the landmarks available for detection compared to the front-on view.  A major weakness in recreating a face from side-profile video lies in the limits of 2D-based facial alignment algorithms and a plain lack of profile data for most people other than Hollywood stars.
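To see how quickly those landmarks disappear, consider the minimal sketch below.  It is illustrative only, and assumes the dlib library and its publicly distributed 68-point landmark model file ("shape_predictor_68_face_landmarks.dat") are available locally; on frames approaching a full profile, the frontal-trained detector frequently returns no face, and therefore no landmarks, at all.

    # Minimal sketch: count the landmarks a frontal-trained model can place.
    # Assumes dlib is installed and the 68-point model file is in the
    # working directory; neither comes from the article above.
    import dlib

    detector = dlib.get_frontal_face_detector()   # HOG-based, frontal-trained
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    def profile_landmark_check(image):
        """Return (faces found, landmarks placed) for one RGB frame."""
        faces = detector(image)      # profile views often yield no detection
        if not faces:
            return 0, 0
        shape = predictor(image, faces[0])
        return len(faces), shape.num_parts

    # Usage on a saved video frame:
    # faces, points = profile_landmark_check(dlib.load_rgb_image("frame.jpg"))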

Arguing the case for using the side profile for authentication in live video meetings, experts point out that there will likely be a persistent shortage of side-view training data for average people.  There is little demand for stock photos of profile headshots because they are often unflattering, and photographers have little motivation to supply them since they offer little emotional insight into a face.  That scarcity makes it difficult to assemble a set of profile images of non-celebrities diverse and extensive enough to train a deepfake model to reproduce profile views convincingly.  This weakness offers a potential way of uncovering 'simulated' correspondents in live video calls, recently classified as an emergent risk by the FBI: if you suspect the person you are talking to might be a 'deepfake clone,' ask them to turn sideways for more than a second or two and see whether their appearance still convinces you.

Sensity (https://sensity.ai) is a developer of liveness-detection and deepfake-detection software.  In May 2022, the company reported that nine of the ten widely adopted biometric verification systems used in financial services for Know Your Customer (KYC) compliance were severely vulnerable to deepfake 'face swap' attacks.  It found that commonly used liveness tests, in which a person looks into the camera of a connected device, moves their head left and right, and smiles, were also easily duped by deepfakes.  The deepfakes Sensity used involved the would-be fraudster moving their head left and right, but the video shows the head stops turning before it reaches 90°.  Sensity's CEO and Chief Scientist confirmed they did not use full 90° profile views in their tests: "Lateral views of people's faces, when used as a form of identity verification, may provide additional protection against deepfakes.  As pointed out, the lack of widely available profile-view data makes the training of deepfake detectors very challenging."
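To make the 90° requirement concrete, a verification workflow could estimate head yaw on each frame and flag sessions in which the head never approaches a true profile.  The sketch below uses the common six-point head-pose recipe with OpenCV's solvePnP; the model points, focal-length guess, 75° threshold, and function names are assumptions of ours, not Sensity's implementation.

    # Sketch of a yaw check for liveness sessions. The six 2D pixel points
    # (nose tip, chin, eye corners, mouth corners) are assumed to come from
    # any landmark detector, in the same order as MODEL_POINTS.
    import numpy as np
    import cv2

    MODEL_POINTS = np.array([
        (0.0, 0.0, 0.0),           # nose tip
        (0.0, -330.0, -65.0),      # chin
        (-225.0, 170.0, -135.0),   # left eye, outer corner
        (225.0, 170.0, -135.0),    # right eye, outer corner
        (-150.0, -150.0, -125.0),  # left mouth corner
        (150.0, -150.0, -125.0),   # right mouth corner
    ], dtype=np.float64)

    def head_yaw_degrees(image_points, frame_w, frame_h):
        """Estimate yaw from a (6, 2) float64 array of pixel coordinates.

        Uses a crude pinhole-camera guess (focal length = frame width,
        zero lens distortion).
        """
        camera_matrix = np.array([[frame_w, 0, frame_w / 2],
                                  [0, frame_w, frame_h / 2],
                                  [0, 0, 1]], dtype=np.float64)
        ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points,
                                   camera_matrix, np.zeros((4, 1)))
        if not ok:
            return None
        rot, _ = cv2.Rodrigues(rvec)
        # Yaw = rotation about the vertical axis (ZYX Euler convention).
        return float(np.degrees(np.arctan2(
            -rot[2, 0], np.sqrt(rot[0, 0] ** 2 + rot[1, 0] ** 2))))

    def reaches_profile(yaw_samples, threshold_deg=75.0):
        """True only if some sampled yaw gets near a full profile view."""
        return any(y is not None and abs(y) >= threshold_deg
                   for y in yaw_samples)

A session like the ones Sensity describes, where the head stops well short of 90°, would never trip the threshold and could be escalated for manual review.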

According to fraud investigators, another useful way to rattle a live deepfake model is to ask the video participant to wave a hand in front of their face.  The occlusion disrupts the model and reveals latency and quality issues in the superimposed deepfake face.
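As a hypothetical illustration of what that disruption could look like in software, the sketch below samples the sharpness of the detected face region on each frame of the wave: a real face is occluded smoothly, while a struggling live face swap tends to produce abrupt sharpness swings.  The metric choice, threshold, and function names are our assumptions, not a technique documented by the investigators.

    # Sketch: flag abrupt sharpness swings in the face region during a wave.
    # face_box is assumed to come from any face detector as (x, y, w, h).
    import cv2
    import numpy as np

    def face_sharpness(frame, face_box):
        """Variance of the Laplacian inside the face box (a standard blur metric)."""
        x, y, w, h = face_box
        gray = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        return cv2.Laplacian(gray, cv2.CV_64F).var()

    def wave_test_suspicious(sharpness_per_frame, jump_ratio=3.0):
        """True if sharpness jumps between frames far exceed the typical step."""
        diffs = np.abs(np.diff(sharpness_per_frame))
        baseline = np.median(diffs) + 1e-9   # avoid a zero baseline
        return bool(np.any(diffs > jump_ratio * baseline))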

Red Sky Alliance is a Cyber Threat Analysis and Intelligence Service organization.  For questions, comments, or assistance, please contact the office directly at 1-844-492-7225 or feedback@wapacklabs.com

 

Weekly Cyber Intelligence Briefings:

REDSHORTS - Weekly Cyber Intelligence Briefings

https://attendee.gotowebinar.com/register/5504229295967742989

 

[1] An image or recording that has been convincingly altered and manipulated to misrepresent someone as doing or saying something that was not actually done or said.

[2] https://www.zdnet.com/article/digital-transformation-top-5-skills-you-need-to-succeed/
