Criminals Use Deepfake Videos to Interview for Remote Work

Security experts are on the alert for the next evolution of social engineering in business settings: deepfake employment interviews. The latest trend offers a glimpse into the future arsenal of criminals who use convincing, faked personae against business users to steal data and commit fraud.

The concern follows a new advisory this week from the FBI Internet Crime Complaint Center (IC3), which warned of increased activity from fraudsters trying to game the online interview process for remote-work positions. The advisory said that criminals are using a combination of deepfake videos and stolen personal data to misrepresent themselves and gain employment in a range of work-from-home positions, including information technology, computer programming, database maintenance, and software-related job functions.

Federal law-enforcement officials said in the advisory that they’ve received a rash of complaints from businesses.

“In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking,” the advisory said. “At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”

The complaints also noted that criminals were using stolen personally identifiable information (PII) in conjunction with these fake videos to better impersonate applicants, with later background checks digging up discrepancies between the individual who interviewed and the identity presented in the application.
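The telltale mismatch the advisory describes — lip movement that does not track the audio — can in principle be quantified. The toy sketch below (a hypothetical illustration with synthetic signals, not a real detector; a production pipeline would extract per-frame audio energy and mouth-opening measurements from the actual video stream) correlates the two signals: a genuine recording should score high, while a poorly synced deepfake should not.

```python
# Toy illustration of the audio/visual desynchronization check the
# FBI advisory implies: in a genuine recording, mouth opening and
# speech loudness move together; in a badly synced deepfake they don't.
# All signals here are synthetic stand-ins, not real extracted features.
import numpy as np

def av_sync_score(audio_energy: np.ndarray, mouth_open: np.ndarray) -> float:
    """Pearson correlation between per-frame audio energy and
    per-frame mouth-opening measurements."""
    a = (audio_energy - audio_energy.mean()) / audio_energy.std()
    m = (mouth_open - mouth_open.mean()) / mouth_open.std()
    return float(np.mean(a * m))

rng = np.random.default_rng(0)
# Synthetic speech-loudness envelope over 300 video frames.
speech = np.abs(np.sin(np.linspace(0, 20, 300))) + 0.1 * rng.random(300)

genuine = speech + 0.1 * rng.random(300)  # mouth tracks the audio
fake = rng.permutation(speech)            # mouth unrelated to the audio

print(av_sync_score(speech, genuine))  # high, close to 1.0
print(av_sync_score(speech, fake))     # near 0.0
```

Real detectors are far more involved (face tracking, phoneme-viseme alignment), but the underlying signal — audio and video that fail to co-vary — is the same one the complainants noticed by eye.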

Potential Motives of Deepfake Attacks

While the advisory didn’t specify the motives for these attacks, it did note that the positions applied for by these fraudsters were ones with some level of corporate access to sensitive data or systems.

Thus, security experts believe one of the most obvious goals of deepfaking one's way through a remote interview is to place a criminal inside an organization, positioned for anything from corporate espionage to common theft.

“Notably, some reported positions include access to customer PII, financial data, corporate IT databases and/or proprietary information,” the advisory said.

“A fraudster that hooks a remote job takes several giant steps toward stealing the organization’s data crown jewels or locking them up for ransomware,” says Gil Dabah, co-founder and CEO of Piiano. “Now they are an insider threat and much harder to detect.”

Short-term impersonation might also be a way for applicants with a "tainted personal profile" to get past security checks, says DJ Sampath, co-founder and CEO of Armorblox.

“These deepfake profiles are set up to bypass the checks and balances to get through the company’s recruitment policy,” he says.

Beyond gaining access to steal information, foreign actors could also be deepfaking their way into US firms to generate revenue that funds other hacking enterprises.

“This FBI security warning is one of many that have been reported by federal agencies in the past several months. Recently, the US Treasury, State Department, and FBI released an official warning indicating that companies must be cautious of North Korean IT workers pretending to be freelance contractors to infiltrate companies and collect revenue for their country,” explains Stuart Wells, CTO of Jumio. “Organizations that unknowingly pay North Korean hackers potentially face legal consequences and violate government sanctions.”
