The FBI warned on Tuesday that a growing number of people are using deepfake technology to pose as someone else in live job interviews.

The FBI said it has seen an increase in complaints about people superimposing videos, images, or audio recordings of others onto themselves during live job interviews. The agency said the complaints were tied to remote tech roles that would have given successful candidates access to sensitive data.

Deepfakes can be used for entertainment, but they can also cause real harm. Meta, for example, removed a deepfake video that claimed to show Ukrainian President Volodymyr Zelenskyy ordering Ukrainian forces to lay down their arms.

The harm deepfakes could inflict on private individuals is equally concerning. The Department of Homeland Security warned in a report about the use of deepfake technology to target private individuals who do not command public attention.

Fraud in tech hiring is nothing new. Some candidates hire outside help to assist them during interviews in real time, and the trend appears to have worsened during the pandemic. In May, recruiters discovered that North Korean scammers were posing as American job seekers to land roles at tech companies.

Now, scammers appear to be enlisting artificial intelligence to help them land jobs. The FBI did not specify how many incidents it has recorded.

Anti-deepfake technologies are far from perfect

In 2020, the number of known deepfake videos online reached more than 145,000, nine times as many as the year before.

There are technologies and processes designed to detect deepfake videos, but they remain unreliable. A report from Sensity, a threat-intelligence company based in Amsterdam, found that anti-deepfake technologies accepted deepfake videos as real in almost all instances.

Unusual blinking, soft focus around skin or hair, and odd lighting are some of the telltale signs of deepfakes.
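
For readers curious how the blinking check might be automated, here is a minimal Python sketch, not an FBI or Sensity tool, that counts blinks in a recorded interview using the common eye-aspect-ratio heuristic. It assumes the third-party OpenCV and MediaPipe libraries are installed, and the landmark indices and 0.21 threshold are illustrative defaults rather than tuned values.

```python
# Sketch only: flag unusually low blink rates in a video via the
# eye-aspect-ratio (EAR) heuristic. Requires opencv-python and mediapipe.
import cv2
import mediapipe as mp
import numpy as np

# Commonly used eye landmark indices for MediaPipe's 468-point face mesh.
LEFT_EYE = [33, 160, 158, 133, 153, 144]
RIGHT_EYE = [362, 385, 387, 263, 373, 380]

def eye_aspect_ratio(pts):
    # Ratio of vertical eye opening to horizontal eye width; drops when closed.
    vertical = np.linalg.norm(pts[1] - pts[5]) + np.linalg.norm(pts[2] - pts[4])
    horizontal = np.linalg.norm(pts[0] - pts[3])
    return vertical / (2.0 * horizontal)

def count_blinks(path, ear_threshold=0.21):
    blinks, closed = 0, False
    cap = cv2.VideoCapture(path)
    with mp.solutions.face_mesh.FaceMesh(refine_landmarks=True) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not result.multi_face_landmarks:
                continue
            lm = result.multi_face_landmarks[0].landmark
            h, w = frame.shape[:2]
            get = lambda idx: np.array([(lm[i].x * w, lm[i].y * h) for i in idx])
            ear = (eye_aspect_ratio(get(LEFT_EYE)) +
                   eye_aspect_ratio(get(RIGHT_EYE))) / 2.0
            # Count a blink on each open-to-closed transition.
            if ear < ear_threshold and not closed:
                blinks, closed = blinks + 1, True
            elif ear >= ear_threshold:
                closed = False
    cap.release()
    return blinks

# People typically blink roughly 15-20 times per minute on camera; a long
# recording with very few blinks may warrant a closer look.
print(count_blinks("interview.mp4"))
```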

The FBI also offered a tip for spotting voice deepfakes: the actions and lip movements of the person interviewed on camera do not fully match the audio of the person speaking. The agency said that actions such as coughing, sneezing, or other audible cues may not be aligned with what is presented visually.
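
A toy illustration of that sync check: if the visible lip motion tracks the audio, the two signals correlate strongly; in a deepfake they may not. The arrays below are synthetic stand-ins for per-frame mouth-opening measurements and per-frame audio loudness, so the numbers are illustrative only; real extraction would need a face tracker and an audio library.

```python
# Toy sketch of an audio-visual sync check using synthetic signals.
import numpy as np

rng = np.random.default_rng(0)
frames = 300  # roughly 10 seconds at 30 fps

# Synthetic per-frame audio loudness and two lip-motion traces.
audio_energy = np.abs(np.sin(np.linspace(0, 20, frames))) + 0.1 * rng.random(frames)
genuine_lips = audio_energy + 0.1 * rng.random(frames)  # moves with the speech
deepfake_lips = rng.random(frames)                      # unrelated motion

def sync_score(lips, audio):
    # Pearson correlation between lip opening and audio loudness per frame.
    return float(np.corrcoef(lips, audio)[0, 1])

print(f"genuine:  {sync_score(genuine_lips, audio_energy):.2f}")   # close to 1
print(f"deepfake: {sync_score(deepfake_lips, audio_energy):.2f}")  # near 0
```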

People who have identified deepfake attempts should report them to the FBI.