The Rise of Fake Job Applicants
Apr 14, 2025
You've probably heard of catfishing on dating apps, but it's happening in job applications, too.
Hiring managers today face a strange new problem: the rise of fake applicants. These aren't just people exaggerating their skills. Some use deepfake-powered AI filters to change their appearance during interviews, and even stolen identities, to trick companies into giving them remote jobs. It sounds like a movie plot, but it's happening more and more often, prompting recruiting firms to be extra vigilant.
AI-Generated Faces and Fake Resumes Are Slipping Through
Dawid Moczadlo, co-founder of Vidoc Security Lab, shared on LinkedIn how he interviewed someone online who looked perfect for the job, until his team realized it wasn't even a real person. The candidate had used software to change his appearance on camera, and his whole identity was fake. It was the second time this had happened to the company. Now, they fly candidates in for a full day of in-person interviews, covering all travel costs, to make sure each applicant is who they say they are.
If these fakes could fool Moczadlo, a security expert, what chance do regular hiring managers have?
Pindrop's CEO, Vijay Balasubramaniyan, another security expert, revealed that one out of every 343 job applicants is actually from North Korea. He shared the company's encounter with a fake candidate nicknamed "Ivan X," who used deepfake tech to mask his identity. Once hired, these workers would allegedly ask for company laptops to be sent to a different address, usually citing some urgent matter behind the sudden location change. That address often leads to a so-called "laptop farm" managed by a US resident working with the North Koreans, allowing them to access the device remotely.
Even Background Checks Can Be Fooled
At KnowBe4, a cybersecurity company, everything looked normal when they hired a new software engineer. The resume checked out, the background check was clean, and video interviews matched the profile photo.
However, as soon as the company laptop was delivered and activated, the new hire began loading malware onto it, and the company's Endpoint Detection and Response (EDR) software flagged the suspicious behavior. The person had used a stolen identity belonging to a real US citizen, and an AI-enhanced photo helped them pass as legitimate. KnowBe4 eventually reported the incident to the FBI.
Why Is This Happening?
In many cases, it's all about money. Some people join "candidate farms," often located outside the US, where skilled teams work behind the scenes through undisclosed outsourcing. One person with strong English skills might handle the interview, while others quietly do the work for much less pay. It's a way to "outsource" jobs without telling the company: the employer thinks it hired one person but got a whole secret team, posing a risk to its security.
One anonymous internet user summed it up like this: "If the job can be done remotely, then there's someone in a sweatshop/call center serving as The Face when necessary, and the work gets farmed out to teams working for slave wages. It's a form of involuntary outsourcing by not letting the hiring companies know the job has been outsourced."
In more serious cases, like those involving North Korean workers, the motive may go beyond money. Some believe it's a form of corporate espionage, spying on companies and allegedly stealing data to benefit the regime back home.
How Are Companies Fighting Back?
Companies are scrambling to find ways to stop this. One recruiter said, "We now track the IP addresses of video call participants to confirm they're in the US during the interview. If they aren't, they're asked specific follow-up questions. If their answers don't make sense, like saying they're 'on vacation,' they're immediately disqualified."
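To make that concrete, here is a minimal sketch of what such a check might look like, assuming the hiring team can pull participant IP addresses from its video-call logs. It uses the free ip-api.com geolocation endpoint purely as an example; the sample IPs and the flag_participant helper are illustrative, not a description of any recruiter's actual tooling, and any IP-to-country service would work the same way.

```python
import requests

def lookup_country(ip: str) -> str | None:
    """Resolve an IP address to a country code via ip-api.com.
    Returns e.g. "US", or None if the lookup fails."""
    resp = requests.get(f"http://ip-api.com/json/{ip}", timeout=5)
    data = resp.json()
    if data.get("status") == "success":
        return data.get("countryCode")
    return None

def flag_participant(ip: str, expected_country: str = "US") -> bool:
    """Return True if the IP resolves outside the expected country
    (or can't be resolved) and so warrants follow-up questions."""
    return lookup_country(ip) != expected_country

if __name__ == "__main__":
    # Illustrative IPs standing in for addresses pulled from call logs.
    for ip in ["8.8.8.8", "203.0.113.45"]:
        if flag_participant(ip):
            print(f"{ip}: resolves outside the US -- ask follow-up questions")
        else:
            print(f"{ip}: resolves to the US")
```

Note that a geolocation lookup alone is easy to defeat with a VPN, which is presumably why the recruiter pairs it with follow-up questions rather than treating it as proof on its own.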
Other hiring teams require applicants to appear in person at some point during the process to deter these fake applicants. "Just adding an in-person meeting as a requirement for the application can scare off a lot of fakes. When told they'll need to show up, many applicants suddenly vanish."
Some interviewers have even started asking unusual questions to catch impostors off guard. At a cybersecurity conference, Adam Meyers, Senior VP of Counter Adversary Operations at CrowdStrike, said he asks, "How fat is Kim Jong Un?" during interviews. It sounds odd, but it works. If the applicant is secretly from North Korea, they're likely to hang up or refuse to answer, fearing backlash from their government.
Why This Is So Alarming
This whole trend is worrying for several reasons. First, when fake candidates slip through using AI-generated faces or stolen identities, they take away opportunities from genuine job seekers who worked hard to earn their skills and experience.
Second, it poses a serious security risk to companies. Take KnowBe4's case: they hired someone who had passed every standard step of the hiring process, yet the moment the new hire received the company device, they began loading malware onto it. That's not just a red flag; it's an emergency.
These types of fake hires aren't just after a paycheck. In more alarming cases, they're part of state-sponsored operations, like the case of Pindrop's "Ivan X," a deepfake job applicant allegedly from North Korea.
There's also the issue of undisclosed outsourcing. This deception means that companies lose control over who is handling their code, customer data, or internal tools. It's like hiring a single contractor and ending up with an entire unverified third-party operation doing the work behind the scenes.
Because of these risks, companies are rethinking how they approach remote hiring altogether. Some are dialing back remote roles, insisting on in-person interviews, or adding stricter layers of verification. What was once praised as the flexible future of work is now under scrutiny. If companies can't trust who's on the other side of the screen, they may start treating remote workers with more suspicion, or eventually stop hiring remotely at all.
Track Your Applicants with a Modern ATS
With fake candidates on the rise, from AI-altered photos to deepfake interviews and stolen identities, recruiters need more than a resume to rely on. The stakes of placing the wrong person are even higher for executive search and contingent firms.
Stardex AI helps you spot red flags early on.
Stardex is an AI-powered applicant tracking system built for high-stakes recruiting. When a fresh lead enters your pipeline, Stardex can generate instant summaries about the candidate using publicly available data like their LinkedIn profile. And if someone claims to be a VP of Marketing, Stardex can quickly break down their stated experience, role history, and industry background, giving you a clear snapshot before the first call.
As new candidates come in, Stardex gives your team tools to collaborate on every lead, track each touchpoint, and log real-time gut checks. It makes it easy to flag suspicious behavior, like candidates who refuse to turn on their camera during virtual interviews or avoid in-person meetings without a clear reason.
Book a demo with Stardex and see how this ATS helps recruiters catch red flags before the offer letter goes out.