
Are We Now Trading Virtual for Reality?
Sep 05, 2025 / Written by: Gary Isbell
In-Person Interviews Return in an AI-Dominated World
Artificial intelligence (AI) is turning business job interviews upside down. Applicants are exploiting remote interviews to cheat, and many companies end up with incompetent employees who harm their bottom lines.
The AI Race in Hiring
Indeed, AI has introduced a new set of challenges to the hiring process. Recruiters and employers can no longer rely on traditional in-person interviews alone to judge character and competence. They must now contend with AI-driven deception, in which candidates exploit algorithms to outsmart companies and secure high-paying jobs.
While job-seeking tools like AI assistants can level the playing field for less-experienced applicants, the deceptions they enable have become extremely convincing. Employers now face the dual challenge of detecting fraud while adjusting their hiring strategies to benefit from AI’s legitimate advantages. Companies that invest in both human and technological vetting systems are better equipped to identify and retain genuine talent.
Gartner, a research and advisory firm serving technology companies, foresees a bleak outlook for hiring. The firm projects that 26 percent of applicant profiles will be fake by 2028. This projection points to a rising trend of candidates disguising their true skills or identities and using AI to inflate their credentials.
AI Tools and Interview Cheating
In recent years, virtual interviews have become common as remote work has spread and companies have sought to speed up hiring. These remote settings give some candidates the opportunity to use AI to cheat during the interview itself.
The rise of scams has sparked a growing algorithmic arms race between candidates and recruiters, with both sides using AI. Applicants use AI to tailor their resumes, generate engaging cover letters and automate job applications. Using these tools, an unscrupulous applicant can submit hundreds of applications with just a few clicks. This influx overwhelms hiring managers, who, in turn, increasingly rely on AI tools to sort through the applications.
Gartner’s research found that six percent of job seekers admitted to engaging in some form of interview fraud. Common methods include impersonating someone else or having a stand-in appear at in-person interviews.1
The Scourge of Fake Candidate Profiles
Other candidates use more sophisticated methods. Some rely on AI to generate coding solutions off-screen during live interviews that include technical challenges. This raises serious concerns, especially when filling software engineering roles where troubleshooting and precise coding are crucial.
AI has also enabled con artists to create advanced deepfake videos and audio. These methods allow scammers to impersonate legitimate job candidates. The Federal Bureau of Investigation (FBI) warns that fraudsters are using AI to manipulate their identities. One investigation revealed thousands of North Korean operatives allegedly securing remote tech jobs through fake profiles. The FBI currently has fifteen North Korean IT scammers listed as most wanted.2
Employers Turn to Old-School Methods
Confronted with the growing unreliability of virtual interviews, companies are combating AI deception by shifting back to in-person interactions. Firms like McKinsey, Cisco and Google have made these face-to-face encounters part of the interview process. The goal is simple: evaluate a candidate’s skills and qualifications accurately, free from the influence of AI tools or digital deception.
Google, for example, has added in-person rounds for some roles to ensure that candidates meet job requirements. Sundar Pichai, Google’s CEO, shared in a June podcast that in-person interviews help establish whether candidates possess “the fundamentals,” particularly in areas like real-time coding.3
Similarly, after facing issues with candidates lying about their locations or credentials, Cisco has added in-person interviews to its hiring process. Often, simply scheduling an in-person interview is enough to expose fraudsters, who go silent rather than show up in person.
New Tech
Even as companies return to traditional hiring methods, they are also reinforcing their processes with other means of verification. Background checks, biometric identification and deepfake detection services are becoming common in some hiring strategies.
Employers are also becoming adept at watching for signs of fraud during virtual interviews. Off-screen whispering, delayed responses or typing noises often indicate an applicant’s use of unauthorized tools.
A Distrustful Hiring Landscape
The widespread use of AI in fraudulent hiring practices increases mistrust between job seekers and employers. This skepticism, in turn, makes it harder for honest candidates to stand out.
The escalating fight against AI fraud requires both vigilance and adaptability. By combining old-fashioned one-on-one meetings with cutting-edge technology, recruiters can strike a balance that assesses both an applicant’s integrity and ability.
A few years ago, scammers relied on clumsy emails filled with typos and false promises of huge lottery wins. Those methods feel almost nostalgic now.
Today’s AI scams are a much larger nightmare. They’re sophisticated, disturbingly convincing and emotionally manipulative. Detecting and defeating them requires continual vigilance.
Old-fashioned solutions like face-to-face interviews work because integrity and virtue are much harder to fake in person. Trust and confidence come only from actions and a presence that reflect honesty and dependability.
Footnotes: