Trusting the Algorithm? Rethinking AI in Frontline Recruiting

July 22, 2025

In today’s digital hiring landscape, Applicant Tracking Systems (ATS), bots, and AI have become the industry standard and the new gatekeepers of job opportunities. Designed to streamline recruitment, these tools sift through thousands of résumés using keywords, filters, and multilayered automation. They have become not only tools but a lifeline for Human Resources teams stretched thin. In the past three years, companies have reduced their average HR staff by more than 6%, with further reductions on the horizon. In some ways, AI-enabled recruitment tools have been a self-fulfilling prophecy: the more they are used, the less necessary human administrators appear. As a result, positions are eliminated and usage escalates from basic adoption, to total reliance, to a reckless abandonment of human beings in human management. What’s more, now that this pattern has emerged, we’re discovering its hidden costs.

All Horses, No Zebras 

As the saying goes, when you hear hooves, think horses, not zebras. Within an AI-enabled screener, past decisions become the determiner of future preferences. Our own conception of what success in a role looks like determines what the filters grade positively, and thus a confirmation bias is born.

So once the system has a profile for a horse, it will continue to spit out more horses. A zebra, however, while sharing many attributes with a horse, will be filtered out because it brings its own unique stripes. As a result, a company that only searches for the ordinary will never encounter the extraordinary.

Discrimination and Litigation 

One of the most concerning issues is the potential for discrimination baked into the algorithms. It’s one thing to inadvertently miss out on top talent; it’s another to disregard entire population groups. Many screeners filter applicants on rigid criteria (gaps in employment, missing buzzwords, nontraditional career paths), often penalizing those who already face systemic barriers in the job market, such as older workers, younger workers, people with disabilities, or those from underrepresented communities. Recently Derek Mobley, an IT professional based in North Carolina, went toe to toe with the filters and discovered a pattern linking the ATS in use to the frequent rejection of his résumé. On the strength of those findings, he has pursued litigation against ATS software companies to push for greater regulation and accountability of their products.

Lost Storytelling

If there is one human trait that transcends all cultures, creeds, and values, it is our use of allegory. From cave paintings to the boardroom, humans have told stories not only to convey information but to share what we value as individuals and as a society. When real people review résumés, they notice the stories bots miss: passion, resilience, and transferable skills. A candidate who spent a year caring for a family member or launching a small business may bring a wealth of soft skills and dedication. These are the kinds of traits, skills, and attributes that don’t show up in keyword searches but are vital to a thriving workplace.

The best hiring decisions are made not just by scanning résumés but by listening to stories. Real conversations uncover drive, creativity, and cultural fit in ways no algorithm can. Companies that take time to connect with candidates personally often find better long-term hires, more diverse teams, and stronger workplace morale. Technology can be a powerful ally in recruiting—but only when balanced with humanity. It’s time to reintroduce the personal touch into hiring and remember that behind every résumé is a person with potential. Let’s not let good people get filtered out.

©2026 Frank Resource. All rights reserved.