Your Next Recruiter Could Be an Algorithm

Entelo’s database contains profiles of more than 500 million workers. Odds are, you’re one of them.

I am. My name, photograph and job title are in the system, although I didn’t provide any of this information directly to Entelo, a recruitment software company whose existence I only recently discovered.

Much of the data in my profile is readily available on LinkedIn and other websites I use to showcase my professional experience. But some of the insights dig deeper. For example, by estimating the turnover rate at my current company and noting my impending work anniversary, Entelo predicts that I am “likely to move.”

That’s a signal to Entelo’s recruiter clients who might want to hire a writer that I could be open to a job offer — and that now is the right time to strike.

Machine learning is remaking the hiring process. Already, algorithms attempt to predict how long job seekers are likely to stay in particular roles, recruitment chatbots converse with candidates via text message and video software measures the micro-expressions that cross people’s faces during recorded interviews.

Billed somewhat dubiously as “neuroscience” and “artificial intelligence,” these tools rely on statistics, algorithms, machine learning and psychology. Some promise to discern not only a worker’s skills but also her character, divining whether she’s smart, hard-working, brave or ambitious.

To a job seeker, this technology may seem as inscrutable as magic. Accordingly, skeptics abound. Worker advocates worry about its fairness. Privacy hounds fret about its power to amass personal data. And scientists question whether it’s reliable and valid.

“Just because your decisions involve data doesn’t mean they’re more objective,” says Solon Barocas, assistant professor of information science at Cornell University. “The idea you would be able to infer personality traits that are genuinely relevant to hiring decisions from video interviews seems suspect to me.”

But employers are determined to try. Workers would be wise to prepare for the new ways data is being used to place them in jobs — or block them from opportunities.

Job-Seeking Workers

Ten years ago, entry- and mid-level workers were unlikely to draw notice from headhunters, who reserved their energy mostly for executives. But over the past decade, recruiters have started “moving progressively down the stack,” using technology to target workers at less senior levels who may be open to new employment opportunities, says Jon Bischke, founder and CEO of Entelo, which has 700 corporate customers.

“We feel the world is shifting from one where people find jobs to one where jobs find people,” he says. “Having a more robust, up-to-date web footprint, you’re more likely to be found.”

This idea inspired Entelo and several other companies, like the start-up-focused Riviera Partners, to build worker-profile databases that employers pay to search for people who might be good candidates for particular roles. To build these profiles, algorithms scour the internet for information, matching jobs with workers in a manner allegedly more sophisticated than relying on resume keywords.

For example, for a graphic designer, Entelo’s systems might scrape thumbnails of her art from her online portfolio, use her employment history to conclude what size companies she prefers to work for and how frequently she likes to change jobs, draw on compensation benchmark data to estimate her salary requirements and hunt for any other tidbits that “give the full picture of an individual,” Bischke says.

Workers can’t directly alter the pictures these companies paint of them. But they may be able to influence their profiles by being savvy about what information they share online, effectively changing the bait they leave out for hungry recruiters.

“If you have a certification or a special skill, you want to make sure that certification, that skill, shows up online so people can find you,” Bischke says.


Waiting for Work

Making matches is the first challenge. The second is getting candidates to pay attention to unsolicited job offers.

After selecting workers who fit an employer’s criteria, Entelo draws on millions of data points about what makes someone more or less likely to open an email, such as the time of day it’s sent and the sender’s job title, and uses them to contact workers in precisely the right ways on behalf of recruiters. The technology learns from each transaction, using the results to improve its future performance. With each offer email opened and worker hired, the machine gets smarter.

Although such systems may improve efficiency, they also make workers more passive in the hiring process. In a future where people wait for algorithms to screen them for jobs, they may never know whether they’re being considered or passed over, or on what grounds, leaving little opportunity to collect evidence of discrimination.

“Traditionally, you were called into an interview to interact face-to-face and could get the sense that maybe you were being assessed unfairly. A lot of case law involved these stray remarks,” Barocas says. “All that is missing now. We have no direct interaction with anyone to arouse suspicion.”

Questions about worker agency and fairness are important for companies like Entelo to consider, Bischke acknowledges.

“It’s really important that we are held accountable as software vendors to be able to show the work we’re doing is not leaving people behind,” he says. “It’s something we all need to be scrutinizing.”

To that end, Entelo enables its clients to run recruitment searches that hide information that has traditionally been used to discriminate against certain job seekers, such as their names, genders, graduation years, alma maters, images and gaps in employment history.

“We’re trying to remove information that would lead someone to be unfairly biased,” Bischke says.


Grappling With Bias

Indeed, some companies tout their offerings as cures for the intractable ills of employment discrimination. Their mantra: hiring based on merit, not background. Their methods: digital games, recorded interviews and tests that assess applicants’ attitudes.

One such business was founded by a man frustrated that the college he attended didn’t impress finance companies that hire heavily among Ivy League alumni. Called HireVue, the video platform presents all people who apply for a particular job with an identical set of interview questions designed by a team of psychologists to elicit responses that predict success or failure in the role.

The inquiries don’t focus on academic pedigrees or grade point averages. Instead, they’re based on criteria provided by hiring managers and analysis of the traits exhibited by the employer’s top performers.

After job seekers record their answers, HireVue’s system sorts through the responses, analyzing thousands of data points stemming from what candidates said, how they said it and the personality traits they exhibited while speaking. Algorithms search for signs of conscientiousness, problem-solving skills and flexibility, based on tiny tells in word choice and facial expression that data scientists have taught them to consider.

The company’s philosophy is that “everyone should have equal access to jobs,” says Nathan Mondragon, chief industrial-organizational (IO) psychologist at HireVue.

Few would disagree with that sentiment. But there’s little consensus about whether algorithms can realize the vision.

Using structured questions does help eliminate bias that can arise in traditional interviews when recruiters alter their inquiries based on candidates’ ethnicities and genders, according to studies published in the Journal of Personnel Psychology and Psychological Science. And HireVue’s system purposely disregards irrelevant information, like a candidate’s age or outfit, that can distract human recruiters from recognizing talent, Mondragon says.

But bias can be baked into algorithms in subtler ways, too. Machine learning is only as smart as the data set it emulates. Using a company’s top performers as models for hiring risks excluding people who have always been underrepresented in that organization, be they women, people of color or people with disabilities.

“The training data really matters enormously to this question of bias,” says Barocas, who has testified about algorithms and hiring discrimination before the U.S. Equal Employment Opportunity Commission. “Whatever patterns are present will dictate the kind of decisions about who you will hire in the future. You’re likely to replicate those exact same problems.”

For its part, HireVue “goes to great lengths” to assess its clients’ possible biases before building each customized artificial intelligence system, Mondragon says, and accounts for differences in job candidates’ word choice and level of eye contact that may stem from culture or gender, not ability. If a candidate has a speech impediment that interferes with data collection, the system is designed not to reject his video but instead flag it for a human to review.

New Tools Lack Evidence

Machine-learning hiring practices are in their nascency, and scientists caution employers and workers against investing too heavily in those that haven’t offered strong evidence of efficacy.

“When I have looked at technical reports for these human resources big data companies, they provide little information and data-driven findings about the reliability, validity and fairness of their measures,” says Fred Oswald, psychology professor at Rice University and former president of the Society for Industrial and Organizational Psychology.

Proof is hard to come by. Fundamentally, it’s just plain difficult to identify exactly what traits make an employee successful in the first place, Oswald says.

And it’s hard to design assessments that accurately measure those traits. After all, a typical annual performance review is hardly an objective exercise; the results are just as likely to reveal a manager’s personal preferences as a worker’s talent, Barocas says.

In the same way, a poorly designed digital game intended to test a job candidate’s attention to detail, honesty or risk tolerance may actually measure something totally unrelated, such as hand-eye coordination.

“Where’s the evidence that I’m actually measuring teamwork in a game?” Oswald says. “If it’s correlated with video-game experience, that would be a problem.”


Shifting Search Tactics

The dearth of evidence hasn’t stopped companies from adopting these tools. While academics wait for convincing research, savvy job seekers may need to adapt.

Luckily for workers, algorithms aren’t solely the purview of employers. Employees, too, can now access data tools intended to aid their job searches, although they should be similarly cautious about systems that don’t back their claims with proof.

Using processes familiar to online daters and ride-share customers, the platform Tilr asks gig workers to indicate their work skills via smartphone swipes, then matches them with short-term job opportunities. It takes a cut of users’ hourly rates and offers them benefits not normally available to contract workers.

Another company, TalentWorks, scours online job postings to identify full-time opportunities that match users’ qualifications and interests, delivering about 25 each day per person. It draws on statistical analysis to “optimize” job seekers’ resumes, recommending they delete objective statements, leave off work experience that lasted less than six months and hide their graduation years if they’re older than 35. Then, it auto-fills job applications and submits materials at times it predicts hiring managers will be most receptive, such as Mondays and mornings.

Worker-focused tools tend to be relatively pragmatic, concerned less about revealing hidden character traits than simply reducing the “pain” involved in hiring, says Kushal Chakrabarti, founder and CEO of TalentWorks.

“There’s so many little nuances that go into the job search and whether you’re going to be successful or not,” he explains. “There’s a lot of things we can solve purely through numbers and machine learning.”

Though “smart” hiring systems differ in their particulars, a few common lessons have emerged, experts say. Workers should read directions carefully and take advantage of any practice opportunities the platforms provide. They should treat recorded interviews as seriously as in-person conversations, peppering their responses with key phrases that match job descriptions.

And they should adjust to the fact that most algorithmic systems use skills as currency, giving technical and interpersonal details more weight than previous titles, years of experience or academic credentials.

“I would encourage job seekers, even if they’re using a static resume, to really focus on highlighting their skills,” says Carisa Miklusak, CEO of Tilr. “The power of skills for a job seeker is incredible.”


Your Next Recruiter Could Be an Algorithm originally appeared on usnews.com
