

Technology Is Improving Hiring Practices




When you look around your workplace, do you see a lot of people who look similar or come from similar social backgrounds? If so, there could be a problem with your company’s recruitment strategy.

While teams like this can crop up randomly, they are often an indicator of bias in the hiring process, which can happen unconsciously even when all involved have the best intentions. That bias has damaging effects on society and the wider economy and could put your company at a disadvantage.

Thankfully, a new generation of technology has made it easier than ever to remedy the situation.

How Does Unconscious Bias Happen?

Very few people want to be prejudiced. When you're hiring, you want to find the best person for the job. Prejudice can creep in, however, in all sorts of subtle ways. We all carry an instinctive bias towards people who remind us of ourselves, a tendency that can be unlearnt but that often shapes our decisions without our noticing.

We might not be aware, for instance, that we have racist assumptions about what looks professional or sexist assumptions about who will be best suited to particular roles.

We might be quick to dismiss people with foreign-sounding names, unconsciously assuming they are cultural outsiders who won't fit in with the team, or imagine that an LGBT person will have a messy social life that could interfere with work.

Often these biases are so deeply ingrained, due to the way we’ve grown up, that we don’t notice them – and as a result, we turn away people who could be perfect for the job, while some highly capable people struggle to find jobs anywhere.

Looking through a big data lens

How do we know that bias occurs in the hiring process? Not so long ago, research on the question was resource-heavy and difficult to do. Increases in computing power have changed that: it has become much easier to analyse very large data sets, whether covering a group of companies or a single large one. The result is clear evidence that certain groups of people find it harder to secure work than others, despite having equally good qualifications.
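As a rough illustration of the kind of audit such data sets make possible, here is a minimal Python sketch that compares hire rates among equally qualified applicants across groups. The records and group labels are invented for the example; a real audit would draw on a large HR data set.

```python
from collections import defaultdict

# Hypothetical applicant records: (group, qualified, hired).
records = [
    ("A", True, True), ("A", True, True), ("A", True, False), ("A", True, True),
    ("B", True, True), ("B", True, False), ("B", True, False), ("B", True, False),
]

def hire_rates(records):
    """Hire rate among equally qualified applicants, per group."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for group, qualified, was_hired in records:
        if qualified:  # compare like with like
            total[group] += 1
            hired[group] += was_hired
    return {g: hired[g] / total[g] for g in total}

rates = hire_rates(records)
# A large gap between groups with equal qualifications flags possible bias.
```

With the toy data above, group A is hired 75% of the time and group B only 25% of the time, the sort of disparity that would prompt a closer look at the process.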

We can also pinpoint the places where hiring practices fail to correspond to corporate equality policies, so we can see the gap between what people say they want to achieve – often sincerely – and what actually happens. This can be addressed to an extent through training, but increasingly employers are turning to a new, technological approach to solving the problem.

AI candidate matching

If humans are innately biased, is there a way to take them out of the equation? Joanna Riley thinks so. Her company has developed artificial intelligence (AI) software that sorts through potential candidates in place of a human recruiter, assessing them on their qualifications and skills as well as on other factors found to predict a good match for the position an employer is seeking to fill.

AIs can also work much more quickly than humans, which means they can sift through vast numbers of potential candidates rather than just those who have directly applied. That improves the overall quality of the shortlist and significantly reduces the risk that talented people will be ruled out for reasons irrelevant to what they could bring to the organization.
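The article doesn't describe the software's internals, but a bare-bones version of skills-based candidate matching might look like the following Python sketch, in which candidates are keyed by anonymous IDs and scored purely on how many of the required skills they cover. All names and data here are hypothetical.

```python
def match_score(candidate_skills, required_skills):
    """Fraction of required skills the candidate covers (0.0 to 1.0)."""
    candidate = {s.lower() for s in candidate_skills}
    required = {s.lower() for s in required_skills}
    if not required:
        return 1.0
    return len(candidate & required) / len(required)

def shortlist(candidates, required_skills, top_n=2):
    """Rank candidates by skill coverage alone -- no names, photos,
    or other bias-prone attributes enter the score."""
    ranked = sorted(candidates.items(),
                    key=lambda item: match_score(item[1], required_skills),
                    reverse=True)
    return [cid for cid, _ in ranked[:top_n]]

pool = {"c1": ["Python", "SQL"], "c2": ["Python"], "c3": ["Photoshop"]}
picks = shortlist(pool, ["python", "sql"])  # c1 scores 1.0, c2 scores 0.5
```

The point of the design is what the scoring function *cannot* see: because only skills enter the calculation, attributes that commonly trigger unconscious bias never influence the ranking.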

A learning process

The first generation of AIs to do this kind of work came in for some criticism, and rightly so. Although they could be blocked from considering factors like age or sex, they could still home in on correlated proxies, for instance by discriminating against people who went to girls' schools, because they were trained on biased choices that humans had made in the past.

Today’s AIs have been heavily refined to reduce that kind of risk. A new AI is like a child – it can only make decisions based on what it learns as it navigates the world. Good parenting, teaching it to steer clear of certain kinds of thinking or influence, can enable it to make much better choices.
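One way this "good parenting" can be approximated in practice is to strip both protected attributes and their known proxies from candidate records before a model ever sees them. A minimal sketch, with hypothetical field names:

```python
# Attributes the model must never see, and fields known to act as
# proxies for them (a school name or postcode can encode sex or ethnicity).
PROTECTED = {"age", "sex", "name"}
KNOWN_PROXIES = {"school_name", "postcode"}

def sanitize(candidate):
    """Return a copy of the record with protected and proxy fields removed."""
    blocked = PROTECTED | KNOWN_PROXIES
    return {k: v for k, v in candidate.items() if k not in blocked}

record = {"skills": ["sql"], "sex": "F", "school_name": "X", "years_experience": 5}
clean = sanitize(record)  # only skills and years_experience survive
```

Blocklists like this are only a first line of defence: subtler proxies have to be found statistically, by checking which remaining fields still correlate with the protected attributes.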

Why it matters

Selecting new employees in an unbiased way, or as close to that as possible, not only increases the size of the talent pool available to an organization; it also leads to more diverse business teams, which have been shown to correlate with business success.

Researchers believe this is probably because when people come from different backgrounds they have many different ways of looking at problems and coming up with solutions. Working together, they can make business more adaptable and innovative. With this in mind, it’s clear that cutting out unconscious bias in the hiring process has benefits for all concerned.