
What does diversity in AI mean for diversity in recruitment?

By Chatter, on 14th May 2018


If you’ve been to any HR or recruitment conference in the past six months, or scrolled through LinkedIn at any point ever, you won’t have been able to avoid the rise of AI, big data and bots. People across all industries are worried that they’ll soon be replaced by AI, bots or robotics. And while bots probably aren’t going to write the next smash-hit Harry Potter novel (J.K. Rowling is pretty safe by the looks of it), technology is going to continue to change the recruitment landscape. Great insight, I know.

But the people who are creating this technology are human. And given how they’re going to change the way people across all industries work it’s important to consider how these products are made, who is making them and how the lack of diversity in one industry might impact all of us.

The evidence

There’s already a whole host of concerning trends out there. If you do a Google image search for ‘doctor’ or ‘engineer’, the results are incredibly white and male. Voice recognition software has historically struggled if you have an accent or are female. Software used to predict criminal behaviour in the US may well be biased against black people. Skewed data sets feeding healthcare AI systems mean they’re likely to give better advice to white men. I could go on.  

AI systems often function as black boxes, which means it’s hard to identify any inequality, bias or discrimination behind any particular decision a system makes. This was the case when a university study found that significantly fewer women than men were shown online ads promising help getting jobs paying more than $200,000. It may have been how the advertisers had set up their ads, but it could equally have been the unintended consequence of the machine learning algorithms that power the recommendation engines. There was just no way of knowing.

All of this stems from a lack of diversity across software development and technology. Too many people share too similar a background. Because these systems are typically designed and built from the perspective of white men, the unintended consequence is that they favour white men. For a technology that is going to change the world so dramatically, this lack of diversity is something we should all be concerned about, not accept as the norm. And if, as predicted, these systems will be making hiring and firing decisions, or even just recommendations, it’s possible that diversity issues in one industry will snowball across many more.

So what can you do as a recruitment manager, an HR adviser, or a CEO when looking at a system which uses AI, machine learning or big data?

Always ask what’s being done to assess, measure or correct for any bias in the system you’re investing in and implementing. If you don’t get a decent answer, keep digging. Ask about the diversity of their development teams. Make sure the data sets you start out with are balanced and fair.
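One concrete check you could ask a vendor about, or run yourself on a system’s screening outcomes, is the “four-fifths rule”, a common US guideline that flags a selection rate for any group falling below 80% of the highest group’s rate. Here’s a minimal sketch on made-up numbers (the group names and figures are hypothetical, purely for illustration):

```python
# Minimal sketch: measuring adverse impact in screening outcomes.
# The four-fifths rule flags a group whose selection rate is below
# 80% of the highest group's rate. All data here is hypothetical.

def selection_rates(outcomes):
    """outcomes maps group name -> (number selected, total applicants)."""
    return {group: selected / total
            for group, (selected, total) in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Each group's selection rate as a ratio of the best group's rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

outcomes = {"group_a": (48, 100), "group_b": (30, 100)}
ratios = adverse_impact_ratios(outcomes)
flagged = [g for g, r in ratios.items() if r < 0.8]
print(flagged)  # group_b's rate (0.30) is only 62.5% of group_a's (0.48)
```

A passing answer from a vendor wouldn’t just be a number, of course: you’d want to know which groups they measure, at which stage of the funnel, and what they do when a ratio falls below the line.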

And what about the hiring process in general? 

It’s not all on the software you’re buying in.

Look at your attraction campaigns to make sure they appeal to a diverse audience - if you’re only targeting Russell Group universities for a graduate campaign, you’ve already selected an unrepresentative audience.

Review your job ads using our Readability and gender bias tool. Research has found that having feminine-coded ads will only have a slight effect on how appealing the job advertised is to men, and will encourage women applicants. But ads that have more stereotypically masculine words will risk putting women off applying.
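Checks like this typically work by scanning the ad against lists of stereotypically masculine- and feminine-coded words identified by the research. A minimal sketch of the idea, using a tiny illustrative word list (real tools use much more extensive lexicons):

```python
# Minimal sketch of a gender-coded-language check for a job ad.
# The word lists below are a tiny illustrative sample, not a full lexicon.

MASCULINE_CODED = {"competitive", "dominant", "driven", "fearless", "ninja"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "committed"}

def gender_coding(ad_text):
    """Return (verdict, masculine hits, feminine hits) for an ad's text."""
    words = {w.strip(".,!?:;()").lower() for w in ad_text.split()}
    masc = sorted(words & MASCULINE_CODED)
    fem = sorted(words & FEMININE_CODED)
    if len(masc) > len(fem):
        verdict = "masculine-coded"
    elif len(fem) > len(masc):
        verdict = "feminine-coded"
    else:
        verdict = "neutral"
    return verdict, masc, fem

verdict, masc, fem = gender_coding(
    "We want a driven, competitive ninja to join our supportive team."
)
print(verdict)  # masculine-coded
```

Even a crude scan like this will catch the “ninja wanted” school of job ad; the point is that the check is cheap to run before an ad ever goes live.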

And keep offline checks and balances in the recruitment process to make sure you’re seeing a range of candidates. Name-, age- and institution-blind recruiting can all help reduce unconscious bias from both recruiters and hiring managers. Consider a ‘Rooney Rule’ or diverse interview panels to avoid ‘groupthink’.

Society and businesses benefit when we’ve got a diverse workforce. And as we move towards the age of AI, it’s vital that we make sure everyone gets a fair shot.
