Sunday Times

HIRING BIAS

Can machines do any better?

By JENNIFER ZABASAJJA

Financial institutions are looking to shake off their pale, male and stale reputations.

But will handing over hiring decisions to machines make workplaces more diverse?

Experts in the field of artificial intelligence (AI) and recruitment weigh in on whether bias in machine learning models is a problem and, if so, what’s being done about it.

“I don’t think the goal should be to completely eliminate all possible biases in one fell swoop but to do better than the status quo and keep improving over time,” says Ariel Procaccia, associate professor in computer science at Carnegie Mellon University in Pittsburgh, in the US.

Procaccia says significant progress has been made in tackling the problem of bias in machine learning but that a complete fix is still a long way off. Researchers have identified sources of bias, defined formal notions of fairness, and designed AI algorithms that are fair according to those ideas, he says.

However, Procaccia says there are two obstacles to putting this into practice. “First, ironically, there is an embarrassment of riches when it comes to definitions of fairness and potential fixes, and it’s still unclear how to choose among them,” he says.

“Second, researchers have identified inherent trade-offs between notions of fairness and other qualities of AI algorithms; it seems that pushing bias out of algorithms must come at some cost to their effectiveness.”
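Procaccia’s point about competing definitions can be made concrete. The sketch below is purely illustrative and not drawn from any system mentioned in this article; it computes two commonly cited formal fairness notions on invented screening data, demographic parity (equal rates of positive decisions across groups) and equal opportunity (equal true-positive rates among qualified candidates), to show that the two notions measure different things and so can pull in different directions.

# Illustrative only: toy versions of two formal fairness notions.
# All names and numbers are invented, not taken from any system in the article.

def demographic_parity_gap(predictions, groups):
    """Gap in positive-decision rates between groups."""
    rates = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        rates[g] = sum(predictions[i] for i in idx) / len(idx)
    return max(rates.values()) - min(rates.values())

def equal_opportunity_gap(predictions, labels, groups):
    """Gap in true-positive rates among genuinely qualified candidates."""
    tprs = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g and labels[i] == 1]
        tprs[g] = sum(predictions[i] for i in idx) / len(idx)
    return max(tprs.values()) - min(tprs.values())

# Hypothetical screening decisions for eight candidates in two groups.
preds  = [1, 1, 0, 0, 1, 0, 0, 0]   # 1 = advance to interview
labels = [1, 1, 0, 1, 1, 1, 0, 0]   # 1 = actually qualified
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(demographic_parity_gap(preds, groups))         # 0.25
print(equal_opportunity_gap(preds, labels, groups))  # about 0.17

A model tuned to shrink one of these gaps will not automatically shrink the other, which is the trade-off Procaccia describes.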

“It’s a multi-step process,” says Ashutosh Garg, CEO and co-founder of Eightfold.ai, an AI-powered recruiting platform based in Mountain View, California.

Despite all the scepticism about the technology, Garg says it’s possible to train machines to be unbiased. You start by collecting data and models from thousands of sources, he says. You then remove “anything that can create division like gender, race, and ethnicity” from the data.

Machine learning systems can be optimised for equal opportunity, Garg says, and analytics can be used to detect and measure bias.
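Garg’s description is high-level, and the following sketch is a loose, hypothetical illustration rather than Eightfold.ai’s actual pipeline: protected fields are stripped from the records a model sees, while a simple analytics step keeps those fields available to audit the model’s decisions for group-level gaps. The field names, records and decisions are invented.

# Hypothetical sketch of the kind of steps Garg describes: strip protected
# attributes before modelling, then use simple analytics to audit outcomes.
# Field names, records and decisions below are invented for illustration.

PROTECTED = {"gender", "race", "ethnicity"}

def strip_protected(candidates):
    """Return copies of candidate records with protected fields removed."""
    return [{k: v for k, v in c.items() if k not in PROTECTED} for c in candidates]

def selection_rate_by_group(candidates, decisions, group_field):
    """Share of candidates advanced, broken down by a protected field."""
    totals, advanced = {}, {}
    for cand, decision in zip(candidates, decisions):
        g = cand[group_field]
        totals[g] = totals.get(g, 0) + 1
        advanced[g] = advanced.get(g, 0) + decision
    return {g: advanced[g] / totals[g] for g in totals}

candidates = [
    {"years_experience": 5, "gender": "F"},
    {"years_experience": 7, "gender": "F"},
    {"years_experience": 4, "gender": "M"},
    {"years_experience": 6, "gender": "M"},
]
model_inputs = strip_protected(candidates)   # the model never sees "gender"
decisions = [1, 0, 1, 1]                     # stand-in for a model's output

# The audit still uses the protected field, so group-level gaps stay visible.
print(selection_rate_by_group(candidates, decisions, "gender"))
# {'F': 0.5, 'M': 1.0}

Dropping a column is no guarantee of neutrality, since other features can stand in for it, which is why the auditing step matters and why researchers such as Buolamwini remain sceptical.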

But Joy Buolamwini, founder of the Algorithmic Justice League and a research assistant at MIT Media Lab in Massachusetts, says: “You’ll think you’re being neutral because you’re using data, but our research shows that’s not always the case.”

Buolamwini says researchers have been finding algorithmic bias in machine learning systems for years. “Now these systems are being sold and incorporated into the tools we use every day,” she says. “This is part of why we’re seeing the algorithmic bias.”

Buolamwini says in some cases machine learning tools are still at an early stage and aren’t an appropriate foundation for commercial applications such as recruiting bots. “If you don’t have [the right] foundation for building these systems, you’re going to perpetuate discrimination.”

Rashida Richardson, director of policy research at the AI Now Institute at New York University, says AI hiring tools are only as unbiased as the people who feed the systems data and interpret the results.

“Hiring is a multi-step process. If you’re not looking through the entire pipeline of that process and how this tool will interact with all of the other decision points, then you’re choosing to take a very narrow view on what you think that problem is.”

Richardson says research shows women and people of colour aren’t proportionately represented in higher-paying sectors.

“If you apply an AI hiring tool in that environment, it’s only going to accelerate that problem, favouring whoever is currently benefiting from the power structure within a company.”

Mekala Krishnan, a senior fellow at McKinsey Global Institute in Boston, says it’s important that “technology is made by diverse individuals”.

“Women make up about 20% or less of tech workers in developed economies, and so there’s a lot to be done to increase women’s participation.”

Picture: 123rf.com. Machine learning systems are being employed to reduce human bias in hiring and make the workplace more diverse.
