Photo by Christophe Hautier on Unsplash


The case of biased automated recruitment

After countless scandals about bias in AI systems, fairness has emerged as one of the major challenges of the field. However, AI fairness is both hard to understand and hard to implement. In this tutorial, we build and analyze a particular use case: automated recruitment, a topic that has made headlines in the past.

Throughout this tutorial, we leverage the AIF360 library developed by IBM Research.

Let us create a biased dataset from scratch

Say the goal is to recruit candidates for a given job (e.g. data scientist). There are two distinct populations: men and women.
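As a minimal sketch of such a setup, here is one way to generate a synthetic recruitment dataset with a deliberately biased hiring label, using only the Python standard library (no AIF360). The group encoding, hiring rates, and helper functions below are illustrative assumptions, not the article's actual construction; AIF360's `BinaryLabelDataset` could then wrap a dataset like this one.

```python
import random

random.seed(0)

def make_biased_dataset(n=10000, p_hire_men=0.6, p_hire_women=0.3):
    """Generate a hypothetical recruitment dataset.

    Each row has 'sex' (1 = man, 0 = woman, an assumed encoding) and
    'hired' (1/0). The label is deliberately biased: men are hired at
    a higher rate than women with otherwise identical generation.
    """
    rows = []
    for _ in range(n):
        sex = random.randint(0, 1)
        p = p_hire_men if sex == 1 else p_hire_women
        rows.append({"sex": sex, "hired": 1 if random.random() < p else 0})
    return rows

def selection_rate(rows, sex):
    """Fraction of candidates of a given group who are hired."""
    group = [r for r in rows if r["sex"] == sex]
    return sum(r["hired"] for r in group) / len(group)

data = make_biased_dataset()
# Disparate impact: ratio of the unprivileged group's selection rate
# to the privileged group's. A common rule of thumb flags values < 0.8.
di = selection_rate(data, 0) / selection_rate(data, 1)
print(f"disparate impact: {di:.2f}")
```

With the rates chosen above, the disparate impact lands near 0.5, well below the 0.8 threshold, which is exactly the kind of bias a fairness audit is meant to surface.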

We intentionally introduce three different…

Grégoire Martinon

Lead Data Scientist @ Quantmetry
