Gender biased algorithmic hiring


Building on the myth of the double blind, Criado Perez tackles the next big issue for "artificial intelligence": recruitment bias. Although it may be possible to overcome the limitations (and biases) of having a human screen CVs, algorithms, which are mostly written by men, fall prey to the limitations of their creators.

For example, a company called Gild used indicators of programming excellence to rank candidates. Those indicators included how often people engaged with a specific manga website. This, however, does not account for the fact that women may not have the time for such pastimes (see: how much unpaid work is done by women), or may be penalized for such anti-social behavior (see: expected female traits).

Therefore, Gild's system is biased towards men, but the company hides behind secrecy and the absurd notion of the purity of algorithms (see: what are algorithms).
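
To make the mechanism concrete, here is a purely hypothetical sketch of how a seemingly "neutral" proxy feature, like activity on a niche website, can tilt a candidate-scoring algorithm towards whoever has spare time to produce that signal. The feature names and weights are invented for illustration; Gild's actual model is not public.

```python
# Hypothetical illustration only -- not Gild's actual model, whose details are secret.
# It shows how a proxy feature (activity on a niche website) systematically favours
# the group that has more free time to generate that signal.

from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    years_experience: float
    manga_site_activity: float  # hours/week on the proxy site (invented feature)


def score(candidate: Candidate) -> float:
    """Toy linear score: experience plus a bonus for the proxy signal."""
    return 1.0 * candidate.years_experience + 2.0 * candidate.manga_site_activity


# Two equally experienced candidates; one has no spare time for the proxy activity
# (e.g. because of unpaid care work), so the algorithm ranks them lower.
a = Candidate("candidate_a", years_experience=5, manga_site_activity=4)
b = Candidate("candidate_b", years_experience=5, manga_site_activity=0)

print(score(a))  # 13.0
print(score(b))  # 5.0 -- same skill, lower score, purely because of the proxy
```

The point of the sketch is that the bias never appears as an explicit "gender" variable: it enters through a proxy that correlates with who has free time.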

Tags: #gender-bias #gender-bias-in-business #gender-bias-in-ai

