This project has been funded with support from the European Commission. The author is solely responsible for this publication (communication) and the Commission accepts no responsibility for any use which may be made of the information contained therein. In compliance with the GDPR framework, please note that the Partnership will only process your personal data in the sole interest and purpose of the project and without any prejudice to your rights.

Case Study 2 – Biased Amazon Recruitment Algorithm – A Lesson on AI Bias

In 2014, a team of machine learning specialists at Amazon began working on an innovative AI-powered recruitment tool. The goal of the project was to automate the résumé screening process and identify top candidates for technical roles, including software engineering positions. The system rated applications on a scale from one to five stars, similar to how customers rate products on Amazon’s platform. By 2015, the team had discovered a serious issue: the algorithm was systematically biased against women applying for technical roles. The tool had been trained on résumés submitted to the company over the previous decade, most of which came from men, reflecting the male dominance in the tech industry.
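The mechanism behind this failure can be illustrated with a small, self-contained sketch. The code below is not Amazon's actual system; it uses hypothetical synthetic data in which a gender-correlated résumé feature (e.g. membership in a women's organization) was historically penalized in hiring decisions. A plain logistic regression trained on that record reproduces the bias: it learns a negative weight on the gendered feature, even though the feature says nothing about competence.

```python
import math
import random

# Hypothetical synthetic data: each "résumé" is (years_of_experience, has_gendered_term).
# The historical labels mimic a male-dominated hiring record in which résumés
# containing the gender-correlated term were hired far less often.
random.seed(0)
data = []
for _ in range(2000):
    exp = random.uniform(0, 10)
    gendered = 1.0 if random.random() < 0.3 else 0.0
    # Biased historical outcome: a strong penalty applied to the gendered term.
    score = exp + (-4.0 * gendered) + random.gauss(0, 1)
    hired = 1.0 if score > 5.0 else 0.0
    data.append(((exp, gendered), hired))

# Plain logistic regression trained by full-batch gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.05
n = len(data)
for _ in range(300):
    gw, gb = [0.0, 0.0], 0.0
    for (x, y) in data:
        z = w[0] * x[0] + w[1] * x[1] + b
        p = 1.0 / (1.0 + math.exp(-z))
        err = p - y
        gw[0] += err * x[0]
        gw[1] += err * x[1]
        gb += err
    w[0] -= lr * gw[0] / n
    w[1] -= lr * gw[1] / n
    b -= lr * gb / n

# The model has absorbed the historical bias: the learned weight on the
# gender-correlated feature is negative, so such résumés are rated lower.
print(f"weight on gendered term: {w[1]:.2f}")
```

The point of the sketch is that the model never sees gender directly; it simply learns whatever pattern minimizes error on the historical outcomes, and if those outcomes were biased, the bias is faithfully encoded in the learned weights.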
