Amazon scraps artificial intelligence recruiting tool for discriminating against women | Technology



In 2014, Amazon engineers began building an artificial intelligence system to review job applicants' résumés. The goal was to automate the search for the best talent. The tool, trained on the company's hiring records from the previous 10 years, learned that male candidates were preferable and began to discriminate against women, as Amazon sources told Reuters. After discovering the problem, the company decided to scrap the system.

The tool scored candidates from one to five stars, just as customers can rate any product on the company's website. "It was like the Holy Grail; everyone wanted it," one of the Amazon employees told Reuters. "They wanted a system into which you could feed 100 CVs and it would pick out the five best people to hire."

But in 2015 the company realized that the tool discriminated against women applying for software developer jobs and other technical positions, favoring male candidates instead. "This is what happens when you train a system with data that you think is unbiased but really isn't," explains Nerea Luis Mingueza, a PhD student in artificial intelligence and founder of T3chfest, a technology event at the Carlos III University of Madrid.

The Amazon system penalized applicants whose résumés included the word "women." For example, a candidate who noted she had been "captain of a women's chess club" was marked down. It also discriminated against graduates of two all-women's colleges, according to Reuters.

The artificial intelligence had been trained on the profiles of job applicants from the previous 10 years. The majority of applicants in that decade, and therefore of people hired, were men. "If you see that the system is always ranking men higher, you have to ask why. You shouldn't assume it's doing it right," says Luis.
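To see how this can happen, here is a minimal, purely illustrative sketch, not Amazon's system: a text classifier trained on synthetic "historical" hiring labels that skew male will assign a negative weight to tokens associated with women, simply because those tokens co-occur with past rejections.

```python
# Illustrative sketch only: how a résumé classifier trained on historically
# biased hiring labels can learn to penalize gendered terms.
# All data below is synthetic.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy "historical" résumés: hires (label 1) are mostly male profiles,
# so tokens correlated with women end up correlated with rejection.
resumes = [
    "software engineer, men's rugby team captain",        # hired
    "backend developer, chess club member",               # hired
    "machine learning engineer, hackathon winner",        # hired
    "software engineer, captain of women's chess club",   # rejected
    "data engineer, women's coding society organizer",    # rejected
    "frontend developer, graduate of a women's college",  # rejected
]
hired = [1, 1, 1, 0, 0, 0]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The weight learned for the token "women" comes out negative: the model has
# absorbed the historical bias, not anything about candidate quality.
idx = vectorizer.vocabulary_["women"]
print("weight for token 'women':", model.coef_[0][idx])
```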

When the company realized the error, it edited the programs so that they stopped discriminating against women. But, according to sources consulted by Reuters, that was no guarantee that the system would not fail again or rank candidates unfairly. Executives lost faith in the tool, and the Seattle company eventually abandoned the project.

An Amazon worker during a recent labor protest.

Amazon recruiters took the system's recommendations into account when hiring new workers but never relied solely on its rankings, according to the same sources. Amazon, however, has stated that the tool "was never used by the company's recruiters to evaluate candidates." The company has declined to say whether human resources managers factored the system's recommendations into their decisions in any way.

Gender gap

Major US technology companies have yet to close the gender gap in hiring, which is usually more pronounced among technical staff. Cristina Aranda, head of presales and innovation at Intelygenz and cofounder of MujeresTech, notes that in this case "nobody stopped to think that men express themselves differently from women." As she explains, many women suffer from impostor syndrome: "It was described by a psychologist and consists of not believing yourself capable of performing certain tasks and attributing your achievements to external factors rather than to yourself."

Automation in its warehouses and in e-commerce has been key to strengthening Amazon's dominance of the market. More and more companies are trying to automate parts of the hiring process, among them large firms such as Hilton Worldwide Holdings Inc and Goldman Sachs Group Inc. About 55% of human resources managers in the United States believe artificial intelligence will be a regular part of their work within the next five years, according to a 2017 survey by the talent software firm CareerBuilder.

The experts consulted agree that artificial intelligence can be useful to recruiters as a first filter. "It works very well with data and performs a task that takes us a great deal of effort," says Luis. These systems can also avoid the biases of a human, according to Ana de Prado, engineer and head of the artificial intelligence team at Terminus 7. To avoid this kind of error, De Prado believes the algorithm should not have taken the candidate's sex into account as a variable: "There are already recruitment systems in which applicants do not need to include photographs, their age or even their name."
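A minimal sketch of the approach De Prado describes might look like the following: stripping explicitly identifying or sensitive fields from applicant records before they reach a screening model. The field names and records here are illustrative assumptions, not any real system's schema.

```python
# Illustrative sketch: remove sensitive applicant fields before screening.
import pandas as pd

# Synthetic applicant records with hypothetical field names.
applicants = pd.DataFrame([
    {"name": "A. Perez", "age": 29, "sex": "F", "photo_url": "(omitted)",
     "years_experience": 5, "cv_text": "backend developer, chess club"},
    {"name": "B. Garcia", "age": 34, "sex": "M", "photo_url": "(omitted)",
     "years_experience": 7, "cv_text": "data engineer, hackathon winner"},
])

SENSITIVE = ["name", "age", "sex", "photo_url"]
screening_features = applicants.drop(columns=SENSITIVE)
print(screening_features)

# Caveat: dropping these columns removes only the direct signals. As the
# Amazon case shows, free-text fields can still carry proxies (for example
# "women's chess club"), so biased training data can leak through anyway.
```

As the Amazon case itself illustrates, removing explicit attributes is only a partial safeguard, since proxies in the remaining data can still encode the same bias.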

Many specialists have warned for years that artificial intelligence not only fails to avoid the human error that stems from prejudice, but can actually worsen discrimination and reinforce stereotypes. There have also been cases in which systems discriminated against people based on race or even sexual orientation. "For example, computer vision systems have had problems detecting Black people because they were trained on white people," says Luis. Beyond an exhaustive analysis of cases such as Amazon's, she considers training in diversity and the inclusion of women in the teams that design these systems to be paramount.



