Published: Sat, October 13, 2018
Tech

Amazon drops secret AI recruiting tool that showed bias against women

Amazon's engineers tweaked the system to remedy these particular forms of bias, but couldn't be sure the AI wouldn't find new ways to unfairly discriminate against candidates.

Although machine learning is already transforming our professional lives, technology specialists, as well as civil liberties groups such as the ACLU, say more work needs to be done to avoid issues like Amazon's.

Online giant Amazon has abandoned an algorithm that was being tested as a recruitment tool because it was sexist.

Amazon's system taught itself to downgrade resumes with the word "women's" in them and to assign lower scores to graduates of two women-only colleges.

But by 2015 it was discovered that the system was not rating candidates for technical posts in a gender-neutral way. The system would crawl the web to recommend candidates. About 500 computer models were built, and each was taught to recognize some 50,000 terms that showed up on past candidates' resumes. However, women are still working to catch up in the tech industry, and most of the resumes submitted to Amazon over the past 10 years came from men. The secret AI tool, which is said to have been under development since 2014, was reportedly shut down by the start of the previous year. Why?
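
The mechanism behind this failure mode is easy to illustrate. The sketch below is not Amazon's system; it is a minimal toy model, with invented data, of how a term-scoring approach trained on historically skewed hiring outcomes can learn to penalize a word like "women's" even though the word says nothing about job performance:

```python
from collections import Counter
import math

# Hypothetical toy history: each past resume is (set of terms, was_hired).
# Because the historical applicant pool skews male, the term "women's"
# appears mostly on resumes that were not advanced -- an artifact of the
# skew, not of candidate quality.
history = [
    ({"python", "chess"}, True),
    ({"java", "captain"}, True),
    ({"python", "executed"}, True),
    ({"python", "women's"}, False),
    ({"java", "women's"}, False),
    ({"python", "chess"}, True),
]

def term_scores(history, smoothing=1.0):
    """Smoothed log-odds of hiring for each term seen in past resumes."""
    hired, rejected = Counter(), Counter()
    n_hired = sum(1 for _, h in history if h)
    n_rejected = len(history) - n_hired
    for terms, was_hired in history:
        (hired if was_hired else rejected).update(terms)
    scores = {}
    for term in set(hired) | set(rejected):
        p_hired = (hired[term] + smoothing) / (n_hired + 2 * smoothing)
        p_rejected = (rejected[term] + smoothing) / (n_rejected + 2 * smoothing)
        scores[term] = math.log(p_hired / p_rejected)
    return scores

scores = term_scores(history)
# The skewed history gives "women's" a negative weight, so any new
# resume containing it is scored down.
print(scores["women's"] < 0)  # True
print(scores["python"] > 0)   # True
```

Patching out one such term does not fix the underlying problem: the model can rediscover the same signal through correlated proxies (colleges, clubs, verbs), which is why Amazon's engineers could not guarantee the system wouldn't find new ways to discriminate.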

Some 55% of U.S. human resources managers said artificial intelligence would be a regular part of their work within the next five years, according to a 2017 survey by talent software firm CareerBuilder.

According to Reuters, the company's human resources department used the system to generate recommendations but never relied entirely on those recommendations when filtering candidates. There was just one problem: It overwhelmingly spit back men.

Still, John Jersin, vice president of LinkedIn Talent Solutions, said the service is not a replacement for traditional recruiters. The American Civil Liberties Union is presently challenging a law that allows criminal prosecution of researchers and journalists who test hiring websites' algorithms for discrimination. Amazon now uses a "much watered-down version" of the tool for administrative chores.

Another person said that a new team in Edinburgh has been formed to give automated employment screening another try, this time with a focus on diversity.
