{"id":3208,"date":"2018-10-13T02:18:53","date_gmt":"2018-10-13T02:18:53","guid":{"rendered":"http:\/\/www.styledeals.co.uk\/blog\/amazon-scrapped-sexist-ai-tool\/"},"modified":"2018-10-13T02:18:53","modified_gmt":"2018-10-13T02:18:53","slug":"amazon-scrapped-sexist-ai-tool","status":"publish","type":"post","link":"https:\/\/www.styledeals.co.uk\/blog\/amazon-scrapped-sexist-ai-tool\/","title":{"rendered":"Amazon scrapped &#8216;sexist AI&#8217; tool"},"content":{"rendered":"\n<div property=\"articleBody\">\n<figure class=\"media-landscape has-caption full-width lead\"><span class=\"image-and-copyright-container\"><\/p>\n<p>                <img loading=\"lazy\" decoding=\"async\" class=\"js-image-replace\" alt=\"Women and men in office on laptops\" src=\"https:\/\/ichef.bbci.co.uk\/news\/320\/cpsprodpb\/6E59\/production\/_103794282_recruitment2.gif\" width=\"976\" height=\"549\"\/><span class=\"off-screen\">Image copyright<\/span><br \/>\n                 <span class=\"story-image-copyright\">Getty Images<\/span><\/p>\n<p>            <\/span><figcaption class=\"media-caption\"><span class=\"off-screen\">Image caption<\/span><br \/>\n                <span class=\"media-caption__text\"><br \/>\n                    The algorithm repeated bias towards men, reflected in the technology industry<br \/>\n                <\/span><br \/>\n            <\/figcaption><\/figure>\n<p class=\"story-body__introduction\">An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, according to a Reuters report.<\/p>\n<p>The artificial intelligence system was trained on data submitted by applicants over a 10-year period, much of which came from men, it claimed.<\/p>\n<p>Reuters was told by members of the team working on it that the system effectively taught itself that male candidates were preferable.<\/p>\n<p>Amazon has not responded to the claims.<\/p>\n<p>Reuters spoke to five members of the team who developed the machine 
learning tool in 2014, none of whom wanted to be publicly named. <\/p>\n<p>They told Reuters that the system was intended to review job applications and give candidates a score ranging from one to five stars.<\/p>\n<p>&#8220;They literally wanted it to be an engine where I&#8217;m going to give you 100 resumes, it will spit out the top five, and we&#8217;ll hire those,&#8221; said one of the engineers who spoke to Reuters.<\/p>\n<h2 class=\"story-body__crosshead\">&#8216;Women&#8217; penalised<\/h2>\n<p>By 2015, it was clear that the system was not rating candidates in a gender-neutral way because it was built on data accumulated from CVs submitted to the firm, mostly by men, Reuters claimed.<\/p>\n<p>The system started to penalise CVs which included the word &#8220;women&#8221;. The program was edited to make it neutral to the term, but it became clear that the system could not be relied upon, Reuters was told.<\/p>\n<p>The project was abandoned, although Reuters said that it was used for a period by recruiters, who looked at the recommendations generated by the tool but never relied solely on it.<\/p>\n<p>According to Amazon, its current global workforce is split 60:40 in favour of males.<\/p>\n<p>About 55% of US human resources managers said that AI would play a role in recruitment within the next five years, according to a survey by software firm CareerBuilder.<\/p>\n<p>It is not the first time doubts have been raised about how reliable algorithms trained on potentially biased data will be.<\/p>\n<figure class=\"media-landscape has-caption full-width\"><span class=\"image-and-copyright-container\"><\/p>\n<p>                 <span class=\"off-screen\">Image copyright<\/span><br \/>\n                 <span class=\"story-image-copyright\">MIT<\/span><\/p>\n<p>            <\/span><figcaption class=\"media-caption\"><span class=\"off-screen\">Image caption<\/span><br \/>\n                <span class=\"media-caption__text\"><br \/>\n                    An MIT AI 
system, dubbed Norman, had a dark view of the world as a result of the data it was trained on<br \/>\n                <\/span><br \/>\n            <\/figcaption><\/figure>\n<p><a href=\"https:\/\/www.bbc.co.uk\/news\/technology-44040008\" class=\"story-body__link\">An experiment at the Massachusetts Institute of Technology, which trained an AI on images and videos of murder and death<\/a>, found that it interpreted neutral inkblots in a negative way.<\/p>\n<p>And in May last year, a report claimed that an AI-powered computer program used by a US court was biased against black people, flagging them as twice as likely as white people to reoffend.<\/p>\n<p>Predictive policing algorithms were found to be similarly biased, because the crime data they were trained on showed more arrests or police stops for black people.<\/p>\n<\/p><\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/www.bbc.co.uk\/news\/technology-45809919\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Image copyright Getty Images Image caption The algorithm repeated bias towards men, reflected in the technology industry An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, according to a Reuters report. 
The artificial intelligence system was trained on data submitted by applicants over &hellip; <\/p>\n","protected":false},"author":0,"featured_media":3209,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-3208","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-general"],"_links":{"self":[{"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/posts\/3208","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/comments?post=3208"}],"version-history":[{"count":0,"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/posts\/3208\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/media\/3209"}],"wp:attachment":[{"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/media?parent=3208"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/categories?post=3208"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.styledeals.co.uk\/blog\/wp-json\/wp\/v2\/tags?post=3208"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}