{"id":12927,"date":"2018-07-27T10:45:10","date_gmt":"2018-07-27T18:45:10","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2018\/07\/27\/news-6694\/"},"modified":"2018-07-27T10:45:10","modified_gmt":"2018-07-27T18:45:10","slug":"news-6694","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2018\/07\/27\/news-6694\/","title":{"rendered":"Amazon&#8217;s Facial Recognition System Mistakes Members of Congress for Mugshots"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5b5a34fb88199e0b65255461\/master\/pass\/Amazon%20facial%20recognition%20fails%20an%20easy%20test.jpg\"\/><\/p>\n<p><strong>Credit to Author: Brian Barrett| Date: Thu, 26 Jul 2018 20:59:38 +0000<\/strong><\/p>\n<p><span class=\"lede\">Amazon touts its <\/span>Rekognition facial recognition system as \u201c<a href=\"https:\/\/aws.amazon.com\/rekognition\/\" target=\"_blank\">simple and easy to use<\/a>,\u201d encouraging customers to \u201cdetect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases.\u201d And yet, in a <a href=\"https:\/\/www.aclu.org\/blog\/privacy-technology\/surveillance-technologies\/amazons-face-recognition-falsely-matched-28\" target=\"_blank\">study<\/a> released Thursday by the American Civil Liberties Union, the technology managed to confuse photos of 28 members of Congress with publicly available mug shots. Given that Amazon actively <a href=\"https:\/\/www.aclu.org\/blog\/privacy-technology\/surveillance-technologies\/amazon-teams-government-deploy-dangerous-new\" target=\"_blank\">markets<\/a> Rekognition to law enforcement agencies across the US, that\u2019s simply not good enough.<\/p>\n<p>The ACLU study also illustrated the <a href=\"https:\/\/www.wired.com\/story\/photo-algorithms-id-white-men-fineblack-women-not-so-much\/\">racial bias<\/a> that plagues facial recognition today. &quot;Nearly 40 percent of Rekognition\u2019s false matches in our test were of people of color, even though they make up only 20 percent of Congress,&quot; <a href=\"https:\/\/www.aclu.org\/blog\/privacy-technology\/surveillance-technologies\/amazons-face-recognition-falsely-matched-28\" target=\"_blank\">wrote<\/a> ACLU attorney Jacob Snow. \u201cPeople of color are already disproportionately harmed by police practices, and it\u2019s easy to see how Rekognition could exacerbate that.&quot;<\/p>\n<p class=\"paywall\">Facial recognition technology\u2019s difficulty detecting darker skin tones is a well-established problem. In February, MIT Media Lab\u2019s Joy Buolamwini and Microsoft\u2019s Timnit Gebru published <a href=\"http:\/\/gendershades.org\/\" target=\"_blank\">findings<\/a> that facial recognition software from IBM, Microsoft, and Face++ have a much harder time identifying gender in people of color than in white people. In a June evaluation of Amazon Rekognition, Buolamwini and Inioluwa Raji of the Algorithmic Justice League found similar built-in bias. 
Rekognition managed to even <a href=\"https:\/\/youtu.be\/QxuyfWoVV98\" target=\"_blank\">get Oprah wrong<\/a>.<\/p>\n<p class=\"paywall\">\u201cGiven what we know about the biased history and present of policing, the concerning performance metrics of facial analysis technology in real-world pilots, and Rekognition\u2019s gender and skin-type accuracy differences,\u201d Buolamwini wrote in a <a href=\"https:\/\/uploads.strikinglycdn.com\/files\/e286dfe0-763b-4433-9a4b-7ae610e2dba1\/RekognitionGenderandSkinTypeDisparities-June25-Mr.%20Bezos.pdf\" target=\"_blank\">recent letter<\/a> to Amazon CEO Jeff Bezos, \u201cI join the chorus of dissent in calling Amazon to stop equipping law enforcement with facial analysis technology.\u201d<\/p>\n<p>&#x27;We wouldn\u2019t find this acceptable in any other setting. Why should we find it acceptable here?&#x27;<\/p>\n<p name=\"inset-left\" class=\"inset-left-component__el\">Alvaro Bedoya, Center on Privacy and Technology<\/p>\n<p class=\"paywall\">Yet Amazon Rekognition is already in active use in Oregon\u2019s Washington County. And the Orlando, Florida police department recently resumed a pilot program to test Rekognition\u2019s efficacy, although the city says that for now, \u201cno images of the public will be used for any testing\u2014only images of Orlando police officers who have volunteered to participate in the test pilot will be used.\u201d Those are just the clients that are public; Amazon declined to comment on the full scope of law enforcement\u2019s use of Rekognition.<\/p>\n<p class=\"paywall\">For privacy advocates, though, any amount is too much, especially given the system\u2019s demonstrated bias. \u201cImagine a speed camera that wrongly said that black drivers were speeding at higher rates than white drivers. Then imagine that law enforcement knows about this, and everyone else knows about this, and they just keep using it,\u201d says Alvaro Bedoya, executive director of Georgetown University\u2019s Center on Privacy and Technology. \u201cWe wouldn\u2019t find this acceptable in any other setting. Why should we find it acceptable here?\u201d<\/p>\n<p class=\"paywall\">Amazon takes issue with the parameters of the study, noting that the ACLU used an 80 percent confidence threshold; that\u2019s the likelihood that Rekognition found a match, which you can adjust according to your desired level of accuracy. \u201cWhile 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn\u2019t be appropriate for identifying individuals with a reasonable level of certainty,\u201d the company said in a statement. \u201cWhen using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95 percent or higher.\u201d<\/p>\n<p class=\"paywall\">While Amazon says it works closely with its partners, it\u2019s unclear what form that guidance takes, or whether law enforcement follows it. Ultimately, the onus is on the customers\u2014including law enforcement\u2014to make the adjustment. An Orlando Police Department spokesperson did not know how the city had calibrated Rekognition for its pilot program.<\/p>\n<p class=\"paywall\">The ACLU counters that 80 percent is Rekognition\u2019s default setting. 
And UC Berkeley computer scientist Joshua Kroll, who independently verified the ACLU\u2019s findings, notes that if anything, the professionally photographed, face-forward congressional portraits used in the study are a softball compared to what Rekognition would encounter in the real world.<\/p>\n<p class=\"paywall\">\u201cAs far as I can tell, this is the easiest possible case for this technology to work,\u201d Kroll says. \u201cWhile we haven\u2019t tested it, I would naturally anticipate that it would perform worse in the field environment, where you\u2019re not seeing people\u2019s faces straight on, you might not have perfect lighting, you might have some occlusion, maybe people are wearing things or carrying things that get in the way of their faces.\u201d<\/p>\n<p class=\"paywall\">Amazon also downplays the potential implications of facial recognition errors. \u201cIn real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgement,\u201d the company\u2019s statement reads. But that elides the very real consequences that could be felt by those who are wrongly identified.<\/p>\n<p class=\"paywall\">\u201cAt a minimum, those people are going to be investigated. Point me to a person that likes to be investigated by law enforcement,\u201d Bedoya says. \u201cThis idea that there\u2019s no cost to misidentifications just defies logic.\u201d<\/p>\n<p>&#x27;What we\u2019re trying to avoid here is mass surveillance.&#x27;<\/p>\n<p name=\"inset-left\" class=\"inset-left-component__el\">Jeramie Scott, EPIC<\/p>\n<p class=\"paywall\">So, too, does the notion that a human backstop provides an adequate check on the system. \u201cOften with technology, people start to rely on it too much, as if it\u2019s infallible,\u201d says Jeramie Scott, director of the Electronic Privacy Information Center\u2019s Domestic Surveillance Project. In 2009, for instance, San Francisco police <a href=\"https:\/\/arstechnica.com\/tech-policy\/2014\/05\/after-being-held-at-gunpoint-due-to-lpr-error-woman-gets-day-in-court\/\" target=\"_blank\">handcuffed a woman<\/a> and held her at gunpoint after a license-plate reader misidentified her car. All they had to do to avoid the confrontation was to look at the plate themselves, or notice that the make, model, and color didn\u2019t match. Instead, they trusted the machine.<\/p>\n<p class=\"paywall\">Even if facial recognition technology worked perfectly, putting it in the hands of law enforcement would still raise concerns. \u201cFacial recognition destroys the ability to remain anonymous. It increases the ability of law enforcement to surveil individuals not suspected of crimes. It can chill First Amendment-protected rights and activities,\u201d Scott says. \u201cWhat we\u2019re trying to avoid here is mass surveillance.\u201d<\/p>\n<p class=\"paywall\">While the ACLU study covers well-trod ground in terms of facial recognition\u2019s faults, it may have a better chance at making real impact. \u201cThe most powerful aspect of this is that it makes it personal for members of Congress,\u201d says Bedoya. 
Members of the Congressional Black Caucus had previously written a <a href=\"https:\/\/cbc.house.gov\/news\/documentsingle.aspx?DocumentID=896\" target=\"_blank\">letter<\/a> to Amazon expressing related concerns, but the ACLU appears to have <a href=\"https:\/\/www.buzzfeednews.com\/article\/daveyalba\/amazon-rekognition-facial-recognition-congress-false\" target=\"_blank\">gotten the attention<\/a> of several additional lawmakers.<\/p>\n<p class=\"paywall\">The trick, though, will be turning that concern into action. Privacy advocates say that at a minimum, law enforcement\u2019s use of facial recognition technology should be heavily restricted until its racial bias has been corrected and its accuracy assured. And even then, they argue, its scope needs to be limited, and clearly defined. Until that happens, it\u2019s time not to pump the brakes but to slam down on them with both feet.<\/p>\n<p class=\"paywall\">\u201cA technology that\u2019s proven to vary significantly across people based on the color of their skin is unacceptable in 21st-century policing,\u201d says Bedoya.<\/p>\n<p class=\"paywall\"><em>This story has been updated to reflect that Timnit Gebru was not involved in Joy Buolamwini&#x27;s Amazon Rekognition research.<\/em><\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/amazon-facial-recognition-congress-bias-law-enforcement\" target=\"bwo\" >https:\/\/www.wired.com\/category\/security\/feed\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5b5a34fb88199e0b65255461\/master\/pass\/Amazon%20facial%20recognition%20fails%20an%20easy%20test.jpg\"\/><\/p>\n<p><strong>Credit to Author: Brian Barrett| Date: Thu, 26 Jul 2018 20:59:38 +0000<\/strong><\/p>\n<p>Amazon has marketed its Rekognition facial recognition system to law enforcement. 
But in a new ACLU study, the technology confused 28 members of Congress with publicly available arrest photos.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[714],"class_list":["post-12927","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-security"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/12927","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=12927"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/12927\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=12927"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=12927"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=12927"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}