{"id":10409,"date":"2017-11-10T04:45:03","date_gmt":"2017-11-10T12:45:03","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2017\/11\/10\/news-4182\/"},"modified":"2017-11-10T04:45:03","modified_gmt":"2017-11-10T12:45:03","slug":"news-4182","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2017\/11\/10\/news-4182\/","title":{"rendered":"Actually, Facebook Will Not Blur the Nudes Sent to Its Anti-Revenge Porn Program"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/video-images.vice.com\/articles\/5a04f27a1095bf0bd2083f4c\/lede\/1510273746211-Untitled-design.jpeg\"\/><\/p>\n<p><strong>Credit to Author: Louise Matsakis | Date: Fri, 10 Nov 2017 12:00:00 +0000<\/strong><\/p>\n<p> Earlier this week, I reported on a new pilot program <a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/7x478b\/facebook-revenge-porn-nudes\" target=\"_blank\">designed<\/a> to combat revenge porn that Facebook is testing in Australia. The program involves sending nude photographs to the social network ahead of time, to prevent their potential spread in the future. <\/p>\n<p> Security researchers and journalists\u2014including me\u2014had additional questions about exactly how it works, so I reached out to Facebook to ask. I then wrote a follow-up <a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/d3d5gx\/real-humans-will-review-the-nudes-you-send-facebook-as-part-of-its-anti-revenge-porn-program\" target=\"_blank\">piece<\/a> with more details the company provided, the most important of which was that actual humans will manually review the explicit photos sent to the social network.<\/p>\n<p> In my conversations with Facebook, however, I discovered that the initial way the anti-revenge porn feature was described to me was incorrect. The Facebook spokesperson I talked to misrepresented a key detail of the program, and then later confirmed to me that the feature did not work the way they said it would. 
Facebook said that the nude images would be blurred when they were reviewed by humans, but that is not the case. <\/p>\n<p> This is how the anti-revenge porn pilot program is going to work in Australia, according to a <a href=\"https:\/\/newsroom.fb.com\/news\/h\/non-consensual-intimate-image-pilot-the-facts\/\" target=\"_blank\">blog post<\/a> Facebook published Thursday: First, users file a report with the country\u2019s eSafety Commissioner\u2019s office, saying that they want to preemptively report explicit photos to prevent their future spread as revenge porn. The user is then asked to send the photos to themselves via Facebook Messenger. Facebook is made aware of the report, and then an actual human, a member of Facebook\u2019s Community Operations team, looks at the nude photograph to confirm that it violates Facebook\u2019s standards.<\/p>\n<p> This is where a Facebook spokesperson initially described the process incorrectly. When I first reached out to ask how the program would work, I was told that the images would be blurred. This was repeated to me twice. <\/p>\n<p> Facebook\u2019s blog post didn\u2019t mention blurring at all, and what I was told contradicted <a href=\"https:\/\/www.thedailybeast.com\/facebook-workers-not-an-algorithm-will-look-at-volunteered-nude-photos-first-to-stop-revenge-porn\" target=\"_blank\">other reporting<\/a>, so I reached out to Facebook again in the hopes of clearing up the confusion. Finally, I was told that there would actually be no blurring. This means that Australians who want to use the new program have to voluntarily decide they are comfortable with Facebook\u2019s Community Operations team seeing them naked. 
\u201cTo prevent adversarial reporting, at this time we need to have humans review the images in a controlled, secure environment,\u201d Facebook\u2019s Chief Security Officer Alex Stamos said in a <a href=\"https:\/\/twitter.com\/alexstamos\/status\/928742797393395719\" target=\"_blank\">tweet<\/a>.<\/p>\n<p> Facebook includes human reviewers in the process to prevent legitimate images from being inadvertently tagged as revenge porn. As recent studies have shown, image-recognition algorithms are still <a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/pa339b\/how-to-fool-artificial-intelligence-one-pixel\" target=\"_blank\">extremely easy<\/a> to spoof. <\/p>\n<p> After a human reviews the images, Facebook builds what is referred to as a \u201chash,\u201d or unique digital fingerprint, of each one. Facebook does not retain the images themselves, just the hashes. Once it has created a hash, it notifies the person who uploaded the original image, and they are asked to delete it from Messenger. Then Facebook deletes the image from its servers, retaining only the hash.<\/p>\n<p> Each time a user subsequently uploads an image, it\u2019s tested against Facebook\u2019s database of hashes. If it matches a hash labeled revenge porn, Facebook stops the user from posting it. As Stamos pointed out in his tweets, this is an imperfect solution to an incredibly difficult problem. <\/p>\n<p> It\u2019s unclear why this happened, but I think it\u2019s important to highlight the confusion. 
Major tech corporations like Facebook are incredibly careful about how they present themselves to the public, and this is an instance in which Facebook initially got the communication about its own program wrong.<\/p>\n<p><a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/3kvje3\/actually-facebook-will-not-blur-the-nudes-sent-to-its-anti-revenge-porn-program\" target=\"bwo\">https:\/\/motherboard.vice.com\/en_us\/rss<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/video-images.vice.com\/articles\/5a04f27a1095bf0bd2083f4c\/lede\/1510273746211-Untitled-design.jpeg\"\/><\/p>\n<p><strong>Credit to Author: Louise Matsakis | Date: Fri, 10 Nov 2017 12:00:00 +0000<\/strong><\/p>\n<p>The company originally told me it was obscuring the images, but then said it wasn\u2019t.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10643,13328,10378],"tags":[3589,15016,5897,16353,1932],"class_list":["post-10409","post","type-post","status-publish","format-standard","hentry","category-independent","category-motherboard","category-security","tag-facebook","tag-nudes","tag-privacy","tag-revenge-porn","tag-social-media"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/10409","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=10409"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/10409\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=10409"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=10409"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=10409"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}