{"id":9552,"date":"2017-09-27T07:45:28","date_gmt":"2017-09-27T15:45:28","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2017\/09\/27\/news-3325\/"},"modified":"2017-09-27T07:45:28","modified_gmt":"2017-09-27T15:45:28","slug":"news-3325","status":"publish","type":"post","link":"https:\/\/www.palada.net\/index.php\/2017\/09\/27\/news-3325\/","title":{"rendered":"The Algorithms Aren\u2019t Working for the Rest of Us"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/video-images.vice.com\/articles\/59cbb056a2e98b686150707a\/lede\/1506521176198-shutterstock_596106743.jpeg\"\/><\/p>\n<p><strong>Credit to Author: Louise Matsakis| Date: Wed, 27 Sep 2017 14:51:04 +0000<\/strong><\/p>\n<p> I used to enjoy Instagram. Photos in the feed were displayed in reverse-chronological order. The newest posts were at the top, and the oldest posts at the bottom. It was easy to scroll though. Then, last year, Instagram <a href=\"https:\/\/www.theguardian.com\/technology\/2016\/jun\/07\/new-algorithm-driven-instagram-feed-rolled-out-to-the-dismay-of-users\" target=\"_blank\">changed<\/a> the algorithm. Now, pictures appear in an order I don&#8217;t understand. A snapshot from a birthday party three days ago is displayed next to a selfie only hours old. Time on Instagram now feels distorted.<\/p>\n<p> The algorithm stopped working for me. The change provided a benign reminder that the engineers who run Instagram&#8217;s algorithm (and other social networks like it) ultimately decide what information I get to see, and in what order. <\/p>\n<p> Now, other algorithms aren&#8217;t working, and this time it&#8217;s worse. Oversights in the systems that help run Facebook, Instagram, Google, and Twitter uncovered by journalists over the last month prove again that code carries human bias and mistakes. Algorithms aren&#8217;t perfect mathematical equations\u2014they&#8217;re designed by people, and like humans, have flaws. 
<\/p>\n<p> It&#8217;s likely the algorithms haven&#8217;t been working for some time. But now, we&#8217;re taking notice:<\/p>\n<ul>\n<li>Facebook&#8217;s advertising algorithm <a href=\"http:\/\/www.slate.com\/blogs\/future_tense\/2017\/09\/25\/facebook_blocked_an_ad_for_a_portland_march_against_white_supremacy.html\" target=\"_blank\">blocked<\/a> an ad for a march <i> against<\/i> white supremacy in Portland, Oregon. The ad was up for two hours last month, then got removed. The company won&#8217;t say why.<\/li>\n<li>The company&#8217;s advertising algorithm also <a href=\"https:\/\/www.propublica.org\/article\/facebook-enabled-advertisers-to-reach-jew-haters\" target=\"_blank\">let marketers<\/a> target users who expressed interest in topics like &#8220;Jew hater.&#8221; Facebook automatically generated the ad category based on their online activity. In an ironic twist, Facebook then <a href=\"https:\/\/twitter.com\/JuliaAngwin\/status\/908806048525451264?ref_src=twsrc%5Etfw&#038;ref_url=https%3A%2F%2Fwww.theguardian.com%2Ftechnology%2F2017%2Fsep%2F21%2Finstagram-death-threat-facebook-olivia-solon\" target=\"_blank\">sent<\/a> an automated email to a ProPublica reporter asking if she wanted to buy an ad to promote her story that exposed Facebook&#8217;s anti-semitic advertising categories.<\/li>\n<li>Facebook <a href=\"http:\/\/www.slate.com\/blogs\/future_tense\/2017\/09\/14\/facebook_let_advertisers_target_jew_haters_it_doesn_t_end_there.html\" target=\"_blank\">also let<\/a> marketers aim their ads at users who expressed interest in other hateful ideas, like &#8220;killing bitches&#8221; or &#8220;threesome rape.&#8221;<\/li>\n<li>Most startlingly, a troll farm with ties to the Russian government <a href=\"https:\/\/www.washingtonpost.com\/politics\/facebook-says-it-sold-political-ads-to-russian-company-during-2016-election\/2017\/09\/06\/32f01fd2-931e-11e7-89fa-bb822a46da5b_story.html?utm_term=.925bfff1fb63\" target=\"_blank\">successfully bought<\/a> 
$100,000 worth of Facebook ads targeted at US voters from 2015 to 2017. They were <a href=\"https:\/\/www.washingtonpost.com\/business\/technology\/russian-operatives-used-facebook-ads-to-exploit-divisions-over-black-political-activism-and-muslims\/2017\/09\/25\/4a011242-a21b-11e7-ade1-76d061d56efa_story.html?tid=sm_tw&#038;utm_term=.1638c73e75bb\" target=\"_blank\">designed<\/a> to escalate social tensions and sow political unrest.<\/li>\n<li>Twitter&#8217;s advertising platform let users <a href=\"http:\/\/www.thedailybeast.com\/twitter-lets-you-target-millions-of-users-who-may-like-the-n-word\" target=\"_blank\">target people<\/a> who used racist phrases like the n-word.<\/li>\n<li>Google let marketers <a href=\"https:\/\/www.buzzfeed.com\/alexkantrowitz\/google-allowed-advertisers-to-target-jewish-parasite-black?utm_term=.gs5daJ4104#.ejNX3pvYav\" target=\"_blank\">reach people<\/a> using key phrases such as &#8220;black people ruin neighborhoods.&#8221;<\/li>\n<li>Google&#8217;s search tool unmasked the names of young criminal offenders and their victims that are sealed by law in Canada. A search for the name of an offender or victim <a href=\"http:\/\/www.ottawasun.com\/2017\/09\/21\/google-is-linking-secret-court-protected-names---including-victim-ids---to-online-coverage\" target=\"_blank\">returned<\/a> media coverage of their court cases, even though their names do not appear in the articles themselves.<\/li>\n<li>Instagram&#8217;s algorithm <a href=\"https:\/\/www.theguardian.com\/technology\/2017\/sep\/21\/instagram-death-threat-facebook-olivia-solon\" target=\"_blank\">used<\/a> an image with the text &#8220;I will rape you before I kill you, you filthy whore!&#8221; to advertise its platform on Facebook. The post, made nearly a year ago by the Guardian tech writer <a href=\"https:\/\/twitter.com\/oliviasolon\" target=\"_blank\">Olivia Solon<\/a>, was one of her most &#8220;engaged,&#8221; which might be why it was selected. 
It was a screenshot of a threatening email she received.<\/li>\n<\/ul>\n<p>These aren&#8217;t going to be the last news stories that chronicle the algorithms&#8217; blind spots. The only reason a company like Facebook is able to accommodate over <a href=\"https:\/\/www.facebook.com\/zuck\/posts\/10103831654565331?pnref=story\" target=\"_blank\">two billion<\/a> users and generate over $9 billion in <a href=\"http:\/\/www.adweek.com\/digital\/facebook-raked-in-9-16-billion-in-ad-revenue-in-the-second-quarter-of-2017\/\" target=\"_blank\">ad revenue<\/a> last quarter is because it automates its services to some degree. That automation is essential to its business, and to that of every major internet platform.<\/p>\n<p> But as the avalanche of examples uncovered over the last month shows, the algorithms just aren&#8217;t designed with every bias or potential pitfall in mind. Facebook&#8217;s automated programs are intended to generate eyeballs for advertisements. They aren&#8217;t built to positively shape our society, or ensure we learn more about the world around us.<\/p>\n<p>Clearly, Facebook is an incredibly successful company. Mark Zuckerberg is the <a href=\"https:\/\/www.forbes.com\/sites\/noahkirsch\/2017\/07\/27\/mark-zuckerbergs-net-worth-rises-4-billion-in-a-day-to-become-worlds-5th-richest\/#60976d0b4e97\" target=\"_blank\">fifth richest<\/a> person in the world. To be fair, it seems like the algorithms are working for him just fine. 
They&#8217;re just not working for the rest of us.<\/p>\n<p><a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/3kap95\/the-algorithms-arent-working-for-the-rest-of-us\" target=\"bwo\" >https:\/\/motherboard.vice.com\/en_us\/rss<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/video-images.vice.com\/articles\/59cbb056a2e98b686150707a\/lede\/1506521176198-shutterstock_596106743.jpeg\"\/><\/p>\n<p><strong>Credit to Author: Louise Matsakis| Date: Wed, 27 Sep 2017 14:51:04 +0000<\/strong><\/p>\n<p>Code carries human bias and mistakes.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10643,13328,10378],"tags":[15170,12329,15169,3589,2143,1956,454],"class_list":["post-9552","post","type-post","status-publish","format-standard","hentry","category-independent","category-motherboard","category-security","tag-advertisments","tag-algorithm","tag-bias","tag-facebook","tag-instagram","tag-racism","tag-twitter"],"_links":{"self":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/9552","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=9552"}],"version-history":[{"count":0,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/9552\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=9552"}],"wp:term":[{"taxonomy":"category","embeddable":true,"hre
f":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=9552"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=9552"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}