{"id":8730,"date":"2017-08-14T14:46:21","date_gmt":"2017-08-14T22:46:21","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2017\/08\/14\/news-2503\/"},"modified":"2017-08-14T14:46:21","modified_gmt":"2017-08-14T22:46:21","slug":"news-2503","status":"publish","type":"post","link":"https:\/\/www.palada.net\/index.php\/2017\/08\/14\/news-2503\/","title":{"rendered":"Tech Companies Have the Tools to Confront White Supremacy"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5991f76ea723bd58acbfd2aa\/master\/pass\/AltRightHP-831134374.jpg\"\/><\/p>\n<p><strong>Credit to Author: Issie Lapowsky| Date: Mon, 14 Aug 2017 22:24:55 +0000<\/strong><\/p>\n<p data-reactid=\"246\"><span class=\"lede\" data-reactid=\"247\"><!-- react-text: 248 -->Say you&#x27;re a <!-- \/react-text --><\/span><!-- react-text: 249 -->white supremacist who happens to hate Jewish people\u2014or black people, Muslim people, Latino people, take your pick. Today, you can <!-- \/react-text --><a href=\"https:\/\/www.wired.com\/2017\/04\/meme-army-now-militia\/\" data-reactid=\"250\"><!-- react-text: 251 -->communicate those views online<!-- \/react-text --><\/a><!-- react-text: 252 --> any number of ways without setting off many tech companies&#x27; anti\u2013hate speech alarm bells. And that&#x27;s a problem.<!-- \/react-text --><\/p>\n<p data-reactid=\"253\"><!-- react-text: 254 -->As the tech industry walks the narrow path between free speech and hate speech, it allows people with extremist ideologies to promote their brands and their beliefs on their platforms, as long as they swap out the violent rhetoric for dog whistles and obfuscating language. All the while, social media platforms allow these groups to amass and recruit followers, all under the guise of peaceful protest. The deadly riots in Charlottesville, Virginia, last weekend reveal they&#x27;re anything but. 
Now it&#x27;s up to those same tech companies to adjust their approaches to online hate\u2014as companies like <a href=\"https:\/\/www.washingtonpost.com\/news\/morning-mix\/wp\/2017\/08\/14\/godaddy-bans-neo-nazi-site-daily-stormer-for-disparaging-woman-killed-at-charlottesville-rally\/\" target=\"_blank\">GoDaddy<\/a> and <a href=\"https:\/\/www.polygon.com\/2017\/8\/14\/16145858\/discord-alt-right-server-statement\" target=\"_blank\">Discord<\/a> did on Monday, by shutting down hate groups on their services\u2014or risk enabling more offline violence.<\/p>\n<p>For the most part, as long as you don\u2019t use an online service to directly threaten anyone or disparage groups of people based on their race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, or serious disability or disease\u2014the policies laid out by Facebook, Twitter, and YouTube\u2014you can get away with practically anything. You can wrap your hate in lofty language about \u201cthe heritage, identity, and future of people of European descent,\u201d as white nationalist Richard Spencer does through his <a href=\"https:\/\/www.wired.com\/2016\/10\/alt-right-grew-obscure-racist-cabal\/\">supposed think tank, the National Policy Institute<\/a>. 
On Twitter, meanwhile, sharing a gas-chamber <a href=\"https:\/\/twitter.com\/bakedalaska\/status\/880210290444427264?lang=en\" target=\"_blank\">meme<\/a> garners just a one-week suspension.<\/p>\n<p>\u201cSocial media has allowed [hate groups] to spread and share their messages in ways that was never before possible,\u201d says Jonathan Greenblatt, CEO of the Anti-Defamation League, which has tracked anti-Semitism and hate for more than a century. \u201cThey\u2019ve moved from the margins into the mainstream.\u201d<\/p>\n<p>This weekend\u2019s white supremacist march in Charlottesville, which left 32-year-old Heather Heyer dead and 19 others injured after an apparent Nazi sympathizer rammed his vehicle into a crowd, <a href=\"https:\/\/www.wired.com\/story\/alt-right-charlottesville-reddit-4chan\/\">was organized out in the open<\/a> on the very platforms that claim to ban hate speech of any kind. The weekend\u2019s \u201cUnite the Right\u201d rally had its own Facebook page. On Reddit, members of the subreddit r\/The_Donald promoted the event in the days leading up to it. 
And bigots like former Ku Klux Klan leader David Duke used Twitter to issue foreboding warnings that the torch rally was \u201conly the beginning.\u201d<\/p>\n<p>Under the banner of free speech, these tech companies allowed the rhetoric not only to live on their platforms but to thrive there. That\u2019s because they operate under a simultaneously fuzzy and overly narrow set of rules about what constitutes banned behavior.<\/p>\n<p>Twitter overtly allows \u201ccontroversial content,\u201d including from white supremacist accounts. It takes action only when tweets threaten violence, incite fear in a group of people, or use explicit slurs.<\/p>\n<p>Facebook, meanwhile, says that while it removes hate speech and any praise of violent acts or hate groups, it allows \u201cpeople to use Facebook to challenge ideas, institutions, and practices. And we allow groups to organize peaceful protests or rallies for or against things.\u201d<\/p>\n<p>That distinction ignores social media&#x27;s well-known role as a tool of mass radicalization. Without explicitly espousing violence, white-supremacist extremists can still recruit followers to a set of beliefs with deeply violent roots in Nazi Germany and the Jim Crow South. It should come as no surprise that a protest anchored in hate would erupt in violence. 
For tech companies to defend those online discussions as peaceful protests is disingenuous at best.<\/p>\n<p>\u201cIt is their responsibility to figure out a way not to be complicit with these types of violent actions\u2014or become comfortable with the fact that they are,\u201d says Charlton McIlwain, an associate professor at New York University who focuses on race and digital media.<\/p>\n<p>These are, after all, companies, not governments, meaning they\u2019re free to police speech in whatever way they deem appropriate. And in many cases, they already do. Twitter, Facebook, and YouTube have taken aggressive approaches to curbing ISIS activity on their platforms, a type of extremism they handle distinctly from hate speech. Facebook uses artificial intelligence to spot text that advocates for terrorism or terrorist groups, and deploys image-recognition technology to identify terror-related photos or memes. Less sensitive to the free speech rights of ISIS aspirants, Facebook even works to wipe out clusters of users that might have terrorist ties. \u201cWe use signals like whether an account is friends with a high number of accounts that have been disabled for terrorism, or whether an account shares the same attributes as a disabled account,\u201d the company recently explained in a <a href=\"https:\/\/newsroom.fb.com\/news\/2017\/06\/how-we-counter-terrorism\/\" target=\"_blank\">blog<\/a> post about the approach. 
Facebook-owned Instagram <a href=\"https:\/\/www.wired.com\/2017\/08\/instagram-kevin-systrom-wants-to-clean-up-the-internet\/\">recently introduced an algorithm<\/a> to wipe away comments from trolls.<\/p>\n<p>YouTube, meanwhile, has gone so far as to deploy a <a href=\"https:\/\/www.wired.com\/2016\/09\/googles-clever-plan-stop-aspiring-isis-recruits\/\">tool known as the Redirect Method<\/a>, which serves anti-ISIS content to users searching for ISIS-related videos. Developed by Jigsaw, a think tank within YouTube\u2019s parent company, Alphabet, the Redirect Method was designed to reach people who may be curious about extremist ideology before they become fully enveloped in it. \u201cLet\u2019s take these individuals who are vulnerable to ISIS\u2019s recruitment messaging and instead show them information that refutes it,\u201d Yasmin Green, Jigsaw\u2019s head of research and development, recently <a href=\"https:\/\/www.wired.com\/2017\/06\/hacking-online-hate-means-talking-humans-behind\/\">told<\/a> WIRED.<\/p>\n<p>Now that the Department of Justice has <a href=\"https:\/\/www.nytimes.com\/2017\/08\/14\/us\/politics\/domestic-terrorism-sessions.html\" target=\"_blank\">deemed<\/a> Heyer\u2019s murder an act of domestic terrorism, it remains to be seen whether these companies will apply the same sort of rigor to white-supremacist groups. \u201cThese tech companies are very sophisticated. 
They\u2019ve dealt with issues like child pornography or pirated content or terrorist activity,\u201d Greenblatt says. \u201cI don\u2019t think any of the strategies are perfect, but applying some of those lessons learned from dealing with other public hazards would have a lot of value here.\u201d The ADL has formed a working group of tech companies, including Facebook, Google, Microsoft, Twitter, Yahoo, and YouTube, that is focused on addressing cyber hate.<\/p>\n<p>Of course, having the tools to police white supremacists is different from using them. Social media companies would inevitably face user backlash and accusations of violating free speech. It&#x27;s up to them to decide whether taking a moral stance is worth the cost.<\/p>\n<p>Despite the potential repercussions, some in Silicon Valley have already led the way. Before the rally in Charlottesville, Airbnb used background checks to block users it believed were attending the Unite the Right rally, specifically those organizing large events on the neo-Nazi website the Daily Stormer.<\/p>\n<p>\u201cWe investigated, and in some situations could confirm that users on our platform had booked listings for large gatherings that are affiliated with this event,\u201d an Airbnb spokesperson said, adding that the company \u201cevaluates these matters on a case-by-case basis.\u201d On Monday, the web-hosting company GoDaddy followed suit, announcing plans to boot the Daily Stormer from its service. Within hours, the Daily Stormer had a home with Google. Shortly thereafter, Google canceled the hate website\u2019s domain registration as well.<\/p>\n<p>Other popular alt-right destinations have had the plug pulled too. 
On Monday, Discord, a gamer-focused chat platform, announced that it would shutter the popular altright.com server it had hosted. &quot;We will continue to take action against white supremacy, nazi ideology, and all forms of hate,&quot; the company <a href=\"https:\/\/twitter.com\/discordapp\/status\/897170310348263426\" target=\"_blank\">said<\/a> in a tweet.<\/p>\n<p>But playing hot potato with the web\u2019s ugliest URLs can only do so much to curb the resurgence of white supremacy in America. After all, the internet has no shortage of white-nationalist sites. Squarespace, for instance, <a href=\"http:\/\/www.whoishostingthis.com\/?q=npiamerica.org\" target=\"_blank\">hosts<\/a> Richard Spencer\u2019s National Policy Institute site. (Squarespace didn\u2019t respond to WIRED\u2019s request for comment.)<\/p>\n<p>Even if tech companies were to proactively identify white supremacy as unacceptable content, catching every bad actor would require enormous resources. Some experts advocate giving the government a role in regulating these platforms, the same way it has regulated television broadcasters over the years.<\/p>\n<p>\u201cThe history of the US says that if you have an entity that shapes public opinion like a broadcaster, you were subject to a higher level of scrutiny,\u201d says Nicco Mele, director of Harvard\u2019s Shorenstein Center on Media, Politics, and Public Policy. 
\u201cIf the platforms have the power to shape public opinion, it would be astonishingly un-American and counter to the history of the country not to look at what appropriate regulation looks like.\u201d<\/p>\n<p>Given Silicon Valley\u2019s historic rejection of regulation, it\u2019s unlikely tech companies would accept Washington intervention without a fight. But the country at large shouldn\u2019t accept the status quo without one either. Indeed, the biggest barrier to tamping down the kind of hate speech that leads to violence isn&#x27;t technological. It&#x27;s simply making a choice.<\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/charlottesville-social-media-hate-speech-online\" target=\"bwo\">https:\/\/www.wired.com\/story\/charlottesville-social-media-hate-speech-online<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5991f76ea723bd58acbfd2aa\/master\/pass\/AltRightHP-831134374.jpg\"\/><\/p>\n<p><strong>Credit to Author: Issie Lapowsky | Date: Mon, 14 Aug 2017 22:24:55 +0000<\/strong><\/p>\n<p>After Charlottesville, companies like Facebook, Twitter, and the rest of <a href=\"https:\/\/www.washingtonpost.com\/news\/morning-mix\/wp\/2017\/08\/14\/godaddy-bans-neo-nazi-site-daily-stormer-for-disparaging-woman-killed-at-charlottesville-rally\/?utm_term=.5071829edb00\">Silicon Valley<\/a> should take a firmer stand against white supremacy on their 
platforms.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[714],"class_list":["post-8730","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-security"],"_links":{"self":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/8730","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=8730"}],"version-history":[{"count":0,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/8730\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=8730"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=8730"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=8730"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}