{"id":7724,"date":"2017-05-22T06:31:32","date_gmt":"2017-05-22T14:31:32","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2017\/05\/22\/news-1509\/"},"modified":"2017-05-22T06:31:32","modified_gmt":"2017-05-22T14:31:32","slug":"news-1509","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2017\/05\/22\/news-1509\/","title":{"rendered":"Leak: Secret Facebook rules on what violence, self-harm and child abuse can be posted"},"content":{"rendered":"<p><img decoding=\"async\" src=\"http:\/\/zapt4.staticworld.net\/images\/article\/2017\/05\/facebook-billions-users_primary-100721420-large.3x2.jpg\"\/><\/p>\n<p><strong>Credit to Author: Darlene Storm| Date: Mon, 22 May 2017 06:18:00 -0700<\/strong><\/p>\n<p>Facebook allows users to livestream self-harm, post videos of violent deaths and photos of non-sexual child abuse, but comments which threaten to harm President Donald Trump are to be deleted, according to Facebook\u2019s secret rule books for monitoring what its 2 billion users can post.<\/p>\n<p><a href=\"https:\/\/www.theguardian.com\/news\/2017\/may\/21\/revealed-facebook-internal-rulebook-sex-terrorism-violence\" target=\"_blank\">The Guardian<\/a> got hold of leaked copies of over 100 internal Facebook manuals and documents that tell moderators how to handle content which includes violence, sex, hate speech, terrorism, nudity, self-harm, revenge porn and more controversial content \u2013 even cannibalism.<\/p>\n<p>The giant social network has increasingly come under fire for how it handles disturbing content and for depending too heavily on users to report such content. 
At the beginning of May, Facebook CEO Mark Zuckerberg <a href=\"https:\/\/www.facebook.com\/zuck\/posts\/10103695315624661\" target=\"_blank\">announced<\/a>\u00a0the company would hire 3,000 more people \u2013 on top of the 4,500 moderators it had \u2013 \u201cto review the millions of reports we get every week.\u201d<\/p>\n<p>The leaked internal guidelines were given to Facebook moderators \u201cwithin the last year,\u201d the Guardian said. The documents show the fine line Facebook teeters on when deciding what content to censor without being accused of squashing free speech.<\/p>\n<p>10 seconds\u2026that\u2019s about how long Facebook moderators have to decide if content should be removed, according to the Guardian. The internal manuals for moderators give examples of what to censor when it comes to <a href=\"https:\/\/www.theguardian.com\/news\/gallery\/2017\/may\/21\/facebooks-internal-guidance-on-showing-graphic-violence\" target=\"_blank\">graphic violence<\/a>, <a href=\"https:\/\/www.theguardian.com\/news\/gallery\/2017\/may\/21\/facebook-rules-on-showing-cruelty-to-animals\" target=\"_blank\">animal abuse<\/a>, <a href=\"https:\/\/www.theguardian.com\/news\/gallery\/2017\/may\/21\/facebooks-manual-on-credible-threats-of-violence\" target=\"_blank\">credible threats of violence<\/a>, <a href=\"https:\/\/www.theguardian.com\/news\/gallery\/2017\/may\/21\/facebooks-internal-manual-on-non-sexual-child-abuse-content\" target=\"_blank\">non-sexual child abuse<\/a> and more.<\/p>\n<p><strong>Credible threats of violence<\/strong><\/p>\n<p>Leaked documents show that the following call for violent action is allowed: \u201cTo snap a b*tch\u2019s neck, make sure to apply all your pressure to the middle of her throat.\u201d But commenting \u201cSomeone shoot Trump\u201d is not and should be deleted since he is a head of state.<\/p>\n<p><strong>Self-harm <\/strong><\/p>\n<p>Facebook, which purportedly has received <a 
href=\"https:\/\/www.theguardian.com\/news\/2017\/may\/21\/facebook-users-livestream-self-harm-leaked-documents\" target=\"_blank\">over 5,000 reports<\/a> of potential self-harm in a two-week period, says it is OK for users to livestream attempts to self-harm. According to an internal policy update, moderators were told: \u201cWe\u2019re now seeing more video content \u2013 including suicides \u2013 shared on Facebook. We don\u2019t want to censor or punish people in distress who are attempting suicide.\u201d<\/p>\n<p>However, Facebook will try to get other agencies to do a \u201cwelfare check\u201d when a person is attempting suicide. Once there is no chance of helping that person anymore, the video is removed.<\/p>\n<p><strong>Graphic violence<\/strong><\/p>\n<p>Videos of violent deaths help create awareness, Facebook believes. The footage should be marked as disturbing and \u201chidden from minors,\u201d but not automatically deleted since the videos can \u201cbe valuable in creating awareness for self-harm afflictions and mental illness or war crimes and other important issues.\u201d<\/p>\n<p>Images of animal abuse are allowed for awareness purposes, but \u201cextremely disturbing\u201d photos of animal mutilations and videos of torturing animals are to be marked as disturbing. If the violence against animals is sadistic or celebratory, then it is not allowed and is deleted.<\/p>\n<p><strong>Child abuse<\/strong><\/p>\n<p>Facebook allows videos of child abuse to be posted, as long as the abuse is non-sexual and the content is marked as \u201cdisturbing.\u201d Videos or photos of child abuse which are shared with sadism and celebration are removed. Imagery of child abuse is allowed unless the child is naked.<\/p>\n<p><strong>Nudity<\/strong><\/p>\n<p>Nudity is allowed if it qualifies as a \u201cnewsworthy exception\u201d or if it is \u201chandmade art.\u201d Digitally created art showing sexual activity, as well as revenge porn, is not allowed. 
Facebook also allows videos of abortions as long as there is no nudity in the footage.<\/p>\n<p>Facebook won\u2019t confirm if the documents obtained by the Guardian are authentic, but Facebook released the following statement:<\/p>\n<p>Keeping people on Facebook safe is the most important thing we do. In addition to investing in more people, we\u2019re also building better tools to keep our community safe. We\u2019re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.<\/p>\n<p><a href=\"http:\/\/www.computerworld.com\/article\/3197551\/internet\/leak-secret-facebook-rules-on-what-violence-self-harm-and-child-abuse-can-be-posted.html#tk.rss_security\" target=\"bwo\" >http:\/\/www.computerworld.com\/category\/security\/index.rss<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"http:\/\/zapt4.staticworld.net\/images\/article\/2017\/05\/facebook-billions-users_primary-100721420-large.3x2.jpg\"\/><\/p>\n<p><strong>Credit to Author: Darlene Storm| Date: Mon, 22 May 2017 06:18:00 -0700<\/strong><\/p>\n<article>\n<section class=\"page\">\n<p>Facebook allows users to livestream self-harm, post videos of violent deaths and photos of non-sexual child abuse, but comments which threaten to harm President Donald Trump are to be deleted, according to Facebook\u2019s secret rule books for monitoring what its 2 billion users can post.<\/p>\n<p><a href=\"https:\/\/www.theguardian.com\/news\/2017\/may\/21\/revealed-facebook-internal-rulebook-sex-terrorism-violence\" target=\"_blank\">The Guardian<\/a> got hold of leaked copies of over 100 internal Facebook manuals and documents that tell moderators how to handle content which includes violence, sex, hate speech, terrorism, nudity, self-harm, revenge porn and more controversial content \u2013 even cannibalism.<\/p>\n<p class=\"jumpTag\"><a 
href=\"\/article\/3197551\/internet\/leak-secret-facebook-rules-on-what-violence-self-harm-and-child-abuse-can-be-posted.html#jump\">To read this article in full or to leave a comment, please click here<\/a><\/p>\n<\/section>\n<\/article>\n","protected":false},"author":4,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[11062,10643],"tags":[4314,714,1932],"class_list":["post-7724","post","type-post","status-publish","format-standard","hentry","category-computerworld","category-independent","tag-internet","tag-security","tag-social-media"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/7724","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=7724"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/7724\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=7724"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=7724"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=7724"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}