{"id":10990,"date":"2018-01-03T10:45:07","date_gmt":"2018-01-03T18:45:07","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2018\/01\/03\/news-4761\/"},"modified":"2018-01-03T10:45:07","modified_gmt":"2018-01-03T18:45:07","slug":"news-4761","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2018\/01\/03\/news-4761\/","title":{"rendered":"The Logan Paul &#8220;Suicide Forest&#8221; Video Should Be a Reckoning For YouTube"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5a4c0152ec4a443e2268ba97\/master\/pass\/LoganPaul-FA-863045652.jpg\"\/><\/p>\n<p><strong>Credit to Author: Louise Matsakis| Date: Wed, 03 Jan 2018 06:38:25 +0000<\/strong><\/p>\n<p><span class=\"lede\">By the time <\/span>Logan Paul arrived at Aokigahara forest, colloquially known as Japan\u2019s \u201csuicide forest,\u201d the YouTube star had already confused Mount Fuji with the country Fiji. His over 15 million (mostly underage) subscribers like this sort of comedic aloofness\u2014it serves to make Paul more relatable.<\/p>\n<p>After hiking only a couple hundred yards into Aokigahara\u2014where over 247 people attempted to take their own lives in 2010 alone, according to police statistics cited in <em><a href=\"https:\/\/www.japantimes.co.jp\/life\/2011\/06\/26\/general\/inside-japans-suicide-forest\/#.WkuBJ1WnGUk\" target=\"_blank\">The Japan Times<\/a><\/em>\u2014Paul encountered a suicide victim\u2019s body hanging from a tree. Instead of turning the camera off, he continued filming, and later uploaded close-up shots of the corpse, with the person\u2019s face blurred out.<\/p>\n<p>\u201cDid we just find a dead person in the suicide forest?\u201d Paul said to the camera. \u201cThis was supposed to be a fun vlog.\u201d He went on to make several jokes about the victim, while wearing a large, fluffy green hat.<\/p>\n<p>Within a day, over 6.5 million people had viewed the footage, and Twitter flooded with outrage. 
Even though the video violated YouTube\u2019s community standards, it was Paul in the end who deleted it.<\/p>\n<p>\u201cI should have never posted the video, I should have put the cameras down,\u201d Paul said in a <a href=\"https:\/\/www.youtube.com\/watch?v=QwZT7T-TXT0\" target=\"_blank\">video<\/a> posted Tuesday, which followed an earlier written apology. \u201cI\u2019ve made a huge mistake, I don\u2019t expect to be forgiven.\u201d He didn\u2019t respond to two follow-up requests for comment.<\/p>\n<p>YouTube, which failed to do anything about Paul\u2019s video, has now found itself wrapped in another controversy over how and when it should police offensive and disturbing content on its platform\u2014and, as importantly, the culture it foments that led to it. YouTube encourages stars like Paul to garner views by any means necessary, while largely deciding how and when to censor their videos behind closed doors.<\/p>\n<p>Before uploading the video, which was titled \u201cWe found a dead body in the Japanese Suicide Forest&#8230;,\u201d Paul halfheartedly attempted to censor himself for his mostly tween viewers. He issued a warning at the beginning of the video, blurred the victim\u2019s face, and included the numbers of several suicide hotlines, including one in Japan. He also chose to demonetize the video, meaning he wouldn\u2019t make money from it. His efforts weren\u2019t enough.<\/p>\n<p>\u201cThe mechanisms that Logan Paul came up with fell flat,\u201d says Jessa Lingel, an assistant professor at the University of Pennsylvania\u2019s Annenberg School for Communication, where she studies digital culture. \u201cDespite them, you see a video that nonetheless is very disturbing. 
You have to ask yourself: Are those efforts really enough to frame this content in a way that\u2019s not just hollowly or superficially aware of damage, but that is meaningfully aware of damage?\u201d<\/p>\n<p>The video still included shots of a corpse, including the victim\u2019s hands, which had turned blue. At one point, Paul referred to the victim as \u201cit.\u201d One of the first things he said to the camera after the encounter was, \u201cThis is a first for me,\u201d turning the conversation back to himself.<\/p>\n<p>There\u2019s no excuse for what Paul did. His video was disturbing and offensive to the victim, their family, and to those who have struggled with mental illness. But blaming the YouTube star alone seems insufficient. Both he and his equally famous brother, Jake Paul, earn their living from YouTube, a platform that rewards creators for being outrageous, and <a href=\"http:\/\/www.wired.co.uk\/article\/youtube-kids-moderation-google\" target=\"_blank\">often fails<\/a> to adequately police its own content.<\/p>\n<p>\u201cI think that any analysis that continues to focus on these incidents at the level of the content creator is only really covering part of the structural issues at play,\u201d says Sarah T. Roberts, an assistant professor of information studies at UCLA and an expert in internet culture and content moderation. \u201cOf course YouTube is absolutely complicit in these kinds of things, in the sense that their entire economic model, their entire model for revenue creation is created fundamentally on people like Logan Paul.\u201d<\/p>\n<p>YouTube takes 45 percent of the advertising money generated via Paul and every other creator\u2019s videos. 
According to <a href=\"https:\/\/socialblade.com\/youtube\/channel\/ucg8rbf3g2amx70yod8vqizg\" target=\"_blank\">SocialBlade<\/a>, an analytics company that tracks the estimated revenue of YouTube channels, Paul could make as much as 14 million dollars per year. While YouTube might not explicitly encourage Paul to pull ever-more insane stunts, it stands to benefit financially when he and creators like him gain millions of views off of outlandish episodes.<\/p>\n<p>\u201c[YouTube] knows for these people to maintain their following and gain new followers they have to keep pushing the boundaries of what is bearable,\u201d says Roberts.<\/p>\n<p>YouTube presents its platform as democratic; anyone can upload and contribute to it. But it simultaneously treats enormously popular creators like Paul differently, because they command such massive audiences. (Last year, the company even chose Paul to star in <em>The Thinning<\/em>, the first full-length thriller distributed via its streaming subscription service YouTube Red, as well as <em>Foursome<\/em>, a romantic comedy series also offered via the service.)<\/p>\n<p>\u201cThere\u2019s a fantasy that he\u2019s just a dude with a GoPro on a stick,\u201d says Roberts. \u201cYou have to actually examine the motivations of the platform.\u201d<\/p>\n<p>For example, major YouTube creators I have spoken to in the past said they often work with a representative from the company who helps them navigate the platform, a luxury not afforded to the average person posting cat videos. YouTube didn\u2019t respond to a follow-up request about whether Paul had a rep assigned to his channel.<\/p>\n<p>It\u2019s unclear why exactly YouTube let the video stay up so long; it may have been the result of the platform\u2019s murky <a href=\"https:\/\/www.youtube.com\/yt\/about\/policies\/#community-guidelines\" target=\"_blank\">community guidelines<\/a>. 
YouTube\u2019s comment on it doesn\u2019t shed much light either.<\/p>\n<p>\u201cOur hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated,\u201d a Google spokesperson said in an emailed statement. \u201cWe partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.\u201d<\/p>\n<p>YouTube may have initially decided that Paul\u2019s video didn\u2019t violate its policy on violent and graphic content. But those guidelines only consist of a few short sentences, making it impossible to know.<\/p>\n<p>\u201cThe policy is vague, and requires a bunch of value judgements on the part of the censor,\u201d says Kyle Langvardt, an associate law professor at the University of Detroit Mercy Law School and an expert on First Amendment and internet law. \u201cBasically, this policy reads well as an editorial guideline\u2026 But it reads terribly as a law, or even a pseudo-law. Part of the problem is the vagueness.\u201d<\/p>\n<p>What might constitute a meaningful step toward transparency would be for YouTube to implement a moderation or edit log, says Lingel. On it, YouTube could theoretically disclose what team screened a video and when. If the moderators choose to remove or age-restrict a video, the log could disclose what community standard violation resulted in that decision. It could be modeled on something like Wikipedia\u2019s edit logs, which show all of the changes made to a specific page.<\/p>\n<p>\u201cWhen you flag content, you have no idea what happens in that process,\u201d Lingel says. \u201cThere\u2019s no reason we can\u2019t have that sort of visibility, to see that content has a history. 
The metadata exists, it\u2019s just not made visible to the average user.\u201d<\/p>\n<p>Fundamentally, Lingel says, we need to rethink how we envision content moderation. Right now, when a YouTube user flags a video as inappropriate, it\u2019s often left to a low-wage worker to tick a series of boxes, making sure it doesn\u2019t violate any community guidelines (YouTube <a href=\"https:\/\/www.theguardian.com\/technology\/2017\/dec\/04\/google-youtube-hire-moderators-child-abuse-videos\" target=\"_blank\">pledged<\/a> to expand its content moderation workforce to 10,000 people this year). The task is sometimes even left to an AI that quietly combs through videos looking for inappropriate content or <a href=\"https:\/\/www.wired.com\/2016\/09\/googles-clever-plan-stop-aspiring-isis-recruits\/\">ISIS recruiting videos<\/a>. Either way, YouTube\u2019s moderation process is mostly anonymous and conducted behind closed doors.<\/p>\n<p>It\u2019s helpful that the platform has baseline standards for what is considered appropriate; we can all agree that certain types of graphic content depicting violence and hate should be prohibited. But a positive step forward would be to develop a more transparent process, one centered around open discussion about what should and shouldn\u2019t be allowed, on something like a public moderation forum.<\/p>\n<p>Paul\u2019s video represents a potential turning point for YouTube, an opportunity to become more transparent about how it manages its own content. If it doesn\u2019t take the chance, scandals like this one will only continue to happen.<\/p>\n<p>As for the Paul brothers, they\u2019re likely going to keep making similarly outrageous and offensive videos to entertain their massive audience. 
On Monday afternoon, just hours after his brother Logan issued an apology for the suicide forest incident, Jake Paul uploaded a new video titled \u201cI Lost My Virginity\u2026\u201d. At the time this story went live, it already had nearly two million views.<\/p>\n<p><em>If you or someone you know is considering suicide, help is available. You can call 1-800-273-8255 to speak with someone at the National Suicide Prevention Lifeline 24 hours a day in the United States. You can also text WARM to 741741 to message with the Crisis Text Line.<\/em><\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/logan-paul-video-youtube-reckoning\" target=\"bwo\" >https:\/\/www.wired.com\/story\/logan-paul-video-youtube-reckoning<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5a4c0152ec4a443e2268ba97\/master\/pass\/LoganPaul-FA-863045652.jpg\"\/><\/p>\n<p><strong>Credit to Author: Louise Matsakis| Date: Wed, 03 Jan 2018 06:38:25 +0000<\/strong><\/p>\n<p>Logan Paul&#8217;s video of Japan&#8217;s &#8220;suicide forest&#8221; was a nadir for the YouTube star\u2014and the platform that enables 
him.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[714],"class_list":["post-10990","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-security"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/10990","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=10990"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/10990\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=10990"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=10990"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=10990"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}