{"id":8589,"date":"2017-08-05T08:45:49","date_gmt":"2017-08-05T16:45:49","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2017\/08\/05\/news-2362\/"},"modified":"2017-08-05T08:45:49","modified_gmt":"2017-08-05T16:45:49","slug":"news-2362","status":"publish","type":"post","link":"https:\/\/www.palada.net\/index.php\/2017\/08\/05\/news-2362\/","title":{"rendered":"The Neural Net That Recreated \u2018Blade Runner\u2019 Has the Movie Stuck in Its Memory"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/video-images.vice.com\/articles\/5984dc5f6e42134c658ce16d\/lede\/1501879417027-Screen-Shot-2017-08-04-at-44224-PM.png\"\/><\/p>\n<p><strong>Credit to Author: Marissa Clifford | Date: Sat, 05 Aug 2017 16:00:00 +0000<\/strong><\/p>\n<p> Artist and machine learning engineer Terence Broad&#8217;s <a href=\"https:\/\/medium.com\/@Terrybroad\/autoencoding-blade-runner-88941213abbe\" target=\"_blank\">Auto-Encoding Blade Runner<\/a> is the project Philip K. Dick would have made if he were a scientist. <\/p>\n<p> In his presentation at SIGGRAPH 2017, a computer graphics and animation conference, Broad detailed how he trained a convolutional autoencoder\u2014a type of neural network\u2014to recognize patterns of data in <i>Blade Runner<\/i> and then reconstruct the film, scene by scene. The result is an eerily accurate full-length film, so convincing that Warner Bros. <a href=\"https:\/\/www.vox.com\/2016\/6\/1\/11787262\/blade-runner-neural-network-encoding\" target=\"_blank\">issued a DMCA takedown<\/a> notice to Vimeo when Broad first uploaded the footage in 2016. <\/p>\n<p> Since then, <a href=\"https:\/\/www.reddit.com\/r\/scifi\/comments\/4m7rb2\/a_neural_network_watched_and_reconstructed_blade\/\" target=\"_blank\">many have speculated that<\/a> Broad found a loophole around copyright (he hadn&#8217;t). 
The real question behind Broad&#8217;s <i>Blade Runner<\/i> parallels the themes of Philip K. Dick&#8217;s legendary novel <i>Do Androids Dream of Electric Sheep?<\/i>: Where does one draw the line between human and machine, the real and the seemingly real? <\/p>\n<p> &#8220;I think the thing to understand about neural networks is that we don&#8217;t really know how they work,&#8221; Ruth West, SIGGRAPH&#8217;s chair of art papers, told me in an interview. &#8220;They&#8217;re black boxes, and they make these leaps that are kind of like the leaps we make internally. That&#8217;s what I think is powerfully evocative about Auto-encoding <i>Blade Runner<\/i>\u2014that synthetic leap that the neural network makes.&#8221; <\/p>\n<p> In the case of <i>Blade Runner<\/i>: Auto-encoded, some of the leaps made it into the final version of the film. For example, Broad&#8217;s network couldn&#8217;t recognize black screens. Instead, it output an amalgam of green images, creating a sort of <a href=\"https:\/\/en.wikipedia.org\/wiki\/Palimpsest\" target=\"_blank\">blown-out palimpsest<\/a> of images\u2014or memories. Other glitches could be massaged out through algorithms and repeated learning loops. 
<\/p>\n<p> As Chrissie Iles, curator of &#8220;Dreamlands,&#8221; an exhibition at New York City&#8217;s Whitney Museum of American Art in which <i>Blade Runner<\/i>: Auto-encoded appeared last fall, <a href=\"https:\/\/books.google.com\/books?id=b4pRDQAAQBAJ&#038;pg=PA120&#038;dq=the+cyborg+and+the+sensorium+chrissie+iles&#038;hl=en&#038;sa=X&#038;ved=0ahUKEwj2nsLN4bjVAhUK6mMKHRLYDj8Q6AEIKDAA#v=onepage&#038;q=the%20cyborg%20and%20the%20sensorium%20chrissie%20iles&#038;f=false\" target=\"_blank\">wrote<\/a>, Broad&#8217;s work relates to the &#8220;disembodied, post-humanized gaze, outsourced to machines.&#8221; <\/p>\n<p> For Broad, that&#8217;s right on the money. <\/p>\n<p> &#8220;For me personally,&#8221; Broad said during his presentation, &#8220;it&#8217;s an internal representation of a machine. \u2026 We&#8217;re peeking into the gaze of an auto-encoder.&#8221; <\/p>\n<p> Going forward, Broad is dedicated to making work that exposes and critiques social bias, using technology not to perfectly mimic human behaviors but to call them out. For him, it&#8217;s not so much about replicating humanity as about humanity understanding the machine. <\/p>\n<p> This theme recurs in the emerging art of generative cinema. By remixing existing cultural products, artists and scientists like Broad are able to critique both the medium and the message. <\/p>\n<p> But what happens when the <i>Blade Runner<\/i> auto-encoder watches other films? Broad tried it out. When shown another Philip K. Dick adaptation, <i>A Scanner Darkly<\/i>, and the Soviet classic <i>Man with a Movie Camera<\/i>, it could still recognize the composition of the frames, but it essentially transposed the aesthetic of <i>Blade Runner<\/i>: Auto-encoded onto them. The results were dimly lit, plagued by visual noise, and dreamy. The auto-encoded versions clearly came from the same memory. 
<\/p>\n<p> <i> Blade Runner: Auto-encoded is part of The Barbican&#8217;s exhibition <\/i><a href=\"https:\/\/www.barbican.org.uk\/intotheunknown\/\" target=\"_blank\"><i> Into the Unknown: A Journey Through Science Fiction<\/i><\/a><i> up in London through September, and subsequently touring the world. <\/i><\/p>\n<p><a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/9kaxpz\/the-neural-net-that-recreated-blade-runner-has-the-movie-stuck-in-its-memory\" target=\"bwo\" >https:\/\/motherboard.vice.com\/en_us\/rss<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/video-images.vice.com\/articles\/5984dc5f6e42134c658ce16d\/lede\/1501879417027-Screen-Shot-2017-08-04-at-44224-PM.png\"\/><\/p>\n<p><strong>Credit to Author: Marissa Clifford| Date: Sat, 05 Aug 2017 16:00:00 +0000<\/strong><\/p>\n<p>The AI that made \u2018Blade Runner: Auto-encoded\u2019 transposed the aesthetic of the movie onto other sci-fi classics.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10643,13328,10378],"tags":[13355,13357,1439,10634,13356],"class_list":["post-8589","post","type-post","status-publish","format-standard","hentry","category-independent","category-motherboard","category-security","tag-blade-runner","tag-blade-runner-auto-encoded","tag-movies","tag-science-fiction","tag-siggraph"],"_links":{"self":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/8589","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"emb
eddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=8589"}],"version-history":[{"count":0,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/8589\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=8589"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=8589"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=8589"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}