{"id":18535,"date":"2022-03-18T07:10:11","date_gmt":"2022-03-18T15:10:11","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2022\/03\/18\/news-12268\/"},"modified":"2022-03-18T07:10:11","modified_gmt":"2022-03-18T15:10:11","slug":"news-12268","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2022\/03\/18\/news-12268\/","title":{"rendered":"Deepfake Zelenskyy video surfaces on compromised websites"},"content":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Fri, 18 Mar 2022 15:04:56 +0000<\/strong><\/p>\n<p>It\u2019s been a long time coming. The worry over deepfake technology being used during times of major upheaval has been alluded to frequently over the last couple of years. The buildup to the US election was peppered by \u201cany moment now\u2026\u201d style warnings of dramatic and plausible deepfake deployment. In the end, what we got was <a href=\"https:\/\/blog.malwarebytes.com\/cybercrime\/2020\/10\/deepfakes-and-the-2020-united-states-election-missing-in-action\/\">very little to write home about<\/a>. Terrible renderings promoted as \u201clook what they can do\u201d declarations failed to impress.<\/p>\n<p>The current situation in Ukraine was inevitably going to lead to some form of deepfake activity. The only real questions were \u201cwhen?\u201d, and \u201chow bad will they be?\u201d As a matter of fact, deepfake activity began immediately. After the usual breathless punditry warning about what could happen, the \u201cbest\u201d mock-up example from February 23 was a frankly <a href=\"https:\/\/www.msn.com\/en-us\/news\/viral\/deepfake-videos-in-russia-ukraine-crisis-put-authorities-on-alert\/vi-AAUbkor\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">terrible clip<\/a> of Vladimir Putin.<\/p>\n<h2>Just how deep is this fake, anyway?<\/h2>\n<p>There were plenty of warnings about deepfakes too, but the reports were perhaps a little over the top. 
\u201cRussia is using deepfakes to spread misinformation against Ukraine, claims report\u201d. Well, that <a href=\"https:\/\/www.wionews.com\/world\/digital-weaponry-russia-using-deepfakes-to-spread-misinformation-against-ukraine-claims-report-458746\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">definitely sounds bad<\/a>. But how were those deepfakes spreading it?<\/p>\n<p>The fake in question wasn\u2019t sophisticated video or audio, which is what people reading across several sources may have believed. What actually happened was that an account spreading misinformation used a fake profile picture, the likes of which you can generate yourself, endlessly, <a href=\"https:\/\/this-person-does-not-exist.com\/en\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">here<\/a>. I\u2019d argue that the misinformation pushed would be exactly the same regardless of whether the profile used an AI-generated avatar, or the creator simply stole an image from somewhere else. How much does the profile picture really matter in this example?<\/p>\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\">\n<div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">Quick thread:<\/p>\n<p>I want you all to meet Vladimir Bondarenko.<\/p>\n<p>He\u2019s a blogger from Kiev who really hates the Ukrainian government.<\/p>\n<p>He also doesn\u2019t exist, according to Facebook.<\/p>\n<p>He\u2019s an invention of a Russian troll farm targeting Ukraine. His face was made by AI. 
<a href=\"https:\/\/t.co\/uWslj1Xnx3\">pic.twitter.com\/uWslj1Xnx3<\/a><\/p>\n<p>&mdash; Ben Collins (@oneunderscore__) <a href=\"https:\/\/twitter.com\/oneunderscore__\/status\/1498349668522201099?ref_src=twsrc%5Etfw\">February 28, 2022<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script> <\/div>\n<\/figure>\n<h2>Deepfakes or cheapfakes?<\/h2>\n<p>Sure, it\u2019s good that the campaign(s) were <a href=\"https:\/\/www.nbcnews.com\/tech\/internet\/facebook-twitter-remove-disinformation-accounts-targeting-ukrainians-rcna17880\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">shut down<\/a>. Even so, would those most likely to believe the false information being promoted ever think to check whether the profile picture was real? I\u2019d suggest only those hunting down misinformation would bother to analyse the image in the first place. Grabbing a fake AI picture simply takes less time than picking a stock photo and flipping it to try to ward off reverse image searches.<\/p>\n<p>For the most part, what we have above is no different to the tactics used in the malign interference campaign we <a href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/\">covered back in 2019<\/a>. Fake profile? Check. AI-generated headshots on the fly? Check. The fake headshot simply being present because somebody needed one in a hurry, without it being the primary reason for dubious activity? Almost certainly, check.<\/p>\n<p>Back in 2020, some people debated whether it was helpful to <a href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2020\/07\/new-deepfakes-using-gan-digital-fakery\/\">tag static images as deepfakes<\/a>, and whether it was perhaps more useful to flag them as \u201ccheapfakes\u201d as a way to cleanly separate them from video content. 
I did wonder if we\u2019d see nothing but fake profiles with lazy image grabs this time around. However, someone has actually created a synthetic video with audio and then dropped it onto a hacked website as a way of exerting sudden influence over activities on the ground.<\/p>\n<p>How well did it do? The answer is \u201cnot very\u201d.<\/p>\n<h2>Zelenskyy deepfake aims for chaos<\/h2>\n<p>Make no mistake, this is it: the first big example I can think of during a crisis where a deepfake video\u2014<em>not<\/em> a profile picture\u2014has been used to deceive on a large scale, with potentially severe results.<\/p>\n<p>A broadcast by Ukraine\u2019s 24 TV channel was <a href=\"https:\/\/www.facebook.com\/www.ukraine24.ua\/posts\/1847515155441880\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">compromised<\/a>, and news tickers appeared to display messages from President Zelenskyy <a href=\"https:\/\/www.atlanticcouncil.org\/blogs\/new-atlanticist\/russian-war-report-hacked-news-program-and-deepfake-video-spread-false-zelenskyy-claims\/#deepfake\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">urging people to lay down weapons<\/a>.<\/p>\n<p>Meanwhile, a video of President Zelenskyy was uploaded to at least one compromised website, showing him giving some sort of news conference. In it, he appears to be calling for Ukrainian troops to stop fighting, lay down their weapons, and so on.<\/p>\n<p>Thankfully, it wasn\u2019t very good.<\/p>\n<h2>Tearing down a fake<\/h2>\n<p>The Zelenskyy fake fell foul of many of the same problems that affected the <a href=\"https:\/\/www.msn.com\/en-us\/news\/viral\/deepfake-videos-in-russia-ukraine-crisis-put-authorities-on-alert\/vi-AAUbkor\" target=\"_blank\" rel=\"noreferrer noopener\">Putin fake<\/a>. 
The flat-looking and virtually unmoving body, the lack of convincing shadows, the head appearing to work independently of the neck, and the erratic angular movement of the head itself.<\/p>\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\">\n<div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">As a matter of principle, I never post or link to fake or false content. But <a href=\"https:\/\/twitter.com\/MikaelThalen?ref_src=twsrc%5Etfw\">@MikaelThalen<\/a> has helpfully whacked a label on this Zelensky one, so here goes.<\/p>\n<p>I&#39;ve seen some well-made deepfakes. This, however, has to rank among the worst of all time.<a href=\"https:\/\/t.co\/6OTjGxT28a\">pic.twitter.com\/6OTjGxT28a<\/a><\/p>\n<p>&mdash; Shayan Sardarizadeh (@Shayan86) <a href=\"https:\/\/twitter.com\/Shayan86\/status\/1504131692411432966?ref_src=twsrc%5Etfw\">March 16, 2022<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script> <\/div>\n<\/figure>\n<p>Pay particular attention to the eyes. Even if nothing else in the video struck you as off, the eyes blink very unnaturally. Barely 3 seconds in, they snap shut and open again in a way that eyelids simply don\u2019t do. The more you look at it, the more the sensation of something being not quite right sets in.<\/p>\n<p>Note that it\u2019s also very much at odds with recent footage of Zelenskyy, in terms of how he records things. In most if not all of his recent announcements, everything is very natural, informal, off the cuff. 
This is trying for some sort of press release format, which is another indicator that something may not be quite right here.<\/p>\n<h2>An early warning pays dividends<\/h2>\n<p>Interestingly, Ukraine had warned of the possibility of deepfake videos <a href=\"http:\/\/may-deploy-deepfake-videos\/507-4f18ea66-d5c7-4dac-9d41-2aea6495efdf?utm_campaign=snd-autopilot\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">just days earlier<\/a>. That the video didn\u2019t appear on any official communication channels almost certainly contributed to the general sense of scepticism.<\/p>\n<p>Researchers quickly deduced that the deepfake was a <a href=\"https:\/\/www.verifythis.com\/article\/news\/verify\/world-verify\/ukraine-verify\/ukrainian-president-zelenskyy-surrender-video-is-deepfake\/536-2f445bce-8b60-492c-9e16-a98f39a76b04\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">composite of two separate images<\/a>: one for the face, and one for the background:<\/p>\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\">\n<div class=\"wp-block-embed__wrapper\"> <iframe loading=\"lazy\" class=\"youtube-player\" width=\"100%\" height=\"420\" src=\"https:\/\/www.youtube.com\/embed\/FEfwnXynLvA?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation\" frameborder=\"0\"><\/iframe> <\/div>\n<\/figure>\n<h2>A measure of success in failure<\/h2>\n<p>How successful was this deepfake? It depends on what our baseline is for success, I suppose. The suspicion has been that for something like this to work, it has to be believable enough to spread widely in the short window of opportunity before official sources counter it. On those terms, it was a failure. 
Handed a big platform to spread what could have been exceptionally chaotic activity, it just flopped right out of the gate.<\/p>\n<p>In terms of making some sort of impact, however? It caused Zelenskyy, in the middle of an invasion, to have to <a href=\"https:\/\/www.instagram.com\/p\/CbKYM_Zg2jo\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">make a recorded rebuttal<\/a>. Having him expend any sort of energy on this is quite remarkable, so that\u2019s one thing the fakers may count as a win.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2022\/03\/deepfake-zelenskyy-video-surfaces-on-compromised-websites\/\">Deepfake Zelenskyy video surfaces on compromised websites<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes Labs<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Fri, 18 Mar 2022 15:04:56 +0000<\/strong><\/p>\n<p>We look at a deepfake video claiming to be President Zelenskyy urging Ukrainians to lay down arms.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2022\/03\/deepfake-zelenskyy-video-surfaces-on-compromised-websites\/\">Deepfake Zelenskyy video surfaces on compromised websites<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes 
Labs<\/a>.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10488,10378],"tags":[17608,25428,251,10510,8642,9053,25429],"class_list":["post-18535","post","type-post","status-publish","format-standard","hentry","category-malwarebytes","category-security","tag-deepfake","tag-footage","tag-russia","tag-social-engineering","tag-ukraine","tag-video","tag-zelenskyy"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/18535","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=18535"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/18535\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=18535"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=18535"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=18535"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}