{"id":16941,"date":"2019-11-20T09:32:37","date_gmt":"2019-11-20T17:32:37","guid":{"rendered":"https:\/\/www.palada.net\/index.php\/2019\/11\/20\/news-10678\/"},"modified":"2019-11-20T09:32:37","modified_gmt":"2019-11-20T17:32:37","slug":"news-10678","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2019\/11\/20\/news-10678\/","title":{"rendered":"Deepfakes and LinkedIn: malign interference campaigns"},"content":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Wed, 20 Nov 2019 16:00:00 +0000<\/strong><\/p>\n<p>Deepfakes haven&#8217;t <em>quite<\/em> lost the power to surprise, but given their wholesale media saturation in the last year or so, there\u2019s a sneaking suspicion in some quarters that they may have missed the bus. When people throw a fake Boris Johnson or Jeremy Corbyn online these days, the response seems to be fairly split between \u201cWow, that\u2019s funny\u201d and <a href=\"https:\/\/twitter.com\/ajreid\/status\/1194169559751770112\" target=\"_blank\" rel=\"noopener noreferrer\">barely even amused<\/a>.<\/p>\n<p>You may well be more likely to chuckle at people thinking <a href=\"https:\/\/interestingengineering.com\/video\/this-parody-of-a-boston-dynamics-video-sees-the-robot-fight-back\" target=\"_blank\" rel=\"noopener noreferrer\">popular Boston Dynamics spoof<\/a> \u201cBosstown Dynamics\u201d videos are real\u2014but that&#8217;s exactly what cybercriminals are banking on, and where the real malicious potential of deepfakes may lie.<\/p>\n<p>What happens when a perfectly ordinary LinkedIn profile features a deepfake-generated image of a person who doesn&#8217;t exist? Everyone believes the lie.<\/p>\n<h3>Is the sky falling? Probably not.<\/h3>\n<p>The two main markets<span class=\"Apple-converted-space\">\u00a0<\/span>cornered by deepfakes at time of writing are fake pornography clips and a growing industry in digital effects, which are a reasonable imitation of low budget TV movies. 
In some cases, a homegrown effort has come along and <a href=\"https:\/\/io9.gizmodo.com\/this-video-uses-the-power-of-deepfakes-to-re-capture-th-1828907452\" target=\"_blank\" rel=\"noopener noreferrer\">fixed a botched Hollywood attempt at CGI wizardry<\/a>. Somehow, in an age of awful people paying for nude deepfakes of anyone they choose, and the possibility of oft-promised but still not materialised political shenanigans, the current ethical flashpoint is <a href=\"https:\/\/www.cnbc.com\/2019\/11\/12\/worldwide-xr-is-formed-to-bring-actors-like-james-dean-to-life-on-the-screen.html\" target=\"_blank\" rel=\"noopener noreferrer\">whether or not to bring James Dean back from the dead<\/a>.<\/p>\n<p>Despite this, the mashup of politics and technology continues to simmer away in the background. At this point, it is extremely unlikely you\u2019ll see some sort of huge world event (or even several small but significant ones) being impacted by fake clips of world leaders talking crazy\u2014they&#8217;ll be debunked almost instantly. That ship has sailed. That deepfakes came to prominence primarily via pornography subreddits and people sitting at home rather suggests they got the drop on anyone at a nation-state level.<\/p>\n<p>When it comes to deepfakes, I\u2019ve personally been of the \u201cIt\u2019s bad, but in social engineering terms, it\u2019s a lot of work for little gain\u201d persuasion. I certainly don\u2019t subscribe to the sky-is-about-to-cave-in model. The worst areas of deepfakery I tend to see are where it&#8217;s <a href=\"https:\/\/www.cnet.com\/news\/deepfakes-users-computers-hacked-to-mine-cryptocurrency\/\" target=\"_blank\" rel=\"noopener noreferrer\">used as a basis to push Bitcoin scams<\/a>. 
But that doesn&#8217;t mean there isn&#8217;t potential for worse.<\/p>\n<h3>LinkedIn, deepfakes, and malign influence campaigns<\/h3>\n<p>With this in mind, I was fascinated to see \u201c<a href=\"https:\/\/stratcomcoe.org\/role-deepfakes-malign-influence-campaigns\" target=\"_blank\" rel=\"noopener noreferrer\">The role of deepfakes in malign influence campaigns<\/a>\u201d published by StratCom in November, which primarily focused on the more reserved but potentially devastating form of deepfake shenanigans. It\u2019s not fake Trump, it isn\u2019t pretend Boris Johnson declaring aliens are invading; it\u2019s background-noise-level interference designed to work its silent way up a chain of command.<\/p>\n<p>I was particularly taken by the comment that \u201cDoom and gloom\u201d assessments from experts had made way for a more moderate and skeptical approach. In other words, the moment marketers, YouTube VFX fans, and others tried to pry deepfake tech away from pornography pushers, it became somewhat untenable to make big, splashy fakes with sinister intentions. Instead, the battle raged behind the scenes.<\/p>\n<p>And that\u2019s where <a href=\"https:\/\/apnews.com\/bc2f19097a4c4fffaa00de6770b8a60d\" target=\"_blank\" rel=\"noopener noreferrer\">Katie Jones<\/a> stepped up to the plate.<\/p>\n<h3>Who is Katie Jones?<\/h3>\n<p>In the grand scheme of things, nobody. Another fake account in a never-ending wave of fake accounts stretching through years of Facebook clones and Myspace troll \u201cbaking sessions\u201d where hundreds would be rolled out the door on the fly. The only key difference is that Katie\u2019s LinkedIn profile picture was a computer-generated work of fiction.<\/p>\n<p>The people \u201cKatie\u201d had connected to were a little inconsistent, but they did include an awful lot of people working in and around government, policy, academia, and\u2026uh\u2026a fridge freezer company. 
Not a bad Rolodex for international espionage.<\/p>\n<p>Nobody admitted talking to Katie, though this raises the question of whether anyone who fell for the ruse would hold up their hand after the event.<\/p>\n<p>While we can speculate on <em>why<\/em> the profile was created\u2014social engineering campaign, test run for nation-state spying (quickly abandoned once discovered, similar to many malware scams), or even just some sort of practical joke\u2014what really amuses me is the possibility that someone just <a href=\"http:\/\/www.whichfaceisreal.com\/index.php\">randomly selected a face from a site like this<\/a> and had no idea of the chaos that would follow.<\/p>\n<h3>Interview with a deepfake sleuth<\/h3>\n<p>Either way, here comes Munira Mustaffa, the counter-intelligence analyst who first discovered the LinkedIn deepfake sensation known as Katie. Mustaffa took some time to explain to me how things played out:<\/p>\n<blockquote>\n<p>A contact of mine, a well-known British expert on Russian defence and military matters, was immediately suspicious about an attempted LinkedIn connection. He scanned her profile, and reverse searched her profile photo, which turned up zero results. He turned to me to ask me to look into her, and I, too, found nothing.<\/p>\n<p>This is unusual for someone claiming to be a Russia &amp; Eurasia Fellow for an organisation like the Center for Strategic and International Studies (CSIS), because you would expect someone in her role to have at least some publication history. The security world is a small one for us, especially if you\u2019re a policy wonk working on Russia matters. We both already knew Katie Jones did not exist, and this suspicion was confirmed when he checked with CSIS.<\/p>\n<p>I kept coming back to the photo. How could you have a shot like that but not have any sort of digital footprint? If it had been stolen from an online resource, it&#8217;d be almost impossible to turn up nothing. 
At this point, I started to notice the abnormalities\u2014you must understand my thought process as someone who does photography as a hobby and uses Photoshop a lot.<\/p>\n<p>For one thing, there was a Gaussian blur on her earlobe. Initially, I thought she&#8217;d Photoshopped her ear, but that didn&#8217;t check out. Why would someone Photoshop their earlobe?<\/p>\n<p>Once I started to notice the anomalies, it was like everything suddenly started to click into place right before my eyes. I started to notice the halo around her hair strands. How her eyes were not aligned. The odd striations and blurring. Then there were casts and artefacts in the background. To casual observers, they would look like <a href=\"https:\/\/en.wikipedia.org\/wiki\/Bokeh\" target=\"_blank\" rel=\"noopener noreferrer\">bokeh<\/a>. But if you have some experience doing photography, you would know instantly they were not bokeh.<\/p>\n<p>They looked pulled\u2014like someone had played with the Liquify tool in Photoshop but dialed the brush up to the extreme. I immediately realised that what I was looking at was not a Photoshopped photo of a woman. In fact, it was an almost seamless blend of one person, digitally composited and superimposed from different elements.<\/p>\n<p>I went on www.thispersondoesnotexist.com and started to generate my own deepfakes. After examining half a dozen or so, I started picking out patterns and anomalies, and I went back to &#8220;Katie&#8221; to study it further. They were all present.<\/p>\n<\/blockquote>\n<h3>Does it really matter?<\/h3>\n<p>In some ways, possibly not. The only real benefit to using a deepfake profile pic is that suspicious people won\u2019t get a result in Google reverse search, TinEye, or any other similar service. 
But anyone doing that for LinkedIn connections or other points of contact probably won\u2019t be spilling the beans on anything they shouldn\u2019t be anyway.<\/p>\n<p>For everyone else, the risk is there and <i>just enough<\/i> to make it all convincing. It\u2019s always been pretty easy to spot someone using stock photography model shots for bogus profile pics. The threat from deepfaked snapshots comes from their sheer, complete and utter ordinariness. Using all that processing power and technology to carve what essentially looks like a non-remarkable human almost sounds revolutionary in its mundaneness.<\/p>\n<p>But ask any experienced social engineer, and they\u2019ll tell you mundane sells. We believe the reality that we\u2019re presented. You\u2019re more likely to tailgate your way into a building dressed as an engineer, or carrying three boxes and a coffee cup, than dressed as a clown or wearing an astonishingly overt spycoat and novelty glasses.<\/p>\n<h3>Spotting a fake<\/h3>\n<p>Once you spend a little time looking at the fake people generated on sites such as this, you\u2019ll find multiple telltale signs that an image has been digitally constructed. We go back to Mustaffa:<\/p>\n<blockquote>\n<p>Look for signs of tampering on the photo by starting with the background. If it appears to be somewhat neutral in appearance, then it&#8217;s time to look for odd noise\/disturbances like streaky hair or earlobes.<\/p>\n<\/blockquote>\n<p>I decided to fire up a site where you guess which of two faces is real and which is fake. In my first batch of shots, you\u2019ll notice the noise\/disturbance so common with AI-generated headshots\u2014it resembles the kind of liquid-looking smear effect you\u2019d get on old photographs you hadn\u2019t developed properly. 
Check out the neck in the below picture:<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake3.png\" data-rel=\"lightbox-0\" title=\"\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"41101\" data-permalink=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/attachment\/fake3\/\" data-orig-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake3.png\" data-orig-size=\"419,262\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"Neck distortions\" data-image-description=\"\" data-medium-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake3-300x188.png\" data-large-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake3.png\" class=\"aligncenter size-medium wp-image-41101\" src=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake3-300x188.png\" alt=\"\" width=\"300\" height=\"188\" srcset=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake3-300x188.png 300w, https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake3.png 419w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\n<p>On a similar note, look at the warping next to the computer-generated man&#8217;s hairline:<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake7.png\" data-rel=\"lightbox-1\" title=\"\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"41102\" 
data-permalink=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/attachment\/fake7\/\" data-orig-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake7.png\" data-orig-size=\"255,257\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"Head distortion\" data-image-description=\"\" data-medium-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake7.png\" data-large-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake7.png\" class=\"wp-image-41102 size-full aligncenter\" src=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake7.png\" alt=\"\" width=\"255\" height=\"257\" srcset=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake7.png 255w, https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake7-150x150.png 150w\" sizes=\"auto, (max-width: 255px) 100vw, 255px\" \/><\/a><\/p>\n<p>These effects also appear in backgrounds quite regularly. 
Look to the right of her ear:<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake2.png\" data-rel=\"lightbox-2\" title=\"\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"41103\" data-permalink=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/attachment\/fake2\/\" data-orig-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake2.png\" data-orig-size=\"410,344\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"Background noise\" data-image-description=\"\" data-medium-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake2-300x252.png\" data-large-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake2.png\" class=\"aligncenter size-medium wp-image-41103\" src=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake2-300x252.png\" alt=\"\" width=\"300\" height=\"252\" srcset=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake2-300x252.png 300w, https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake2.png 410w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\n<p>Backgrounds are definitely a struggle for these images. 
Look at the bizarre furry effect running down the edge of this tree:<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake1.png\" data-rel=\"lightbox-3\" title=\"\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"41104\" data-permalink=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/attachment\/fake1\/\" data-orig-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake1.png\" data-orig-size=\"243,276\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"Fake tree\" data-image-description=\"\" data-medium-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake1.png\" data-large-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake1.png\" class=\"aligncenter size-full wp-image-41104\" src=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake1.png\" alt=\"\" width=\"243\" height=\"276\" \/><\/a><\/p>\n<p>Sometimes the tech just can\u2019t handle what it\u2019s trying to do properly, and you end up with\u2026whatever that\u2019s supposed to be\u2026on the right:<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake4.png\" data-rel=\"lightbox-4\" title=\"\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"41105\" data-permalink=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/attachment\/fake4\/\" 
data-orig-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake4.png\" data-orig-size=\"270,357\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"All eyes on me\" data-image-description=\"\" data-medium-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake4-227x300.png\" data-large-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake4.png\" class=\"aligncenter size-medium wp-image-41105\" src=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake4-227x300.png\" alt=\"\" width=\"227\" height=\"300\" srcset=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake4-227x300.png 227w, https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake4.png 270w\" sizes=\"auto, (max-width: 227px) 100vw, 227px\" \/><\/a><\/p>\n<p style=\"text-align: left\">Also of note are the sharply-defined lines on faces around the eyes and cheeks. 
Not always a giveaway, but helpful to observe alongside other errors.<\/p>\n<p>Remember in ye olden days when you\u2019d crank certain sliders in image editing tools, like sharpness, up to the max and end up with effects similar to the one on this ear?<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake5.png\" data-rel=\"lightbox-5\" title=\"\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"41106\" data-permalink=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/attachment\/fake5\/\" data-orig-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake5.png\" data-orig-size=\"145,187\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"Fun with ears\" data-image-description=\"\" data-medium-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake5.png\" data-large-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake5.png\" class=\"aligncenter wp-image-41106 size-full\" src=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake5.png\" alt=\"\" width=\"145\" height=\"187\" \/><\/a><\/p>\n<p>Small children tend to cause problems, and so too do things involving folds of skin, especially where trying to make a fake person look a certain age is concerned. Another telltale sign you\u2019re dealing with a fake is small sets of incredibly straight vertical lines on or around the cheek or neck areas. 
Meanwhile, here are some entirely unconvincing baby folds:<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake6.png\" data-rel=\"lightbox-6\" title=\"\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"41107\" data-permalink=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/attachment\/fake6\/\" data-orig-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake6.png\" data-orig-size=\"236,240\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"Neck folds\" data-image-description=\"\" data-medium-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake6.png\" data-large-file=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake6.png\" class=\"aligncenter size-full wp-image-41107\" src=\"https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/fake6.png\" alt=\"\" width=\"236\" height=\"240\" \/><\/a><\/p>\n<p>There are edge cases, but in my most recent non-scientific test on <a href=\"https:\/\/www.buzzfeednews.com\/article\/janelytvynenko\/real-or-fake-face-quiz\">Which face is real<\/a>, I was able to guess correctly no fewer than 50 times in a row who was real before I got bored and gave up. I once won 50 games of Tekken in a row at a university bar and let me tell you, that was an awful lot more difficult. 
Either I&#8217;m some sort of unstoppable deepfake-detecting marvel, or it really is quite easy to spot them with a bit of practice.<\/p>\n<h3>Weeding out the fakers<\/h3>\n<p>Deepfakes, then, are definitely here to stay. I suspect they\u2019ll continue to cause the most trouble in their familiar stomping grounds: fake porn clips of celebrities, and paid clips of non-celebrities that can also be used to threaten or blackmail victims. Occasionally, we\u2019ll see another weightless robot turning on its human captors, and some people will fall for it.<\/p>\n<p>Elsewhere, in connected networking profile land, we\u2019ll occasionally come across bogus profiles, and then it\u2019s down to us to make use of all that OPSEC\/threat intel knowledge we\u2019ve built up to scrutinize the kind of roles we\u2019d expect to be targeted: government, policy, law enforcement, and the like.<\/p>\n<p>We can\u2019t get rid of them, and something else will be along soon enough to steal what thunder remains, but we absolutely shouldn\u2019t fear them. 
Instead, to lessen their potential impact, we need to train ourselves to spot the ordinary from the real.<\/p>\n<p><em>Thanks to Munira for her additional commentary.<\/em><\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/\">Deepfakes and LinkedIn: malign interference campaigns<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes Labs<\/a>.<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/\" target=\"bwo\" >https:\/\/blog.malwarebytes.com\/feed\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Wed, 20 Nov 2019 16:00:00 +0000<\/strong><\/p>\n<table cellpadding='10'>\n<tr>\n<td valign='top' align='center'><a href='https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/' title='Deepfakes and LinkedIn: malign interference campaigns'><img src='https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/11\/shutterstock_1430571869.jpg' border='0'  width='300px'  \/><\/a><\/td>\n<\/tr>\n<tr>\n<td valign='top' align='left'>Don&#8217;t discount deepfakes just yet. 
We may not be fooled by phony Mark Zuckerberg anymore, but the discovery of a fake LinkedIn profile sporting a deepfake avatar shows how social engineering can deceive through the mundane.<\/p>\n<p>Categories: <\/p>\n<ul class=\"post-categories\">\n<li><a href=\"https:\/\/blog.malwarebytes.com\/category\/social-engineering\/\" rel=\"category tag\">Social engineering<\/a><\/li>\n<\/ul>\n<p>Tags: <a href=\"https:\/\/blog.malwarebytes.com\/tag\/ai\/\" rel=\"tag\">AI<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/computer-generated\/\" rel=\"tag\">computer generated<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/deepfakes\/\" rel=\"tag\">deepfakes<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/fakes\/\" rel=\"tag\">fakes<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/katie-jones\/\" rel=\"tag\">katie jones<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/linkedin\/\" rel=\"tag\">LinkedIn<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/munira\/\" rel=\"tag\">munira<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/social-engineering\/\" rel=\"tag\">Social Engineering<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/stratcom\/\" rel=\"tag\">stratcom<\/a><\/p>\n<table width='100%'>\n<tr>\n<td align=right>\n<p><b>(<a href='https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/' title='Deepfakes and LinkedIn: malign interference campaigns'>Read more&#8230;<\/a>)<\/b><\/p>\n<\/td>\n<\/tr>\n<\/table>\n<\/td>\n<\/tr>\n<\/table>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2019\/11\/deepfakes-and-linkedin-malign-interference-campaigns\/\">Deepfakes and LinkedIn: malign interference campaigns<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes 
Labs<\/a>.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10488,10378],"tags":[10245,23492,17473,16219,23493,11448,23494,10510,23495],"class_list":["post-16941","post","type-post","status-publish","format-standard","hentry","category-malwarebytes","category-security","tag-ai","tag-computer-generated","tag-deepfakes","tag-fakes","tag-katie-jones","tag-linkedin","tag-munira","tag-social-engineering","tag-stratcom"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/16941","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=16941"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/16941\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=16941"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=16941"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=16941"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}