{"id":18623,"date":"2022-03-30T02:10:05","date_gmt":"2022-03-30T10:10:05","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2022\/03\/30\/news-12356\/"},"modified":"2022-03-30T02:10:05","modified_gmt":"2022-03-30T10:10:05","slug":"news-12356","status":"publish","type":"post","link":"https:\/\/www.palada.net\/index.php\/2022\/03\/30\/news-12356\/","title":{"rendered":"Watch out for LinkedIn fakes who want to get connected"},"content":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Wed, 30 Mar 2022 09:38:37 +0000<\/strong><\/p>\n<p>Despite continued warnings of deepfake chaos during major events, things haven\u2019t worked out the way some thought. Those video <a href=\"https:\/\/blog.malwarebytes.com\/social-engineering\/2022\/03\/deepfake-zelenskyy-video-surfaces-on-compromised-websites\/\">deepfakes are bad, and they remain bad<\/a>. Quite simply, nobody is fooled &#8211; or at least, nobody in a position to make a snap judgement that matters.<\/p>\n<p>As much as we over-dramatise their use in our heads, the video side of deepfaking has a long way to go before it pulls the proverbial wool over our eyes. But an AI-generated still image is rather harder to spot, as you can see on sites such as <a href=\"https:\/\/this-person-does-not-exist.com\/en\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">This Person Does Not Exist<\/a>, and some people are using these fake images on social media.<\/p>\n<h2>When LinkedIn connections go wrong<\/h2>\n<p>Two Stanford University researchers, Ren\u00e9e DiResta and Josh Goldstein, have found <a href=\"https:\/\/www.theregister.com\/2022\/03\/28\/ai_fake_linkedin_faces\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">more than 1,000<\/a> fake LinkedIn profiles using AI-generated faces.<\/p>\n<p>The story begins with someone sending a message to an individual on LinkedIn. 
Nothing odd there, except the recipient happens to know their way around AI generated images.<\/p>\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\">\n<div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">Given that, it\u2019s funny this whole story came about bcs that Keenan Ramsey profile sent a LinkedIn message to <a href=\"https:\/\/twitter.com\/noUpside?ref_src=twsrc%5Etfw\">@noUpside<\/a> \u2014 one of the few people who CAN spot the telltale signs of an AI-generated image.<\/p>\n<p>&quot;The face jumped out at me as being fake,&quot; she says. <a href=\"https:\/\/t.co\/TyoBp2qxIP\">https:\/\/t.co\/TyoBp2qxIP<\/a> <a href=\"https:\/\/t.co\/wCnG62g5Kt\">pic.twitter.com\/wCnG62g5Kt<\/a><\/p>\n<p>&mdash; Shannon Bond (@shannonpareil) <a href=\"https:\/\/twitter.com\/shannonpareil\/status\/1508113234510835717?ref_src=twsrc%5Etfw\">March 27, 2022<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script> <\/div>\n<\/figure>\n<p>The avatar attached to the profile did indeed turn out to be entirely fictitious; \u201cKeenan Ramsey\u201d does not exist. From there, the pretend people were unearthed doing their thing on LinkedIn.<\/p>\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\">\n<div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">Welcome to the latest LinkedIn marketing tactic: using AI-generated faces to drum up sales.<\/p>\n<p>A tool used to spread disinformation has come to the corporate world. 
Stanford\u2019s <a href=\"https:\/\/twitter.com\/noUpside?ref_src=twsrc%5Etfw\">@noUpside<\/a> &amp; <a href=\"https:\/\/twitter.com\/JoshAGoldstein?ref_src=twsrc%5Etfw\">@JoshAGoldstein<\/a> found &gt;1,000 profiles with seemingly fake faces: <a href=\"https:\/\/t.co\/TyoBp2qxIP\">https:\/\/t.co\/TyoBp2qxIP<\/a> <a href=\"https:\/\/t.co\/sOyRZKVe7n\">pic.twitter.com\/sOyRZKVe7n<\/a><\/p>\n<p>&mdash; Shannon Bond (@shannonpareil) <a href=\"https:\/\/twitter.com\/shannonpareil\/status\/1508111096653115393?ref_src=twsrc%5Etfw\">March 27, 2022<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script> <\/div>\n<\/figure>\n<p>These \u201cemployees\u201d were tagged under various businesses, but those businesses said they didn&#8217;t authorize the use of computer-generated profile imagery. Digging into this, the researchers discovered companies selling LinkedIn marketing services. These companies also offered bot\/avatar accounts, which is a no-no from LinkedIn\u2019s perspective.<\/p>\n<figure class=\"wp-block-embed is-type-rich is-provider-twitter wp-block-embed-twitter\">\n<div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">Here\u2019s one: a company called LIA, based in India. For $300 a month, LIA customers can pick one &quot;AI-generated avatar&quot; from hundreds that are &quot;ready-to-use,&quot; according to its website, which was recently scrubbed of all information except its logo. 
<a href=\"https:\/\/t.co\/WXLFbE2Tcx\">pic.twitter.com\/WXLFbE2Tcx<\/a><\/p>\n<p>&mdash; Shannon Bond (@shannonpareil) <a href=\"https:\/\/twitter.com\/shannonpareil\/status\/1508111965113094145?ref_src=twsrc%5Etfw\">March 27, 2022<\/a><\/p><\/blockquote>\n<p><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script> <\/div>\n<\/figure>\n<h2>The long tail of deepfake marketing<\/h2>\n<p>It\u2019s somewhat bizarre that people may be making money from selling web-generated profile pictures to businesses that could just do it themselves given 10 seconds and a web browser. It\u2019s also bizarre that nobody at any point in this daisy-chain of fake profiles seems to know exactly where, how, or why any of this has been happening. Who is responsible? What are these accounts doing besides perhaps bolstering employee count numbers?<\/p>\n<p>Very good questions.<\/p>\n<p>For now, it may be worth paying close attention to random messages and\/or connection requests on LinkedIn. Is the person at the other end who they claim to be, or a business-realm fakeout? 
It may be tricky to pin down a conclusive answer, but I&#8217;d definitely rather know just who wants to get inside my connections network&#8230;and why.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2022\/03\/watch-out-for-linkedin-fakes-who-want-to-get-connected\/\">Watch out for LinkedIn fakes who want to get connected<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes Labs<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Wed, 30 Mar 2022 09:38:37 +0000<\/strong><\/p>\n<p>We take a look at a collection of no fewer than 1,000 profiles on LinkedIn using AI-generated deepfake images for profile pictures.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2022\/03\/watch-out-for-linkedin-fakes-who-want-to-get-connected\/\">Watch out for LinkedIn fakes who want to get connected<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes 
Labs<\/a>.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10488,10378],"tags":[10245,17608,11539,11448,12078,5897,25540],"class_list":["post-18623","post","type-post","status-publish","format-standard","hentry","category-malwarebytes","category-security","tag-ai","tag-deepfake","tag-fake","tag-linkedin","tag-marketing","tag-privacy","tag-profile"],"_links":{"self":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/18623","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=18623"}],"version-history":[{"count":0,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/18623\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=18623"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=18623"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=18623"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}