{"id":23823,"date":"2024-01-30T06:10:12","date_gmt":"2024-01-30T14:10:12","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2024\/01\/30\/news-17553\/"},"modified":"2024-01-30T06:10:12","modified_gmt":"2024-01-30T14:10:12","slug":"news-17553","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2024\/01\/30\/news-17553\/","title":{"rendered":"Deepfake Taylor Swift images circulate online, politicians call for laws to ban deepfake creation"},"content":{"rendered":"\n<p>Deepfake images of Taylor Swift have made serious waves. Explicit images of the popstar, generated by Artificial Intelligence (AI), were posted on social media and Telegram, where they were viewed millions of times.<\/p>\n<p>The impact of the deepfakes was enormous. Social media platform X (formerly known as Twitter) even blocked searches for Taylor Swift&#8217;s name, saying:<\/p>\n<blockquote class=\"wp-block-quote\">\n<p>\u201cThis is a temporary action and done with an abundance of caution as we prioritize safety on this issue.\u201d<\/p>\n<\/blockquote>\n<p>X&#8217;s policies say it <a href=\"https:\/\/help.twitter.com\/en\/rules-and-policies\/manipulated-media\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">explicitly prohibits<\/a> the sharing of &#8220;synthetic, manipulated, or out-of-context media that may deceive or confuse people and lead to harm&#8221;, as well as the posting of Non-Consensual Nudity (NCN) images. Even so, it was apparently not easy to quickly remove the images and take action against the accounts posting them.<\/p>\n<p>Searches for Taylor Swift and some related terms were also blocked on Instagram, which instead displayed a warning that \u201cthe search terms used were sometimes associated with activities of dangerous organizations and individuals.\u201d<\/p>\n<p>The uproar about the fake images of the popstar was so loud that some politicians started calling for laws to prohibit the creation of deepfakes. 
While the creation of deepfakes is prohibited in many countries and in some US states, there are currently no US federal laws against the sharing or creation of deepfake images.<\/p>\n<p>In 2020 we <a href=\"https:\/\/www.malwarebytes.com\/blog\/news\/2020\/01\/deepfakes-laws-and-proposals-flood-us\">discussed deepfake legislation in the United States<\/a>. In a rare example of legislative haste, roughly a dozen state and federal bills were introduced in 2019 to regulate deepfakes, mostly out of fear that they could upend democracy.<\/p>\n<p>Although it is doubtful that any law would have stopped the creation of the images, it might have slowed or limited how quickly they spread.<\/p>\n<p>However, deepfakes started as a new form of pornography, and most of the deepfakes created and posted online today are still pornographic in nature. They also disproportionately target women, which should make appropriate legislation a bigger priority than merely being able to recognize deepfakes.<\/p>\n<p>As Adam Dodge, founder of the nonprofit End Technology-Enabled Abuse (EndTAB), said a few years ago:<\/p>\n<blockquote class=\"wp-block-quote\">\n<p>\u201cThe reality is, when it comes to the battle against deepfakes, everybody is focused on detection, on debunking and unmasking a video as a deepfake. That doesn\u2019t help women, because the people watching those videos don\u2019t care that they\u2019re fake.\u201d<\/p>\n<\/blockquote>\n<p>Well-known women, such as actresses and musicians, are particularly at risk of falling victim to this type of abuse.<\/p>\n<p>Taylor Swift herself is furious about the AI images circulating online and is considering legal action against the deepfake porn site hosting them.<\/p>\n<p>Taylor Swift has a legal team at her disposal, but if you are the victim of &#8220;revenge porn&#8221; or other forms of non-consensual nudity, you should know it\u2019s much easier to take down nonconsensual porn content than it used to be. 
A growing number of companies will voluntarily take down nonconsensual porn on their platforms, regardless of whether the victim owns the copyright.<\/p>\n<p>For step-by-step instructions on how to report and take down nonconsensual porn across multiple technology platforms including <a href=\"https:\/\/help.instagram.com\/489507671074566\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Instagram<\/a>, <a href=\"https:\/\/help.twitter.com\/en\/safety-and-security\/report-abusive-behavior\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">X (Twitter)<\/a>,\u00a0<a href=\"https:\/\/support.reddithelp.com\/hc\/en-us\/articles\/360058309512-How-do-I-report-a-post-or-comment\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Reddit<\/a>,\u00a0<a href=\"https:\/\/help.tumblr.com\/hc\/en-us\/articles\/226270628-Reporting-Content\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Tumblr<\/a>, <a href=\"https:\/\/support.google.com\/legal\/answer\/2463296\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Google<\/a>, <a href=\"https:\/\/www.facebook.com\/help\/1753719584844061\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Facebook<\/a>, and <a href=\"https:\/\/www.tiktok.com\/legal\/page\/global\/reporting-illegal-content\/en\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">TikTok<\/a>, you can use Cyber Civil Rights Initiative\u2019s new\u00a0<a href=\"http:\/\/perma.cc\/7RQ6-LJZL\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Online Removal Guide<\/a>.<\/p>\n<hr class=\"wp-block-separator has-text-color has-cyan-bluish-gray-color has-alpha-channel-opacity has-cyan-bluish-gray-background-color has-background is-style-wide\" \/>\n<p><strong>We don\u2019t just report on threats\u2014we remove them<\/strong><\/p>\n<p>Cybersecurity risks should never spread beyond a headline. 
Keep threats off your devices by&nbsp;<a href=\"https:\/\/www.malwarebytes.com\/for-home\">downloading Malwarebytes today<\/a>.<\/p>\n<p><a href=\"https:\/\/www.malwarebytes.com\/blog\/news\/2024\/01\/deepfake-taylor-swift-images-circulate-online-politicians-call-for-laws-to-ban-deepfake-creation\" target=\"bwo\" >https:\/\/blog.malwarebytes.com\/feed\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Explicit deepfake images of Taylor Swift caused problems on social media and prompted politicians to call for more legislation.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10488,10378],"tags":[17608,256,32,26699,16353,30798],"class_list":["post-23823","post","type-post","status-publish","format-standard","hentry","category-malwarebytes","category-security","tag-deepfake","tag-legislation","tag-news","tag-personal","tag-revenge-porn","tag-taylor-swiift"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/23823","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=23823"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/23823\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=23823"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=23823"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=23823"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}