{"id":19013,"date":"2022-05-11T10:45:22","date_gmt":"2022-05-11T18:45:22","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2022\/05\/11\/news-12746\/"},"modified":"2022-05-11T10:45:22","modified_gmt":"2022-05-11T18:45:22","slug":"news-12746","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2022\/05\/11\/news-12746\/","title":{"rendered":"The EU Wants Big Tech to Scan Your Private Chats for Child Abuse"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/627bd8eeb6048c47d506c6c1\/master\/pass\/Laser_Sec_GettyImages-1169691904.jpg\"\/><\/p>\n<p><strong>Credit to Author: Matt Burgess| Date: Wed, 11 May 2022 15:45:20 +0000<\/strong><\/p>\n<p class=\"BylineWrapper-iiTsTb hAGfXd byline bylines__byline\" data-testid=\"BylineWrapper\" itemprop=\"author\" itemtype=\"http:\/\/schema.org\/Person\"><span itemprop=\"name\" class=\"BylineNamesWrapper-dbkCxf erRIa-D\"><span data-testid=\"BylineName\" class=\"BylineName-cKXFOb UCAzg byline__name\"><a class=\"BaseWrap-sc-TURhJ BaseText-fFzBQt BaseLink-gZQqBA BylineLink-eZnyPI eTiIvU mEZDb fNdcwQ bKZMMS byline__name-link button\" href=\"\/author\/matt-burgess\">Matt Burgess<\/a><\/span><\/span><\/p>\n<p>To revist this article, visit My Profile, then <a href=\"\/account\/saved\">View saved stories<\/a>.<\/p>\n<p>To revist this article, visit My Profile, then <a href=\"\/account\/saved\">View saved stories<\/a>.<\/p>\n<p><span class=\"lead-in-text-callout\">All of your<\/span> <a href=\"https:\/\/www.wired.com\/story\/whatsapp-communities-feature\/\">WhatsApp photos<\/a>, <a href=\"https:\/\/www.wired.com\/story\/ios-14-privacy-security-features\/\">iMessage texts<\/a>, and Snapchat videos could be scanned to check for child sexual abuse images and videos under newly proposed European rules. 
The plans, experts warn, may undermine the <a href=\"https:\/\/www.wired.com\/story\/opinion-encryption-has-never-been-more-essential-or-threatened\/\">end-to-end encryption<\/a> that protects billions of messages sent every day and hamper people\u2019s online privacy.<\/p>\n<p class=\"paywall\">The European Commission today <a data-offer-url=\"https:\/\/ec.europa.eu\/commission\/presscorner\/detail\/en\/ip_22_2976\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/ec.europa.eu\/commission\/presscorner\/detail\/en\/ip_22_2976&quot;}\" href=\"https:\/\/ec.europa.eu\/commission\/presscorner\/detail\/en\/ip_22_2976\" rel=\"nofollow noopener\" target=\"_blank\">revealed<\/a> long-awaited proposals aimed at tackling the huge volumes of child sexual abuse material, also known as CSAM, uploaded to the web each year. The proposed law creates a new EU Centre to deal with child abuse content and introduces obligations for tech companies to \u201cdetect, report, block and remove\u201d CSAM from their platforms. 
The law, announced by Europe\u2019s commissioner for home affairs, Ylva Johansson, says tech companies have failed to voluntarily remove abuse content, and it has been <a data-offer-url=\"https:\/\/twitter.com\/INHOPE_PR?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/twitter.com\/INHOPE_PR?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor&quot;}\" href=\"https:\/\/twitter.com\/INHOPE_PR?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor\" rel=\"nofollow noopener\" target=\"_blank\">welcomed<\/a> by child protection and safety groups.<\/p>\n<p class=\"paywall\">Under the plans, tech companies\u2014ranging from web hosting services to messaging platforms\u2014can be ordered to \u201cdetect\u201d both new and previously discovered CSAM, as well as potential instances of \u201cgrooming.\u201d The detection could take place in chat messages, files uploaded to online services, or on websites that host abusive material. The plans echo an effort by Apple last year to scan photos on people\u2019s iPhones for abusive content before it was uploaded to iCloud. Apple <a href=\"https:\/\/www.wired.com\/story\/apple-icloud-photo-scan-csam-pause-backlash\/\">paused its efforts after a widespread backlash<\/a>.<\/p>\n<p class=\"paywall\">If passed, the European legislation would require tech companies to conduct risk assessments for their services to assess the levels of CSAM on their platforms and their existing prevention measures. If necessary, regulators or courts may then issue \u201cdetection orders\u201d that say tech companies must start \u201cinstalling and operating technologies\u201d to detect CSAM. These detection orders would be issued for specific periods of time. 
The draft legislation doesn\u2019t specify what technologies must be installed or how they will operate\u2014these will be vetted by the new EU Centre\u2014but says they should be used even when end-to-end encryption is in place.<\/p>\n<p class=\"paywall\">The European proposal to scan people\u2019s messages has been met with frustration from civil rights groups and security experts, who say it\u2019s likely to undermine the end-to-end encryption that\u2019s become the default on messaging apps such as <a href=\"https:\/\/www.wired.com\/tag\/imessage\/\">iMessage<\/a>, <a href=\"https:\/\/www.wired.co.uk\/article\/whatsapp-tricks-encryption\">WhatsApp<\/a>, and <a href=\"https:\/\/www.wired.com\/story\/signal-tips-private-messaging-encryption\/\">Signal<\/a>. \u201cIncredibly disappointing to see a proposed EU regulation on the internet fail to protect end-to-end encryption,\u201d WhatsApp head Will Cathcart <a data-offer-url=\"https:\/\/twitter.com\/wcathcart\/status\/1524292160169779201?s=21&amp;t=0DBL3pU-6zIotnnjM8tJ7g\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/twitter.com\/wcathcart\/status\/1524292160169779201?s=21&amp;t=0DBL3pU-6zIotnnjM8tJ7g&quot;}\" href=\"https:\/\/twitter.com\/wcathcart\/status\/1524292160169779201?s=21&amp;t=0DBL3pU-6zIotnnjM8tJ7g\" rel=\"nofollow noopener\" target=\"_blank\">tweeted<\/a>. 
\u201cThis proposal would force companies to scan every person&#x27;s messages and put EU citizens&#x27; privacy and security at serious risk.\u201d Any system that weakens end-to-end encryption could be abused or expanded to look for other types of content, <a data-offer-url=\"https:\/\/twitter.com\/matthew_d_green\/status\/1524208402058067974\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/twitter.com\/matthew_d_green\/status\/1524208402058067974&quot;}\" href=\"https:\/\/twitter.com\/matthew_d_green\/status\/1524208402058067974\" rel=\"nofollow noopener\" target=\"_blank\">researchers say<\/a>.<\/p>\n<p class=\"paywall\">\u201cYou either have E2EE or you don\u2019t,\u201d says Alan Woodward, a cybersecurity professor from the University of Surrey. End-to-end encryption protects people\u2019s privacy and security by ensuring that only the sender and receiver of messages can see their content. For example, Meta, the owner of WhatsApp, doesn\u2019t have any way to read your messages or mine their contents for data. The EU\u2019s draft regulation says solutions shouldn\u2019t weaken encryption and says it includes safeguards to ensure this doesn\u2019t happen; however, it doesn\u2019t give specifics for how this would work.<\/p>\n<p class=\"paywall\">\u201cThat being so, there is only one logical solution: client-side scanning where the content is examined when it is decrypted on the user&#x27;s device for them to view\/read,\u201d Woodward says. Last year, Apple announced it would introduce client-side scanning\u2014scanning done on people\u2019s iPhones rather than Apple\u2019s servers\u2014to check photos for known CSAM being uploaded to iCloud. 
The move sparked protests from civil rights groups and even Edward Snowden about the potential for surveillance, leading <a href=\"https:\/\/www.wired.com\/story\/apple-icloud-photo-scan-csam-pause-backlash\/\">Apple to pause its plans a month after initially announcing them<\/a>. (Apple declined to comment for this story.)<\/p>\n<p class=\"paywall\">For tech companies, detecting CSAM on their platforms and scanning some communications is not new. Companies operating in the United States are required to report any CSAM they find or that is reported to them by users to the National Center for Missing and Exploited Children (NCMEC), a US-based nonprofit. More than <a data-offer-url=\"https:\/\/www.missingkids.org\/content\/dam\/missingkids\/pdfs\/2021-reports-by-esp.pdf\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.missingkids.org\/content\/dam\/missingkids\/pdfs\/2021-reports-by-esp.pdf&quot;}\" href=\"https:\/\/www.missingkids.org\/content\/dam\/missingkids\/pdfs\/2021-reports-by-esp.pdf\" rel=\"nofollow noopener\" target=\"_blank\">29 million reports<\/a>, containing 39 million images and 44 million videos, were made to NCMEC last year alone. Under the new EU rules, the EU Centre will receive CSAM reports from tech companies.<\/p>\n<p class=\"paywall\">\u201cA lot of companies are not doing the detection today,\u201d Johansson said in a press conference introducing the legislation. \u201cThis is not a proposal on encryption, this is a proposal on child sexual abuse material,\u201d Johansson said, adding that the law is \u201cnot about reading communication\u201d but detecting illegal abuse content.<\/p>\n<p class=\"paywall\">At the moment, tech companies find CSAM online in different ways. 
And the amount of CSAM found is increasing as tech companies get better at detecting and reporting abuse\u2014although <a data-offer-url=\"https:\/\/www.nytimes.com\/interactive\/2019\/11\/09\/us\/internet-child-sex-abuse.html?mtrref=undefined&amp;gwh=25FDCC899A8BE539135856EB902592AE&amp;gwt=pay&amp;assetType=PAYWALL\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.nytimes.com\/interactive\/2019\/11\/09\/us\/internet-child-sex-abuse.html?mtrref=undefined&amp;gwh=25FDCC899A8BE539135856EB902592AE&amp;gwt=pay&amp;assetType=PAYWALL&quot;}\" href=\"https:\/\/www.nytimes.com\/interactive\/2019\/11\/09\/us\/internet-child-sex-abuse.html?mtrref=undefined&amp;gwh=25FDCC899A8BE539135856EB902592AE&amp;gwt=pay&amp;assetType=PAYWALL\" rel=\"nofollow noopener\" target=\"_blank\">some are much better than others<\/a>. In some cases, <a data-offer-url=\"https:\/\/blog.google\/around-the-globe\/google-europe\/using-ai-help-organizations-detect-and-report-child-sexual-abuse-material-online\/\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/blog.google\/around-the-globe\/google-europe\/using-ai-help-organizations-detect-and-report-child-sexual-abuse-material-online\/&quot;}\" href=\"https:\/\/blog.google\/around-the-globe\/google-europe\/using-ai-help-organizations-detect-and-report-child-sexual-abuse-material-online\/\" rel=\"nofollow noopener\" target=\"_blank\">AI is being used to hunt down previously unseen CSAM<\/a>. Duplicates of existing abuse photos and videos can be detected using \u201chashing systems,\u201d where abuse content is assigned a fingerprint that can be spotted when it\u2019s uploaded to the web again. 
More than 200 companies, from Google to Apple, use <a data-offer-url=\"https:\/\/www.microsoft.com\/en-us\/photodna\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.microsoft.com\/en-us\/photodna&quot;}\" href=\"https:\/\/www.microsoft.com\/en-us\/photodna\" rel=\"nofollow noopener\" target=\"_blank\">Microsoft&#x27;s PhotoDNA hashing system<\/a> to scan millions of files shared online. However, to do this, systems need to have access to the messages and files people are sending, which is not possible when end-to-end encryption is in place.<\/p>\n<p class=\"paywall\">\u201cIn addition to detecting CSAM, obligations will exist to detect the solicitation of children (\u2018grooming\u2019), which can only mean that conversations will need to be read 24\/7,\u201d says Diego Naranjo, head of policy at the civil liberties group European Digital Rights. \u201cThis is a disaster for confidentiality of communications. Companies will be asked (via detection orders) or incentivized (via risk mitigation measures) to offer less secure services for everyone if they want to comply with these obligations.\u201d<\/p>\n<p class=\"paywall\">Discussions about protecting children online and how this can be done with end-to-end encryption are hugely complex, technical, and combined with the horrors of the crimes against vulnerable young people. 
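<p class="paywall">The fingerprint-matching idea behind such hashing systems can be sketched in greatly simplified form. The sketch below is an illustration, not PhotoDNA: it uses a plain SHA-256 digest, which only catches byte-identical copies, whereas real systems use perceptual hashes that survive resizing and re-encoding. The fingerprint database here is hypothetical.</p>

```python
# Simplified sketch of hash-based duplicate detection. Real systems such as
# PhotoDNA use *perceptual* hashes that tolerate resizing and re-encoding;
# a cryptographic hash, used here for clarity, matches only exact copies.
import hashlib

# Hypothetical database of fingerprints of known, previously flagged files.
# (This entry is the SHA-256 of b"test", standing in for real material.)
known_fingerprints = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint (here, a SHA-256 hex digest) for an upload."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """Flag an upload whose fingerprint matches a known entry."""
    return fingerprint(data) in known_fingerprints

print(is_known_match(b"test"))         # matches the entry above -> True
print(is_known_match(b"other bytes"))  # unknown content -> False
```

<p class="paywall">The point the sketch makes concrete is why this conflicts with end-to-end encryption: matching can only happen where the plaintext bytes are available, either on a server or, via client-side scanning, on the user's device.</p>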
Research from Unicef, the UN\u2019s children\u2019s fund, <a data-offer-url=\"https:\/\/www.unicef-irc.org\/publications\/pdf\/Encryption_privacy_and_children%E2%80%99s_right_to_protection_from_harm.pdf\" class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.unicef-irc.org\/publications\/pdf\/Encryption_privacy_and_children%E2%80%99s_right_to_protection_from_harm.pdf&quot;}\" href=\"https:\/\/www.unicef-irc.org\/publications\/pdf\/Encryption_privacy_and_children%E2%80%99s_right_to_protection_from_harm.pdf\" rel=\"nofollow noopener\" target=\"_blank\">published in 2020<\/a> says encryption is needed to protect people\u2019s privacy\u2014including children\u2019s\u2014but adds that it \u201cimpedes\u201d efforts to remove content and identify the people sharing it. For years, <a href=\"https:\/\/www.wired.com\/2016\/12\/year-encryption-won\/\">law enforcement agencies<\/a> around the world have pushed to create ways to bypass or weaken encryption. \u201cI\u2019m not saying privacy at any cost, and I think we can all agree child abuse is abhorrent,\u201d Woodward says, \u201cbut there needs to be a proper, public, dispassionate debate about whether the risks of what might emerge are worth the true effectiveness in fighting child abuse.\u201d<\/p>\n<p class=\"paywall\">Increasingly, researchers and tech companies have been focusing on safety tools that can exist alongside end-to-end encryption. Proposals include <a href=\"https:\/\/www.wired.com\/story\/encrypted-messaging-privacy-security-metadata\/\">using metadata from encrypted messages<\/a>\u2014the who, how, what, and why of messages, not their content\u2014to analyze people\u2019s behavior and potentially spot criminality. 
One recent report by the nonprofit Business for Social Responsibility, which was commissioned by Meta, found that end-to-end encryption is an <a href=\"https:\/\/www.wired.com\/story\/meta-end-to-end-encryption-bsr-report\/\">overwhelmingly positive force for upholding people&#x27;s human rights<\/a>. It made 45 recommendations for how encryption and safety can go together without involving access to people\u2019s communications. When the report was published in April, Lindsey Andersen, BSR&#x27;s associate director for human rights, told WIRED: \u201cContrary to popular belief, there actually is a lot that can be done even without access to messages.\u201d<\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/europe-csam-scanning-law-chat-encryption\" target=\"bwo\">https:\/\/www.wired.com\/story\/europe-csam-scanning-law-chat-encryption<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/627bd8eeb6048c47d506c6c1\/master\/pass\/Laser_Sec_GettyImages-1169691904.jpg\"\/><\/p>\n<p><strong>Credit to Author: Matt Burgess| Date: Wed, 11 May 2022 15:45:20 +0000<\/strong><\/p>\n<p>Europe\u2019s proposed child protection laws could undermine end-to-end encryption for billions of 
people.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[714,21382],"class_list":["post-19013","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-security","tag-security-privacy"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/19013","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=19013"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/19013\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=19013"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=19013"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=19013"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}