{"id":16722,"date":"2019-10-29T10:45:21","date_gmt":"2019-10-29T18:45:21","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2019\/10\/29\/news-10461\/"},"modified":"2019-10-29T10:45:21","modified_gmt":"2019-10-29T18:45:21","slug":"news-10461","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2019\/10\/29\/news-10461\/","title":{"rendered":"How to Keep Your Siri, Alexa, and Google Assistant Voice Recordings Private"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5db786b2dc63930009ef005d\/master\/pass\/Security_Sirihomepod_916301872.jpg\"\/><\/p>\n<p><strong>Credit to Author: Lily Hay Newman| Date: Tue, 29 Oct 2019 16:59:17 +0000<\/strong><\/p>\n<p class=\"content-header__row content-header__dek\">Alexa, Siri, and Google Assistant now all give you ways to opt out of human transcription of your voice snippets. Do it.<\/p>\n<p>After months of revelations and apologies, all the major <a href=\"https:\/\/www.wired.com\/story\/google-assistant-human-transcription-privacy\/\">smart assistant makers have revamped<\/a> how they handle human review of audio snippets. Amazon Alexa, Google Assistant, Apple Siri, and Microsoft Cortana <a href=\"https:\/\/www.wired.com\/story\/whos-listening-talk-google-assistant\/\">were all using third-party contractors<\/a> to transcribe and vet recorded snippets, adding some human brain power to underlying machine learning algorithms. But the backlash over the lack of transparency spurred new customer controls. And with the release of Apple&#x27;s iOS 13.2 on Monday, you now have one more way to rein that data collection in.<\/p>\n<p>Even if Siri isn&#x27;t your smart assistant of choice, it&#x27;s still a good time to take stock of how you have things set up on whatever platform you use. Each service has its own mix of options and controls. Here&#x27;s how to take the human element out of Siri, Alexa, Google Assistant, and Cortana. 
And once that&#x27;s done, tell a friend to do the same.<\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/hey-apple-opt-out-is-useless\">Apple paused human review of Siri user audio snippets<\/a> at the beginning of August and <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.apple.com\/newsroom\/2019\/08\/improving-siris-privacy-protections\/&quot;}\" href=\"https:\/\/www.apple.com\/newsroom\/2019\/08\/improving-siris-privacy-protections\/\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">published an apology<\/a> later that month for the lack of transparency. Now, almost three months later, human review is resuming, but it&#x27;s opt-in and handled only by Apple employees rather than contractors.<\/p>\n<p>With iOS 13.2, your Siri snippets no longer get passed along for human review by default. Now Apple asks you whether you&#x27;d like to opt in during the iOS 13.2 setup flow. If you make a mistake and opt in when you didn&#x27;t mean to, or change your mind down the road, you can go to <strong>Settings &gt; Privacy &gt; Analytics &amp; Improvements &gt; Improve Siri &amp; Dictation<\/strong> and toggle the switch off.<\/p>\n<p>Crucially, even if you enable this data-sharing, you can now delete all the audio Apple has collected from you at any time. To do so, go to <strong>Settings &gt; Siri &amp; Search &gt; Siri &amp; Dictation History &gt; Delete Siri &amp; Dictation History<\/strong> to wipe everything out.<\/p>\n<p>Following significant blowback, Amazon was the first smart assistant company to centralize and expand its user controls for voice recording retention. 
You can review the voice recordings Amazon has stored for your account by going to <strong>Settings &gt; Alexa Privacy<\/strong> in the Alexa app or through <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.amazon.com\/alexaprivacysettings&quot;}\" href=\"https:\/\/www.amazon.com\/alexaprivacysettings\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">Amazon&#x27;s website<\/a>. There you can delete entries one by one, by date range, by device, or en masse. You can also delete recordings by device on the <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.amazon.com\/hz\/mycd\/myx&quot;}\" href=\"https:\/\/www.amazon.com\/hz\/mycd\/myx\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">Manage Your Content and Devices<\/a> page. You can also enable deletion by voice, creating a clean slate by saying &quot;Alexa, delete what I just said,&quot; or &quot;Alexa, delete everything I said today.&quot; To turn that on in the Alexa app or on <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.amazon.com\/alexaprivacysettings&quot;}\" href=\"https:\/\/www.amazon.com\/alexaprivacysettings\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">Amazon&#x27;s website<\/a>, go to <strong>Settings &gt; Alexa Privacy &gt; Review Voice History<\/strong>.<\/p>\n<p>To opt out of sending your Alexa recordings for human review, go to <strong>Alexa Account<\/strong> in the Alexa app, then <strong>Alexa Privacy &gt; Manage how your data improves Alexa<\/strong>, and turn off <strong>Help Develop New Features<\/strong> and <strong>Use Messages to Improve Transcriptions<\/strong>.<\/p>\n<p>Keep in mind that these settings control only what Amazon retains, and don&#x27;t necessarily apply to third-party developers that may 
have collected your voice data through Alexa Skills.<\/p>\n<p>Google <a href=\"https:\/\/www.wired.com\/story\/google-assistant-human-transcription-privacy\/\">offers a number of ways<\/a> to stop audio snippet retention or delete recordings. <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/support.google.com\/websearch\/answer\/6030020?co=GENIE.Platform=Desktop&amp;hl=en&quot;}\" href=\"https:\/\/support.google.com\/websearch\/answer\/6030020?co=GENIE.Platform=Desktop&amp;hl=en\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">This page<\/a> lays out the different flows for deleting or opting out in a desktop browser, on Android, or on iOS. To delete recordings on desktop, open your <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/myaccount.google.com&quot;}\" href=\"https:\/\/myaccount.google.com\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">Google account<\/a> and choose <strong>Data &amp; personalization<\/strong> in the left navigation panel. There, under Activity controls, choose <strong>Web &amp; App Activity<\/strong> and then <strong>Manage Activity<\/strong>. Here you can scroll through the list of entries\u2014those with an audio icon next to them include a recording\u2014and delete individual items one at a time. Or, on this same page, click the <strong>More<\/strong> hamburger menu in the upper right, choose <strong>Delete activity by<\/strong>, and under <strong>Delete by date<\/strong> select <strong>All time<\/strong>. 
Then at the bottom choose <strong>Delete<\/strong>.<\/p>\n<p>To opt out of letting Google collect recordings in the first place\u2014which also means no human transcription\u2014open your <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/myaccount.google.com&quot;}\" href=\"https:\/\/myaccount.google.com\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">Google account<\/a> and choose <strong>Data &amp; personalization<\/strong> in the left navigation panel. There, under Activity controls, choose <strong>Web &amp; App Activity<\/strong> and then make sure the box next to <strong>Include voice and audio recordings<\/strong> is unchecked.<\/p>\n<p>Unlike the other big smart assistant developers, Microsoft simply updated its Cortana privacy policy in August to further clarify that audio snippets may be transcribed and evaluated by human reviewers\u2014both Microsoft employees and contractors. The company never paused review or barred third parties from accessing Cortana user data. As with its peers, Microsoft says that the data it does collect is anonymized. To manage or delete your audio recordings of interactions with Cortana, make sure you&#x27;re logged into your Microsoft account and then go to Microsoft&#x27;s <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;http:\/\/account.microsoft.com\/privacy&quot;}\" href=\"http:\/\/account.microsoft.com\/privacy\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">Privacy Dashboard<\/a>.<\/p>\n<p>Regardless of which platform you use, keep in mind that these expanded controls, while positive and necessary, don&#x27;t change the fundamental concept of smart assistants. These services run on devices that contain a microphone and can be woken up to &quot;hear&quot; things you&#x27;re saying and process them on a faraway server. 
As with any internet-enabled technology\u2014but particularly one that involves a potentially live mic\u2014there are always going to be privacy considerations no matter how much control you have. Even Rick Osterloh, Google&#x27;s senior vice president of devices and services, <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.bbc.com\/news\/technology-50048144&quot;}\" href=\"https:\/\/www.bbc.com\/news\/technology-50048144\" rel=\"nofollow noopener noreferrer\" target=\"_blank\">warns houseguests<\/a> that he has a Google Home when they come in.<\/p>\n<p>If these devices are a helpful and delightful force in your life, that&#x27;s fine! Just take steps to protect your privacy and be like Rick: Always remember that a gadget might be listening.<\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/keep-siri-alexa-google-assistant-recordings-private\" target=\"bwo\" >https:\/\/www.wired.com\/category\/security\/feed\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5db786b2dc63930009ef005d\/master\/pass\/Security_Sirihomepod_916301872.jpg\"\/><\/p>\n<p><strong>Credit to Author: Lily Hay Newman| Date: Tue, 29 Oct 2019 16:59:17 +0000<\/strong><\/p>\n<p>Alexa, Siri, and Google Assistant now all give you ways to opt out of human transcription of your voice snippets. 
Do it.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[22740,714,21382],"class_list":["post-16722","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-business-artificial-intelligence","tag-security","tag-security-privacy"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/16722","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=16722"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/16722\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=16722"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=16722"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=16722"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}