{"id":13742,"date":"2018-11-01T10:45:02","date_gmt":"2018-11-01T18:45:02","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2018\/11\/01\/news-7509\/"},"modified":"2018-11-01T10:45:02","modified_gmt":"2018-11-01T18:45:02","slug":"news-7509","status":"publish","type":"post","link":"https:\/\/www.palada.net\/index.php\/2018\/11\/01\/news-7509\/","title":{"rendered":"The Privacy Battle to Save Google From Itself"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5bca63da6d53d208ff9b4d12\/master\/pass\/FISK18_Wired_Eye_texture-still.jpg\"\/><\/p>\n<p><strong>Credit to Author: Lily Hay Newman| Date: Thu, 01 Nov 2018 14:11:06 +0000<\/strong><\/p>\n<p><span class=\"lede\">Over two days <\/span>during the summer of 2009, experts from inside and outside Google met to forge a roadmap for how the company would approach user privacy. At the time, <a href=\"https:\/\/www.wired.com\/2009\/01\/ff-killgoogle\/\">Google was under fire<\/a> for its data collection practices and user tracking. The summit was designed to codify ways that users could feel more in control.<\/p>\n<p>Engineer Amanda Walker, then in her third year at Google and now the company\u2019s software engineering manager of privacy infrastructure, jotted down notes on a paper worksheet during one of the summit\u2019s sessions. \u201cHMW: Mitigate Impact of bad Gov\u2019t + 3rd party requests,\u201d she wrote, using shorthand for \u201chow might we.\u201d A few suggestions followed: \u201cDiscourage abusive requests. Make privacy measurable\/surface rising threats. Industry wide.\u201d It was the seed of what would eventually become Google\u2019s suite of transparency reports that, among other things, disclose government requests for data.<\/p>\n<p>It also was just one of several features the group brainstormed that summer that became a reality. An idea called \u201cPersona management\u201d became Chrome and Android profiles. 
\u201cUniversal preferences\u201d became My Account and My Activity. And \u201cPrivate search\u201d turned into controls that let users see, pause, and delete search queries and other activity.<\/p>\n<p>Longtime Google employees remember the 2009 privacy summit as a turning point. \u201cA lot of these were a lot more work than we anticipated at the time, but it\u2019s reassuring to me that I think we got the big things right,\u201d Walker says.<\/p>\n<p>And yet, nearly a decade later, privacy controversies continue to plague Google. Just in recent months, <a href=\"https:\/\/www.apnews.com\/828aefab64d4411bac257a07c1af0ecb\" target=\"_blank\">the Associated Press revealed<\/a> that Google continued to store user location data on Android and iOS even when users paused collection in a privacy setting called Location History. At the end of September, Chrome had to <a href=\"https:\/\/www.wired.com\/story\/google-chrome-login-privacy\/\">walk back a change<\/a> to user logins meant to improve privacy on shared devices after the revision prompted a different set of concerns. Google then <a href=\"https:\/\/www.wired.com\/story\/googles-privacy-whiplash-shows-big-techs-inherent-contradictions\/\">shuttered Google+<\/a> in October, after <a href=\"https:\/\/www.wsj.com\/articles\/google-exposed-user-data-feared-repercussions-of-disclosing-to-public-1539017194\" target=\"_blank\"><em>The Wall Street Journal<\/em><\/a> reported on a previously undisclosed data exposure that left personal information from more than 500,000 of the social network\u2019s users out in the open. And Google is once again <a href=\"https:\/\/www.wired.com\/story\/wired-25-sundar-pichai-china-censored-search-engine\/\">building censored services for China<\/a>.<\/p>\n<p>In this seemingly unshakeable cycle of improvements and gaffes, it&#x27;s nearly impossible to make a full accounting of Google&#x27;s user privacy impacts and protections. 
But it&#x27;s critical to understand how the people on the front lines of that fight think about their jobs, and how it fits in with the fundamental truth of how Google makes money.<\/p>\n<p>Google\u2019s privacy apparatus\u2014which spans the globe and includes dedicated standalone teams, groups within other teams, and an extensive leadership structure\u2014comprises thousands of employees and billions of dollars in cumulative investment. More than a dozen Google employees who work on privacy at all levels talked with WIRED in recent weeks about the massive scale and scope of these efforts. Every employee\u2014from research scientists to engineers, program managers, and executives\u2014described a single shared goal: to respect Google users and help them understand and control their data as they generate it in real-time on Google\u2019s services.<\/p>\n<p>But Google is not a consumer software company, or even a search company. It\u2019s an ad company. It collects exhaustive data about its users in the service of brokering ad sales around the web. To do so, Google requires an extensive understanding of the backgrounds, browsing habits, preferences, purchases, and lives of as many web users as possible, gleaned through massive data aggregation and analysis. In third quarter earnings <a href=\"https:\/\/abc.xyz\/investor\/static\/pdf\/2018Q3_alphabet_earnings_release.pdf?cache=d17140f\" target=\"_blank\">announced<\/a> last week, Google\u2019s parent company Alphabet reported $33.7 billion in revenue. 
About 86 percent of that came from Google\u2019s ad business.<\/p>\n<p>\u201cGoogle does a good job of protecting your data from hackers, protecting you from phishing, making it easier to zero out your search history or go incognito,\u201d says Douglas Schmidt, a computer science researcher at Vanderbilt University who has <a href=\"https:\/\/digitalcontentnext.org\/wp-content\/uploads\/2018\/08\/DCN-Google-Data-Collection-Paper.pdf\" target=\"_blank\">studied<\/a> Google&#x27;s user data collection and retention policies. \u201cBut their business model is to collect as much data about you as possible and cross-correlate it so they can try to link your online persona with your offline persona. This tracking is just absolutely essential to their business. Surveillance capitalism is a perfect phrase for it.\u201d<\/p>\n<p>&quot;We saw and had to tackle these challenges years and years before most other people.&quot;<\/p>\n<p name=\"inset-left\" class=\"inset-left-component__el\">Lea Kissner, Google<\/p>\n<p>And yet Google has also played a major role in creating the superstructure of what corporate user data protections and transparency mechanisms look like today. Transparency reports have become a staple among tech giants, as have other user security and privacy features Google offered early, like tailored settings walkthroughs. And while Apple only recently <a href=\"https:\/\/www.reuters.com\/article\/us-apple-privacy\/apple-gives-u-s-users-tool-to-see-what-data-it-has-collected-idUSKCN1MR296\" target=\"_blank\">introduced<\/a> an option to download data\u2014prompted by Europe\u2019s GDPR omnibus privacy law\u2014Google launched its first such tool, known as Takeout, in 2011. The company also continues to improve and refine its options for user privacy controls. 
One recent move involves surfacing information about user data flow and settings options <a href=\"https:\/\/www.wired.com\/story\/google-your-data-search-privacy\/\">directly in the main screens<\/a> of search results, so users are actively prompted to consider these issues all the time.<\/p>\n<p>\u201cWe saw and had to tackle these challenges years and years before most other people,\u201d says Lea Kissner, Google\u2019s global lead of privacy technology, who has been at the company for more than 11 years and oversees the NightWatch privacy audit program. \u201cWhen I look back at where we were and how much we know now and how much we\u2019ve built, I\u2019m really proud of what we did, but you\u2019re never going to be done.\u201d<\/p>\n<p>Google\u2019s privacy-focused employees say they see no conflict between their work and the cash-generating side of the business, and that they don\u2019t feel pressure to pull punches.<\/p>\n<p>\u201cWe do a pretty good job of firewalling the ads business from the products we build,\u201d says Ben Smith, a Google fellow and vice president of engineering. \u201cBut ads do fund a whole lot of free services. When we talk about building for everyone we want to build for the people who can\u2019t afford an expensive phone and can\u2019t afford a $20 per month subscription. And I think that democratization of access to data is a good thing for society and the world.\u201d<\/p>\n<p>Google can afford to develop top-quality consumer products\u2014complete with expansive user security and abuse protections\u2014and offer them at no monetary cost to anyone who wants to use them worldwide. Not many companies can. Google also funds efforts to improve web performance, stability, and security that raise the bar for the internet at large. But whether all of this is \u201cfree\u201d is subject to debate. 
Google users pay for the services, in a very real sense, with their personal data.<\/p>\n<p>\u201cI think the big problem is that we give much more data to Google than it needs,\u201d says Guillaume Chaslot, a former Google engineer who worked on YouTube\u2019s recommendations algorithm and now runs the watchdog group AlgoTransparency. \u201cWhen something is free, we behave irrationally, and that\u2019s how users behave with Google. It makes no sense that Google keeps our data forever.\u201d<\/p>\n<p>But from a business perspective, it makes plenty of sense. \u201cWhen you depend on insight from data, well, you need the data,\u201d says Lukasz Olejnik, a security and privacy researcher and member of the W3C Technical Architecture Group.<\/p>\n<p><span class=\"lede\">Both current and <\/span>former Google privacy employees insist that there is no internal pressure to water down privacy protections.<\/p>\n<p>\u201cOne of the things that was really persistent at Google, and which was really hard to explain to outsiders, was just how committed everyone was to privacy,\u201d says Yonatan Zunger, a former senior privacy engineer at Google who left in mid-2017 to work on privacy engineering and data protection at the workplace behavior startup Humu. \u201cI pretty much never had to convince anyone of its importance.\u201d<\/p>\n<p>Google has also increasingly prioritized building in privacy protections for new services and features early in the development process. Led by Kissner, the effort has helped avoid tensions that arise when developers try to add protections when a deadline looms. Just how soon those privacy considerations kick in, though, is unclear. 
In a September Congressional hearing about a potential censored search engine for China, known as Project Dragonfly, Keith Enright, Google\u2019s chief privacy officer, <a href=\"https:\/\/www.wired.com\/story\/congress-google-project-dragonfly-questions\/\">testified<\/a> that his team was not yet involved in the project.<\/p>\n<p>&quot;It makes no sense that Google keeps our data forever.&quot;<\/p>\n<p name=\"inset-left\" class=\"inset-left-component__el\">Guillaume Chaslot, Former Google Engineer<\/p>\n<p>Meanwhile, Google has also devoted significant resources to developing its Security and Privacy Checkup tools, which walk users through a sort of explanatory checklist of how Google\u2019s data controls work and what options are available. The project has a special emphasis on developing privacy language that is actually understandable, and doing so for more than 15 languages that Google supports, so nothing is lost in translation. \u201cUsers are not the experts in privacy and security, it\u2019s actually Google,\u201d says Guemmy Kim, product management lead for Google Account security. \u201cGoogle should be telling users what\u2019s wrong, we should point out the anomalies, and guide users through their settings.\u201d<\/p>\n<p>And Google is often on the front lines of rigorous artificial intelligence, computer science, and digital privacy research, thanks to a deep bullpen of former academics who continue to publish under Google\u2019s auspices. Privacy research coming from inside Google potentially poses conflicts of interest\u2014you wouldn\u2019t hire a lion to research antelope safety. But academics, including those who have investigated privacy behaviors in Google services, say its research is well-regarded.<\/p>\n<p>\u201cI think their academic work on privacy is solid,\u201d says Gunes Acar, a postdoctoral researcher at Princeton, who studies digital data flow and overreach. 
\u201cPrivacy-related papers from Google researchers and engineers are published at top venues and are of top quality.\u201d<\/p>\n<p>In the past few years, for instance, Google researchers have helped develop machine learning techniques that can build models off of disparate data sets, so there never needs to be one centralized repository of the information. The mechanism, known as <a href=\"https:\/\/ai.googleblog.com\/2017\/04\/federated-learning-collaborative.html\" target=\"_blank\">federated learning<\/a>, allows Google (or anyone) to develop predictive algorithms locally on your device or any user devices without needing to remove the underlying data. This means that the models can train and mature on a collective data set contributed by millions of devices without sending the information to an entity\u2019s servers somewhere else.<\/p>\n<p>The technique dovetails in many ways with the concept of <a href=\"https:\/\/www.wired.com\/2016\/06\/apples-differential-privacy-collecting-data\/\">differential privacy<\/a>, the statistical process of analyzing data from a population without learning about individuals in it. Both are next-generation techniques that reduce the amount of personal user data an entity like Google holds, which has the added benefit of improving privacy defenses against criminal hackers, intelligence agencies, or other government intrusions.<\/p>\n<p>\u201cI was hired in the big buildup of security at Google about nine years ago with the explicit mandate of looking at new things that push the envelope,\u201d says \u00dalfar Erlingsson, a senior staff research scientist who heads work on improving machine learning algorithm protections. \u201cHaving worked in security and privacy for 25 years I know that there\u2019s usually not a good solution\u2014usually there\u2019s a bad solution and then we struggle a lot to make it work. 
But with machine learning we can train these machines in such a way that they truly don\u2019t capture any details about people.\u201d<\/p>\n<p>Google has also led on and expanded its work to produce transparency reports. The project has grown from an annual report on government requests launched in 2010 <a href=\"https:\/\/transparencyreport.google.com\" target=\"_blank\">into an array<\/a> of analyses and data sets for users to track over time on a range of issues like content removals due to copyright, YouTube community guidelines enforcement, search entry removals under European privacy law, and even a report about political advertising on Google. Michee Smith, the lead project manager for transparency reports, oversees a team of 10 to 15 engineers, product people, policy experts, and lawyers who work together to keep the reports coming and collaborate with various teams around Google to get the right data. The group prioritizes making its reports as easy as possible for people to understand and dig through.<\/p>\n<p>\u201cAs a company we\u2019re getting big, but we\u2019re not trying to get evil just because we\u2019re getting big,\u201d she says. \u201cWith these really important topics, we\u2019re putting data out there, so if you see a trend or you notice something you can hold us accountable. The average user is not aware of all the laws and policies that can impact the flow of information online, but we are. So my ultimate goal is for users to feel like we have your back.\u201d<\/p>\n<p><span class=\"lede\">And yet Google <\/span>regularly stumbles. 
Some of the company\u2019s issues fit in with broader revelations over the past couple of years that massive user platforms like Facebook have underestimated, or failed to consider, the fundamental impact their services\u2014and business priorities\u2014could have on world societies.<\/p>\n<p>\u201cGoogle is strong on having people with remarkable security and privacy expertise, but reconciling privacy guarantees with business needs is a challenging topic anywhere,\u201d independent researcher Olejnik says. \u201cA potential issue is underestimating the possible misuse of high-impact technologies like Google\u2019s Real-Time Bidding ads platform. I would argue that the risks could have been foreseen.\u201d Over the past few years, Google has been criticized, and <a href=\"https:\/\/marketingland.com\/american-brands-stop-advertising-google-youtube-extremist-209998\" target=\"_blank\">even boycotted<\/a>, for allowing <a href=\"https:\/\/www.wsj.com\/articles\/googles-youtube-has-continued-showing-brands-ads-with-racist-and-other-objectionable-videos-1490380551\" target=\"_blank\">inappropriate or problematic content<\/a> on its ad networks.<\/p>\n<p>In spite of more than a decade of industry-leading work on privacy from Google, some see the carousel of errors as proof of a sort of Google privacy <em>Groundhog Day<\/em>. 
But the company in many cases also created the technology that solves those same problems not just for itself, but the whole industry.<\/p>\n<p>Many of Google\u2019s critics also note that they believe it is possible\u2014at least from a technological perspective\u2014to develop user services that are funded by ads, but still silo and control data enough to balance user privacy with business interests.<\/p>\n<p>\u201cIt\u2019s entirely possible for a company like Google to make good, usable products that strike a balance between privacy and profit,\u201d Johns Hopkins cryptographer Matthew Green <a href=\"https:\/\/slate.com\/technology\/2018\/10\/google-is-losing-users-trust.html\" target=\"_blank\">wrote<\/a> at the beginning of October after publicly railing against a <a href=\"https:\/\/www.wired.com\/story\/google-chrome-login-privacy\/\">problematic change<\/a> to Chrome. \u201cIt\u2019s just that without some countervailing pressure forcing Google to hold up their end of the bargain, it\u2019s going to be increasingly hard for Google executives to justify it.\u201d<\/p>\n<p>&quot;They have the ability to change the trajectory here, but they don\u2019t allow for any idea that things could be a bit different.&quot;<\/p>\n<p name=\"inset-left\" class=\"inset-left-component__el\">Jason Kint, Digital Content Next<\/p>\n<p>Nearly everyone WIRED spoke to at Google for this story attributed the company\u2019s privacy mistakes and failures to Google\u2019s unique position at the forefront of encountering and dealing with unprecedented data flow challenges. \u201cGoogle, by virtue of what we do and the velocity that we do it at, we are necessarily the petri dish that privacy engineering is being cultivated in,\u201d says Google&#x27;s Enright. 
\u201cMost of our fumbles and missteps in my experience can be tracked to us leaning so far into our own optimism that we failed to benefit from the wisdom of others.\u201d<\/p>\n<p>The other option, though, would simply be to move a bit more slowly. Google\u2019s critics say the company could do a better job of considering privacy and developing safeguards <em>before<\/em> its business innovations create problems.<\/p>\n<p>\u201cThere\u2019s no doubt that there are some of the smartest minds in both privacy, data protection, law, and engineering inside these companies\u2014Google especially,\u201d says Jason Kint, CEO of the digital publishing trade organization Digital Content Next. (WIRED parent company Cond\u00e9 Nast is a member.) \u201cThey pride themselves on moonshots, they\u2019ve got just immense amounts of wealth and profitable business margin and growth. But they say, \u2018well, this is our business model and if we don\u2019t have this business model then we\u2019re going to have to charge for access.\u2019 It\u2019s just a very binary view. They have the ability to change the trajectory here, but they don\u2019t allow for any idea that things could be a bit different.\u201d<\/p>\n<p>Zunger, the former Google privacy engineer, points out that a big challenge for the company is that research and surveys consistently show that many people don\u2019t really understand their own privacy-related concerns, beyond vague awareness that some kind of danger exists. 
As a result, he says, people level criticisms against and requests of Google that aren\u2019t necessarily constructive or actionable in themselves.<\/p>\n<p>But Zunger notes an even more subtle reason that people working at Google may not see the same contradictions embedded in the company that some outsiders view as inherent.<\/p>\n<p>\u201cThere&#x27;s one aspect which is always going to be hard to address at a company like Google, which is when people have concerns that the mere existence of a single large pile of data is itself dangerous,\u201d Zunger says. \u201cPeople who feel this way generally aren&#x27;t going to come work at Google, and so this kind of concern is generally not represented very well. When Googlers address it, they do so by asking the more concrete question of \u2018OK, what risks could the existence of this data create?\u2019 They don\u2019t try to ask the meta-question of \u2018well, what if the data didn&#x27;t exist at all?\u2019\u201d<\/p>\n<p>In thinking about Google\u2019s extensive efforts to safeguard user privacy and the struggles it has faced in trying to do so, this question articulates a radical alternate paradigm\u2014one that Google seems unlikely to convene a summit over. What if the data didn\u2019t exist at all?<\/p>\n<p class=\"related-cne-video-component__dek\">Thanks to an assist from Congress, your cable company has the legal right to sell your web-browsing data without your consent. 
This is how to protect your data from prying eyes.<\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/google-privacy-data\" target=\"bwo\" >https:\/\/www.wired.com\/category\/security\/feed\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5bca63da6d53d208ff9b4d12\/master\/pass\/FISK18_Wired_Eye_texture-still.jpg\"\/><\/p>\n<p><strong>Credit to Author: Lily Hay Newman| Date: Thu, 01 Nov 2018 14:11:06 +0000<\/strong><\/p>\n<p>Interviews with over a dozen current and former Google employees highlight a commitment to privacy\u2014and the inherent tensions that creates.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[714],"class_list":["post-13742","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-security"],"_links":{"self":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/13742","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=13742"}],"version-history":[{"count":0,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/13742\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=13742"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=13742"},{"taxonomy":"p
ost_tag","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=13742"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}