Mark Zuckerberg Makes Facebook Privacy Sound So Easy

Credit to Author: Brian Barrett | Date: Wed, 11 Apr 2018 01:10:58 +0000

Mark Zuckerberg appeared before Congress Tuesday, and for five hours, senators who appeared to have a halting grasp of the company’s intricacies questioned the Facebook CEO on topics ranging from Russia to artificial intelligence. Zuckerberg for the most part gave considered answers to their questions—except when it came to the specifics of how users can control their privacy.

That Zuckerberg would dodge uncomfortable questions is a disappointment, though maybe no surprise. But when it came to addressing how the company collects and handles data—and what tools it gives you to control that flow of information—Zuckerberg landed repeatedly on a common refrain: Users have complete control over how their data gets used. “This is the most important principle for Facebook: Every piece of content that you share on Facebook, you own and you have complete control over who sees it, and how you share it, and you can remove it at any time,” said Zuckerberg.

But in trying to present this as exculpatory, Zuckerberg misses the point. Offering tools to someone doesn’t help at all if they’re hard to find, and even harder to understand.

Zuckerberg repeatedly cited the “inline” controls that Facebook gives its users. What he’s referring to specifically seems to be the dropdown menu that you see before you post to Facebook, the one that asks who should see this, and lets you whittle down your audience by friend groups, by geography, or not at all.

Which, sure. That helps. But it’s also not what’s at issue here. The creeping concern around Facebook—and Google and other ad-driven platforms—isn’t whether former coworkers can see your current happy hour pics. It’s whether an infinite, invisible web of advertisers, marketers, and app developers can. There’s no inline control for that, no option before you post to keep your political screed or baby videos away from Dove body wash or some contact-lens startup.

For that, you need to dig deep into Facebook’s settings, a click-intensive process full of unclear language and uncertain paths. Here’s a story that walks you through it; it’s well over 2,000 words long. And that’s the condensed version.

"Based on today's hearing, even Facebook must acknowledge they need to do much more to communicate to users how their platform works, what data they collect on Facebook and off, and how that information is used for advertising," says Joe Jerome, policy analyst at the Center for Democracy & Technology. "Mr. Zuckerberg argued that Facebook needs to provide controls where users are, when they're posting photos and messaging friends, but global privacy controls have always been a challenge for Facebook—and any social media platform."

To be honest, all you really need to know about Facebook’s attitude toward what you share with apps and ad networks is that the social media company doesn't put controls for either under its Privacy category. For years, to see which developers might have your information, you’ve had to go to Apps and Websites. To see what advertisers know and see, you have to visit Ads, and decipher inscrutable language like “Can you see online interest-based ads from Facebook?”, which you’ll find under Ads based on your use of websites and apps.

Are these the controls that Zuckerberg thinks give users complete power over their data? It beggars belief, if so. They’re hidden, they’re opaque, and they don’t do enough to communicate what information, precisely, those third parties have about you and what they do with it. And if you have installer’s remorse and want to reclaim your data? Facebook can’t help you with that. You need to contact the developer directly, and hope they listen.

To make matters worse, even Facebook doesn’t necessarily know what happens to your data after an app accesses it. While questioning Zuckerberg, Senator Richard Blumenthal noted that the personality quiz app that exposed up to 87 million people’s data to political firm Cambridge Analytica clearly stated in its terms of service that the data it collected could be sold. If Facebook can’t be bothered to read all the way through an app’s terms of service, how can it expect you to read through its own? (And in fact, Zuckerberg acknowledged that most Facebook users likely don’t.) Until October 2015, Facebook even allowed apps to request access to user inboxes, which meant developers could read any message those people sent or received.

“Going forward, we're going to take a more proactive position on this, and do much more regular spot checks and other reviews of apps, as well as increasing the amount of audits that we do,” Zuckerberg said Tuesday. But privacy advocates argue that any added rigor comes years too late, especially given that Facebook's looseness with user data led to a 2011 FTC consent decree that placed strict requirements on how it handled your information. This is a company, after all, that for years left up a detailed privacy setting that didn’t do what it said. In fact, it barely did anything at all.

“Facebook already had a legal obligation to not misrepresent its privacy settings and to verify the security of all third party apps on its platform and submit audits,” says Sam Lester, consumer privacy fellow at the Electronic Privacy Information Center. “They utterly failed to comply with that order, which I think is clear from today’s hearings.”

Facebook did, in 2015, stop the pervasive data sharing that enabled the Cambridge Analytica fiasco. And it will soon introduce a redesigned settings menu, one that at the very least puts everything in one place. The company this week proactively started pointing some users to the Apps and Websites setting that shows people which apps could sift through their info. All of that counts as progress. But privacy is not a solved problem for Facebook, especially given its repeated, failed attempts at self-correction. Zuckerberg has spent the last 14 years apologizing for privacy slip-ups. There’s not much room for benefit of the doubt.

“For obvious reasons we’re not concerned with what Facebook’s fixes to this problem are,” says Lester. “They’re the company that created these problems. We can’t be looking to the company that caused these problems to fix them.”

If nothing else, the Cambridge Analytica scandal has shown people what Facebook is and always has been: an alchemist that spins your data into gold. That’s not going to change. But the amount of transparency Facebook gives you around it still needs to—if only Mark Zuckerberg could see it.