Credit to Author: Jonny Evans| Date: Fri, 31 Aug 2018 06:45:00 -0700
Apple recently told the U.S. Congress that it sees customer privacy as a “human right,” though at the time that commitment didn’t extend to how third-party developers treat the data they get from iOS apps. Now it does.
The policy states that developers must:
This means Apple is insisting developers make verifiable and actionable promises about the data they gather, what they do with that information, and to whom they sell it.
This is a massively important step, and I predict that some apps – potentially including some relatively popular apps – may find themselves unable to make these promises, following legal advice.
(I’m not pointing any fingers, but any app developer whose primary business is selling people’s data to those vast unregulated data warehousing firms may find these steps make things a little trickier.)
In technology, the weakest security target is always the user, which is why criminals target users with sophisticated and personalised scams.
The same is true of marketing – do you think billions would be spent on advertising if it didn’t make a difference? In the online age, marketing has become increasingly sophisticated and personalised, to the point where (as the scandalous behaviour of Cambridge Analytica and others of that ilk shows) those techniques may have crossed the line into criminality.
“We’ve never believed that these detailed profiles of people, that have incredibly deep personal information that is patched together from several sources, should exist,” Apple CEO Tim Cook recently said.
They can be “abused against our democracy,” he observed.
Apple has always insisted that it values user privacy, but critics (often the same critics who never concerned themselves with privacy when other firms exploited it for profit) have pointed out that just because Apple protects privacy doesn’t mean its third-party developers do the same.
Those apps collecting your private data may lack Apple’s commitment to user privacy, they argue, before insisting Apple should do something about it.
Apple already collects much less data about customers than any other big tech firm, and that which it does collect tends to be heavily anonymized.
“The truth is, we could make a ton of money if we monetized our customer. … We’ve elected not to do that. We’re not going to traffic in your personal life. Privacy to us is a human right, a civil liberty,” Tim Cook has said.
Apple is now moving to extend those privacy protections a little, and has made several moves to try to prevent abuse of personal data for marketing and other forms of social engineering in recent months:
Apple’s new move to insist that its developers also protect end-user privacy may have profound consequences, as it implies that Apple will now monitor and pursue developers who fail to meet the commitments they make.
This will pose some inconvenience for companies whose business is built on gathering and exploiting user data, of course, but it also creates an even bigger divide between platforms that care about personal privacy and those that don’t.
Who knows how this could impact future platform choice?
Google+? If you use social media and happen to be a Google+ user, why not join AppleHolic’s Kool Aid Corner community and get involved with the conversation as we pursue the spirit of the New Model Apple?
Got a story? Please drop me a line via Twitter and let me know. I’d like it if you chose to follow me on Twitter so I can let you know about new articles I publish and reports I find.