With latest mobile security hole, could we at least focus on the right things?

Credit to Author: Evan Schuman | Date: Wed, 13 Feb 2019 03:00:00 -0800

A bunch of apps from some major players — including Expedia, Hollister, Air Canada, Abercrombie & Fitch, Hotels.com and Singapore Airlines — recently came to grief because of a security/privacy hole in a third-party analytics service they all used, according to a report from TechCrunch. The incident exposed extremely sensitive customer information, including payment card and password data shared in clear text. That sort of thing shouldn't be happening, and yet everyone seems focused on the wrong lesson.

The analytics service, called Glassbox, captures everything a user does inside a host app, including keystrokes entered and the spots on the touchscreen the user tapped or clicked. It may also capture screenshots. In every case, the host apps gave users insufficient privacy disclosures, or none at all. And, as already mentioned, the service transmitted sensitive data in clear text.

Of these two issues, which do you think Apple jumped on? If you said, “Recklessly sharing passwords and payment card data,” you haven’t been paying attention.

“Protecting user privacy is paramount in the Apple ecosystem,” Apple said in a statement. “Our App Store Review Guidelines require that apps request explicit user consent and provide a clear visual indication when recording, logging, or otherwise making a record of user activity. We have notified the developers that are in violation of these strict privacy terms and guidelines, and will take immediate action if necessary.”

And in a letter that Apple sent to developers — obtained by TechCrunch — Apple wrote, "Your app uses analytics software to collect and send user or device data to a third party without the user's consent. Apps must request explicit user consent and provide a clear visual indication when recording, logging, or otherwise making a record of user activity." Apple gave the developer less than a day to remove the code and resubmit the app, and if it didn't meet that deadline, the app would be removed from the App Store, the email said, according to the TechCrunch story.

What about the massive clear-text security hole? Isn't Apple just a wee bit concerned about that?

By the way, this is hardly a new problem. A little more than five years ago, this column reported that Starbucks had almost the identical problem — sharing passwords in clear text — courtesy of its own third-party service, a crash-reporting tool called Crashlytics, which captured data when an app crashed so developers could later identify the cause of the crash.

Here are some of the problems with the privacy-disclosure issue. Of course apps should disclose all of this. There's no argument there. But consider two facts. First, users are notorious for not reading privacy policies. Companies will put almost anything in those policies, knowing that the best place to hide a secret is in a privacy policy.

Second, unlike the Starbucks Crashlytics incident from a half-decade ago, Glassbox does not appear to have captured everything the user was doing while the germane app was active; it captured only direct interactions with the app itself. In other words, if I had the Air Canada app launched and then veered away briefly to perform a Google search or respond to a text, Glassbox, as far as I can tell, wouldn't grab that search or that text. That's something, anyway.

My point is that I am puzzled about which secret interactions (beyond the passwords, bank data and payment card specifics in plain text, which I very much do have a problem with) are being exposed that users don't already assume the app's owners know. For example, if I use the Amazon app and do lots of searches about various products, I am going to assume that Amazon knows every single thing I do with its app. Don't you? Do you really think you can use the Google iPhone app or the Spotify Android app and have those interactions not known by those companies?

The privacy policy issues I care about — consider Uber’s privacy “investigation” — are ones where the company is doing something that its customers would not typically suspect.

Paul Bischoff, privacy advocate with Comparitech.com, argued that “the data collected and sent to the app developers might not be properly secured. If the app developers do not take measures to properly mask sensitive information in their apps, then unencrypted screenshots containing passwords and credit card information could be accessed or intercepted by attackers. I think it’s ultimately up to Apple to solve this problem. Apple should better vet the apps that use session replay services to ensure they’re secure and that they obtain opt-in consent, or session replay services should be banned from the App Store altogether. It’s worth mentioning that many apps and websites use A/B testing to figure out what users are clicking on, but this data is usually aggregated and can’t be connected to an individual, and they don’t take screenshots. So alternatives do exist.”
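The masking Bischoff describes can happen on the device, before a captured event ever leaves it. Here is a minimal sketch of that idea; the field names, the `redact_event` helper, and the event shape are all hypothetical, not Glassbox's actual API.

```python
import re

# Hypothetical field names assumed sensitive; a real app would tune this
# list to its own forms and screens.
SENSITIVE_KEYS = {"password", "card_number", "cvv", "ssn"}

# Crude match for a 13-19 digit payment card number embedded in free text.
CARD_PATTERN = re.compile(r"\b\d{13,19}\b")

def redact_event(event: dict) -> dict:
    """Return a copy of a captured analytics event with sensitive
    values masked, so clear-text secrets never reach the collector."""
    clean = {}
    for key, value in event.items():
        if key.lower() in SENSITIVE_KEYS:
            clean[key] = "***REDACTED***"
        elif isinstance(value, str) and CARD_PATTERN.search(value):
            clean[key] = CARD_PATTERN.sub("***REDACTED***", value)
        else:
            clean[key] = value
    return clean

event = {"screen": "checkout", "password": "hunter2",
         "note": "paid with 4111111111111111"}
print(redact_event(event))
```

The point of the sketch is where the redaction runs: on the client, before transmission, so even a collector that logs everything never sees the secret.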

Bischoff makes a good point, but I’m not confident that those specific suggestions would help.

First, it’s hardly viable for Apple or Google to aggressively police all of the apps — and each and every app update — for their platforms. I hate to agree with Apple or Google on a privacy or security matter, but it makes far more sense for Apple and Google to set policies and requirements and then let each app maker police its own app.

What the heck kind of pen testing could Expedia or Abercrombie & Fitch have done that somehow missed highly sensitive data moving to their servers in clear text? The problem is that these large companies simply trust well-regarded third-party services far too much. Please, people: Test everything that touches your app. You can't rely on Google or Apple to backstop you.
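One simple test that would have caught this: plant known "canary" secrets during a test session, capture the app's outbound traffic with an intercepting proxy, and scan the raw request bodies for those canaries. The sketch below is an assumption about how such a check could look; the captured payloads here are just a list of byte strings standing in for a proxy log (a tool like mitmproxy fills that role in practice).

```python
# Hypothetical canary values planted during a test session. If any of
# them appears verbatim in captured traffic, it traveled unencrypted
# and unmasked.
CANARIES = {
    "password": "Tr0ub4dor&3-canary",
    "card": "4111111111111111",
}

def find_cleartext_leaks(payloads: list[bytes]) -> list[str]:
    """Return the names of any canary secrets found verbatim in the
    captured request bodies."""
    leaks = []
    for name, secret in CANARIES.items():
        if any(secret.encode() in body for body in payloads):
            leaks.append(name)
    return leaks

# Simulated traffic: the first request leaks the test password in clear
# text; the second is opaque (encrypted or masked) bytes.
captured = [
    b'{"event": "login", "password": "Tr0ub4dor&3-canary"}',
    b"\x8a\x12...",
]
print(find_cleartext_leaks(captured))
```

A check like this is cheap to run against every build, including the traffic generated by third-party SDKs the team didn't write, which is exactly the blind spot this incident exposed.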
