Going dark: encryption and law enforcement

Credit to Author: William Tsing | Date: Tue, 25 Jul 2017 15:00:18 +0000

We’re hearing it a lot lately: encryption is an insurmountable roadblock between law enforcement and keeping us safe. They can’t gather intelligence on terrorists because they use encryption. They can’t convict criminals because they won’t hand over encryption keys. They can’t stop bad things from happening because bad guys won’t unlock their phones. Therefore—strictly to keep us safe—the tech industry must provide them with means to weaken, circumvent, or otherwise subvert encryption, all for the public good. No “backdoors”, mind you; they simply want a way for encryption to work for good people, but not bad. This is dangerous nonsense, for a lot of reasons.

1. It’s technically incorrect

Encryption derives its value from providing end-to-end protection for data in transit, as well as protection for what we call “data at rest.” Governments have asked both for a means of observing data in transit and for a way to retrieve data at rest from devices of interest. They also insist that they have no interest in weakening encryption as a whole, just in retrieving the information they need for an investigation. From a technical perspective, this is contradictory gibberish. An encryption algorithm either protects sensitive data or it doesn’t. The only ways to give a third party access to the plaintext are to hand over the private keys of the communicants in question, or to maintain an exploitable flaw in the algorithm that the third party can take advantage of. Despite government protestations to the contrary, this makes intuitive sense: how could you possibly build encryption that is secure against one party (hackers) but not another (the government)? Algorithms cannot discern good intentions, so they must be secure against everyone.
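
As a minimal sketch of that last point (using Python’s `cryptography` package; the key and messages here are invented), decryption depends only on possession of the key. There is no parameter for “trusted third party”: whoever holds the key gets the plaintext, and whoever does not, does not.

```python
# Illustrative only: the cipher cares about the key, not about who presents it.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # the communicants' shared secret
ciphertext = Fernet(key).encrypt(b"meet at the usual place")

# Anyone holding the key -- owner, hacker, or government -- recovers the plaintext.
print(Fernet(key).decrypt(ciphertext))

# Anyone without it gets nothing, no matter how good their intentions are.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: the algorithm cannot tell 'good' requests from 'bad' ones")
```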

2. They have a myriad of other options to get what they need

Let’s assume for a moment that a government entity has a reasonable suspicion that a crime has been committed, a reasonable certainty that a particular person did it, and a reasonable suspicion that evidence leading to a conviction lies on an encrypted device. Historically, government entities have not checked all these boxes before attempting to subvert encryption, but let’s give them the benefit of the doubt for the moment. Options available to various levels of law enforcement and/or intelligence include, but are not limited to:

  • Eavesdropping on unencrypted or misconfigured comms of a suspect’s contact
  • Collecting unencrypted metadata to characterize the encrypted data (see the sketch after this list)
  • Detaining the suspect indefinitely until they “voluntarily” decrypt the device
  • Geolocation to place the suspect in proximity to the crime
  • Link analysis to place the suspect in social contact with confirmed criminals
  • Grabbing unencrypted data at rest from compliant third-party providers
  • Eavesdropping on other channels where the suspect describes the encrypted data
  • Wrench decryption
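
On the metadata point above, here is a small, entirely invented illustration: even when content is encrypted, flow records alone reveal who a suspect talks to, when, and roughly how much data moves. The hostnames, times, and sizes below are made up.

```python
# Invented flow records: no decrypted content, only metadata about encrypted traffic.
from collections import Counter

flows = [  # (timestamp, source, destination, bytes transferred) -- all values made up
    ("2017-07-20 01:12", "suspect-phone", "chat.example.net", 4812),
    ("2017-07-20 01:14", "suspect-phone", "chat.example.net", 92304),
    ("2017-07-21 01:10", "suspect-phone", "chat.example.net", 5120),
    ("2017-07-21 03:40", "suspect-phone", "maps.example.com", 1044),
]

# Who the suspect talks to, and how often, falls straight out of the metadata.
print(Counter(dst for _, _, dst, _ in flows))

# Repeated late-night, large transfers to one endpoint characterize the encrypted
# traffic (e.g. uploads) without ever touching the plaintext.
print([ts for ts, _, dst, size in flows if dst == "chat.example.net" and size > 50000])
```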

Given the panoply of tools available to the authorities, why would they need to start an investigation by breaking the one tool available to the average user that keeps their data safe from hackers?

3. They’re not really “going dark”

In 1993, the government proposed a cryptographic device called the “Clipper chip,” which would encrypt data while depositing the private keys in a “key escrow” controlled by law enforcement. Rather than breaking the encryption, law enforcement would simply have had a decryption key available. For everyone. An academic analysis of why this was a stunningly bad idea can be found here.
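
To make the escrow problem concrete, here is a toy sketch in Python (using the `cryptography` package; this illustrates the general escrow idea, not the actual Clipper/Skipjack design, and every name and message is invented): each message key is also wrapped under a single escrow key, so whoever holds, steals, or leaks that one key can read every user’s traffic at once.

```python
# Toy key escrow, for illustration only -- not the real Clipper/Skipjack scheme.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()        # held "safely" by law enforcement

def escrowed_encrypt(plaintext: bytes):
    message_key = Fernet.generate_key()   # fresh key for this message
    ciphertext = Fernet(message_key).encrypt(plaintext)
    # The access field: the message key, wrapped under the escrow key.
    access_field = Fernet(escrow_key).encrypt(message_key)
    return ciphertext, access_field

ciphertext, access_field = escrowed_encrypt(b"entirely lawful private conversation")

# Whoever holds (or steals, or leaks) the single escrow key unwraps the message
# key and reads the traffic -- and the same key opens everyone else's, too.
recovered_key = Fernet(escrow_key).decrypt(access_field)
print(Fernet(recovered_key).decrypt(ciphertext))
```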

Given that this program was shuttered in response to overwhelmingly negative public opinion, have law enforcement and intelligence agencies been unable to collect data for the past 24 years? Or have they turned to the other investigatory tools available to them, as appropriate?

4. If we do give them a backdoor, what would they do with it?

1984-style heavy-handed tactics are unlikely at present, but a government breach that results in loss of control of the backdoor? Much more likely. The breach at OPM most likely endangered the information of up to a third of adult Americans, depending on whom you count and how. (We don’t know for sure, because the government didn’t say how they counted.) That breach involved the data of sensitive, high-value government employees. Would they do any better with a backdoor that impacts technology used by pretty much everyone?

No, they wouldn’t.

Let’s take a look at how they secure their own networks, post-OPM. Oh dear…

If the richest and most powerful government in the world cannot secure its own classified data, why should we trust it with ours? The former head of the FBI once called for an “adult conversation” on encryption. We agree. So here’s a modest counter-proposal:

  • Stop over-classifying cyberthreat intelligence. The security community cannot fix what it does not know about. Threat intelligence more than a year old is effectively worthless.
  • Send subject matter experts to participate in ISACs, not “liaisons.”
  • Collaborate in the ISACs in good faith: shared intelligence should have context, and collaboration should extend beyond bare lists of IOCs (see the sketch after this list).
  • Exchange analytic tradecraft: government analysts often use techniques that, while obscure, are not classified. Sharing them would improve tradecraft on both sides.
  • Meet the DHS standard for securing your own machines, classified or otherwise. No one would trust someone with a key escrow if those keys were held in a leaky colander.
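
On the “context” point above, a purely illustrative contrast (the field names and values are invented, not any real feed format): a bare IOC list versus the kind of enriched record defenders can actually act on.

```python
# What sharing too often looks like: bare indicators, no context.
bare_iocs = ["198.51.100.23", "d41d8cd98f00b204e9800998ecf8427e"]

# What useful sharing looks like: the same indicator with enough context to judge
# relevance, freshness, and response. (Field names and values are invented.)
enriched_ioc = {
    "indicator": "198.51.100.23",
    "type": "ipv4",
    "first_seen": "2017-06-30",
    "last_seen": "2017-07-18",
    "campaign": "example phishing wave targeting HR departments",
    "observed_behavior": "credential-harvesting page mimicking a webmail login",
    "confidence": "medium",
    "recommended_action": "block outbound connections; search proxy logs since June",
}
```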

We think these are reasonable requests that can help keep people safe, without breaking the encryption the world relies on daily to do business, conduct private conversations, and on occasion, express thoughts without fear of reprisal. We hope you agree.
