[09-12-2013 There is an update to this article. See below.]
To set the tone, here’s the Guardian describing what intelligence agencies are doing to overcome their biggest hindrance: “the use of ubiquitous encryption across the internet.”
[M]ethods include covert measures to ensure NSA control over setting of international encryption standards, the use of supercomputers to break encryption with “brute force”, and — the most closely guarded secret of all — collaboration with technology companies and Internet service providers themselves.
Through these covert partnerships, the agencies have inserted secret vulnerabilities — known as backdoors or trapdoors — into commercial encryption software.
It’s like a Sherlock Holmes mystery, with each new release of intelligence-agency documents providing another clue as to how intensely citizens are being surveilled. We even have our modern-day digital detectives, who help interpret these clues.
With that, I’d like to introduce our first sleuth, Poul-Henning Kamp, a Unix guru who is synonymous with the FreeBSD project. Poul-Henning wrote an eye-opening article for ACM titled, “More Encryption is not the Solution.” Poul-Henning shakes things up right away by offering the following prediction:
“The recent exposure of the dragnet-style surveillance of Internet traffic has provoked a number of responses that are variations of the general formula, ‘More encryption is the solution.’ This is not the case. In fact, more encryption will probably only make the privacy crisis worse than it already is.”
Poul-Henning then offers three “Inconvenient Facts about Privacy” to explain why encryption does not ensure privacy:
- Inconvenient fact number one: Politics trumps cryptography. Nation-states offer their citizens a choice, unlock encrypted files or go to jail.
- Inconvenient fact number two: Not everybody has the right to privacy. For example, in most nation-states: prisoners are only allowed private communications with their attorneys; employees give up large chunks of privacy as part of their employment agreement; and finally, most citizens are now witnessing the loss of their privacy through judicial oversight.
- Inconvenient fact number three: Encryption will be broken if need be. If a nation-state determines that someone should not have any privacy, it will do everything possible to make it so.
When I started this article, I intended to devote the entire piece to Poul-Henning’s ACM paper and how he builds a case for his “Inconvenient Facts.” That all changed two days ago, when the Guardian released new documents proving Poul-Henning correct.
My reporter curiosity had me wondering, so I asked Poul-Henning if he knew about these particular documents before they were made public: “No, I simply looked at the plausible NSA budget (that was also before the “black budget” was released) and thought about how I would use the money if I were in charge of NSA.”
As you will see in a bit, Poul-Henning was scarily accurate.
This brings us to our next digital detective: fellow Minnesotan, author, and world-renowned security expert, Bruce Schneier. It was Bruce’s article, “NSA surveillance: A guide to staying secure,” that alerted me to the latest document release by the Guardian.
Bruce starts out by mentioning he’s been working with the people at the Guardian for several weeks now, sifting through hundreds of agency documents. This gave Bruce valuable insight into what intelligence agencies have managed to assemble:
“The primary way the NSA eavesdrops on Internet communications is in the network. That’s where their capabilities best scale. They have invested in enormous programs to automatically collect and analyze network traffic.”
Each time I read the papers and the Guardian articles, I come to the same conclusion: intelligence agencies have the ability to compromise everything digital. Bruce offers his blunt assessment:
“These are hacker tools designed by hackers with an essentially unlimited budget. What I took away from reading the Snowden documents was that if the NSA wants in to your computer, it’s in. Period.”
The Guardian, Poul-Henning, and Bruce all mention that major encryption processes are compromised, but I didn’t understand how intelligence agencies could subvert something like HTTPS. Poul-Henning explains one way:
With expenditures of this scale, there are a whole host of things one could buy to weaken encryption. I would contact providers of popular cloud and ‘whatever-as-service’ providers, and make them an offer they couldn’t refuse: on all HTTPS connections out of the country, the symmetric key cannot be random; it must come from a dictionary of 100 million random-looking keys that I provide. The key from the other side? Slip that in there somewhere, and I can find it (encrypted in a Set-Cookie header?).
If I understand correctly, this means the process itself is not flawed. The key randomness is reduced, allowing those with powerful processing capabilities to crunch through the possible keys quickly. Bruce’s account corroborates Poul-Henning’s scenario:
Basically, the NSA asks companies to subtly change their products in undetectable ways: making the random number generator less random, leaking the key somehow, adding a common exponent to a public-key exchange protocol, and so on.
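To make the weakened-randomness attack concrete, here is a toy sketch. Everything in it is illustrative: the dictionary has only 1,000 entries (Poul-Henning’s scenario uses 100 million), and a repeating-key XOR stands in for a real symmetric cipher. The point survives the simplification: if the “random” session key is secretly drawn from a dictionary the eavesdropper already holds, recovering it is a short loop instead of a 2^128 search.

```python
import hashlib
import secrets

# Illustrative subverted setup: the "random" keys all come from a fixed
# dictionary of random-looking values that the eavesdropper also holds.
DICTIONARY = [hashlib.sha256(b"seed" + str(i).encode()).digest()[:16]
              for i in range(1000)]

def subverted_keygen():
    # Looks like a fresh 128-bit random key; it is actually one of only
    # 1,000 values known in advance to the attacker.
    return secrets.choice(DICTIONARY)

def xor_encrypt(key, data):
    # Toy repeating-key XOR cipher standing in for a real symmetric cipher.
    # XOR is its own inverse, so the same function decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The "victim" encrypts a message with a subverted key.
message = b"attack at dawn: GET /index.html HTTP/1.1"
key = subverted_keygen()
ciphertext = xor_encrypt(key, message)

def crack(ciphertext, known_prefix=b"attack"):
    # The eavesdropper tries every dictionary entry -- trivial work
    # compared with brute-forcing a genuinely random 128-bit key.
    for candidate in DICTIONARY:
        if xor_encrypt(candidate, ciphertext).startswith(known_prefix):
            return candidate
    return None

assert crack(ciphertext) == key
```

The dictionary entries pass casual statistical inspection, which is what makes this kind of subversion so hard to detect from the outside.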
If you remember, Bruce’s article was titled “A guide to staying secure.” So, Bruce must have some options for us:
1. Hide in the network: Whenever possible use services like Tor; doing so increases the surveillance effort markedly.
2. Encrypt your communications: It’s true, intelligence agencies target encrypted traffic, but any encryption is still better than sending traffic in the clear.
3. Assume your computer can be compromised: This is the tough one. Bruce suggests we create files and encrypt them on a computer that has never been attached to the Internet. Then using a flash drive, transfer the encrypted files to an Internet-facing computer for delivery. Decryption would be the exact opposite.
4. Be suspicious of commercial encryption software, especially from large vendors: The secret agreements between intelligence agencies and technology companies extend to those developing security and encryption software. We should assume that every commercial application has an NSA-friendly back door.
5. Try to use public-domain encryption that has to be compatible with other implementations, which means:
- Do not use proprietary software, back doors are easier to hide in proprietary software.
- Use encryption applications employing symmetric cryptography instead of public-key cryptography.
- Use encryption applications that are conventional discrete-log based, not elliptic-curve systems.
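The last bullet can be illustrated with classic finite-field Diffie-Hellman, the “conventional discrete-log based” construction Bruce prefers over elliptic curves. A minimal sketch follows; the parameters are toy-sized for readability, and a real deployment would use a vetted, standardized group of 2048 bits or more:

```python
import secrets

# Classic finite-field Diffie-Hellman key agreement. Security rests on
# the difficulty of the discrete-log problem mod p, not on elliptic
# curves. Toy parameters -- far too small for real use.
p = 2**127 - 1   # a Mersenne prime; demo only
g = 3            # generator for the demo group

def keypair():
    priv = secrets.randbelow(p - 2) + 1   # secret exponent x
    pub = pow(g, priv, p)                 # public value g^x mod p
    return priv, pub

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each party raises the other's public value to its own secret exponent;
# both arrive at g^(a*b) mod p without a secret ever crossing the wire.
alice_shared = pow(bob_pub, alice_priv, p)
bob_shared = pow(alice_pub, bob_priv, p)
assert alice_shared == bob_shared
```

Note that the entire construction is a few lines of modular arithmetic — one reason discrete-log systems are easier to audit than the curve parameters an elliptic-curve scheme asks you to trust.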
The advice I’m getting from Bruce and other experts is to make decoding our Internet traffic as difficult as possible. That way targeting us will not be worth the time and effort. Bruce concludes his article by saying:
Trust the math. Encryption is your friend. Use it well, and do your best to ensure that nothing can compromise it. That’s how you can remain secure even in the face of the NSA.
The hard part will be figuring out what encryption process has not been compromised.
[Update: 12 Sep 2013]
Back in 2007, Bruce Schneier raised a flag in a blog post about NIST Special Publication 800-90, a document detailing a new random number generator being added to a NIST encryption standard. Here is what Bruce said:

[It’s] in the standard only because it’s been championed by the NSA, which first proposed it years ago in a related standardization project at the American National Standards Institute.

[The NSA] has always been intimately involved in U.S. cryptography standards — it is, after all, expert in making and breaking secret codes. So the agency’s participation in the NIST standard is not sinister in itself. It’s only when you look under the hood at the NSA’s contribution that …
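Why a rigged random number generator is such a powerful back door can be sketched in a few lines. This toy generator (not the NIST design itself — purely an assumption-laden illustration) hash-chains an internal state; anyone who knows or can recover that starting state can replay every “random” key the victim will ever produce:

```python
import hashlib

class ToyDRBG:
    # Toy deterministic random-bit generator: hash-chains its state and
    # emits each new state as "random" output. If the designer knows the
    # starting state, every output is predictable.
    def __init__(self, state):
        self.state = state

    def next_bytes(self):
        self.state = hashlib.sha256(self.state).digest()
        return self.state

leaked_state = b"\x00" * 32           # imagine this escapes to the designer
victim = ToyDRBG(leaked_state)
session_key = victim.next_bytes()     # victim believes this is unpredictable

designer = ToyDRBG(leaked_state)      # designer replays the same stream
assert designer.next_bytes() == session_key
```

The outputs look perfectly random to everyone except the party holding the secret — which is exactly why a standards body’s choice of generator deserves scrutiny.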
NIST has reopened the public review period for this standard. It seems we need to listen to our security experts.
The ethics and legality of what intelligence agencies are doing is debatable. What concerns me even more is if — more likely, when — the bad guys figure this stuff out. They’re not going to spend time debating ethics; they’ll use the built-in weaknesses for their purposes without a second thought.
Could it be that obscurity is the new security?