Why encryption that doesn't trust the user isn't trustworthy

In the words of Wikipedia's article on pseudorandomness at the time of this writing, "a pseudorandom process is a process that appears to be random but is not." In programming, the term "pseudorandom" most often refers to number generators whose output has the statistical appearance of randomness (that is, the chance of any given number being generated is roughly the same as if the generation were truly random), but is produced by an algorithm, so that the same numbers can be duplicated by anyone else using the same algorithm with the same parameters.
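
To see what "same algorithm, same parameters" means in practice, here is a minimal sketch in Python: two generator instances seeded with the same value produce exactly the same "random" sequence. The seed is an arbitrary value chosen purely for illustration.

```python
import random

# Two instances of the same algorithm (CPython's Mersenne Twister),
# given the same parameter: an arbitrary seed of 12345.
a = random.Random(12345)
b = random.Random(12345)

# Both produce the identical "random" sequence.
print([a.randint(0, 99) for _ in range(5)])
print([b.randint(0, 99) for _ in range(5)])  # same list as above
```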

Since numbers of varying degrees of randomness are so important to certain types of security procedures, most notably those involving encryption, the ability of some unauthorized individual to duplicate your number generator's results could constitute a severe security vulnerability. If your network's security relies on encryption, as many do (wireless networks especially), a poor pseudorandom number generator behind the encryption scheme might keep you up at night, and with good reason.

This doesn't mean you should only use systems that employ "true" random number generators. The only reason encryption systems tend to use pseudorandom number generators is that, at the current level of technology, there are no "true" software-based random number generators. What you need is a generator with a sufficiently long period (the number of values generated before the pattern repeats itself) and the introduction of some input to the generation process that actually is random.
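
As a rough illustration of what a period is, consider this toy linear congruential generator in Python. The parameters are hypothetical, picked small so the cycle is short enough to see, and would be hopelessly weak in a real system.

```python
# Toy linear congruential generator: x' = (a*x + c) mod m.
# With a=5, c=3, m=16 the generator has a full period of 16:
# after 16 outputs, the sequence repeats itself exactly.
def lcg(seed, a=5, c=3, m=16):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=7)
first_cycle = [next(gen) for _ in range(16)]
second_cycle = [next(gen) for _ in range(16)]
print(first_cycle)
assert first_cycle == second_cycle  # the pattern has repeated
```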

In most cases, this random input is used to generate a "seed". The seed serves as the starting point for generating pseudorandom numbers, and it becomes significantly difficult to duplicate the operation of the generator without having the same seed: effectively impossible, in fact. The point of the generator's period, then, is to keep the generated numbers from repeating themselves so quickly that a brute force attack becomes practical. Encryption systems have been developed over the years that actually rely on sharing the seed between parties or nodes that need to be able to encrypt and decrypt each other's missives.
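
A minimal sketch of that seeding step, assuming the operating system provides an entropy pool (as modern systems do): the seed comes from os.urandom, so reproducing the generator's output requires more than just knowing the algorithm.

```python
import os
import random

# 256 bits of genuinely random input from the OS entropy pool,
# used as the seed: the starting point for a pseudorandom generator.
seed = int.from_bytes(os.urandom(32), "big")
rng = random.Random(seed)
print(rng.random())

# For encryption keys themselves, skip the seeded general-purpose PRNG
# and use the secrets module, designed for security-sensitive values.
import secrets
key = secrets.token_bytes(32)
```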

Other encryption systems have been developed with periods so short that they were worse than useless. "Useless" would be something that simply didn't work; "worse than useless" is something that makes you feel secure, so that within its "protection" you say things you wouldn't say in public while someone else listens in without your knowledge. An example of such a system was the World War II Japanese cipher codenamed Purple by the United States cryptanalysts who repeatedly cracked the encryption keys it used.

There are other matters to keep in mind when developing a pseudorandom number generator (PRNG) algorithm, too. For instance, determining previously generated numbers when all an attacker has is the current state of the PRNG should be quite difficult, at least in cases where the generator is used as part of an encryption system. In systems where the algorithm does not make that sufficiently difficult, a malicious security cracker can collect encrypted messages, such as your credit card numbers sent over SSL connections to e-commerce websites, and work on determining the current state of the PRNG. Once the security cracker has that information, he can attempt to trace previously generated numbers backward from that point, recreate the encryption keys used on those stored encrypted messages, and decipher them at leisure.
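
The toy LCG from earlier makes the danger concrete. Its state update is trivially invertible, so an attacker who learns the current state can run the generator backward and recover every number it ever produced. This is a sketch of the general attack, not of any specific deployed system.

```python
# Toy LCG parameters from the earlier example.
a, c, m = 5, 3, 16
a_inv = pow(a, -1, m)  # modular inverse of a mod m (Python 3.8+)

def forward(x):
    return (a * x + c) % m

def backward(x):
    # Inverts the update: if x' = (a*x + c) mod m,
    # then x = (x' - c) * a_inv mod m.
    return ((x - c) * a_inv) % m

# The generator runs forward; only the final state "leaks".
state, history = 7, []
for _ in range(5):
    state = forward(state)
    history.append(state)

# From that single leaked state, the attacker recovers the full history.
recovered, s = [], state
for _ in range(5):
    recovered.append(s)
    s = backward(s)
assert recovered == history[::-1]
```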

Some encryption systems protect you against that eventuality, where cracking the current key does not substantively help you crack past or future keys. This protection is called by some cryptographers "perfect forward secrecy". One example of an encryption system that provides perfect forward secrecy is the OTR Messaging library, used in encryption plugins for applications like the Pidgin multiprotocol IM client (the core functionality of which is coincidentally provided by a library called "libpurple").
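
The core idea behind that property can be sketched with an ephemeral Diffie-Hellman exchange, roughly the kind of mechanism OTR builds on: each session's key is derived from throwaway secrets that are discarded afterward, so no long-term secret ever suffices to reconstruct an old session key. The parameters below are toy values for illustration only; real systems use standardized groups of 2048 bits or more.

```python
import secrets

P = 0xFFFFFFFFFFFFFFC5  # toy 64-bit prime modulus, far too small for real use
G = 5                   # generator for the toy group

def session_key():
    # Each party picks a fresh, ephemeral secret for this session only.
    a = secrets.randbelow(P - 2) + 2
    b = secrets.randbelow(P - 2) + 2
    A = pow(G, a, P)  # public values, exchanged in the open
    B = pow(G, b, P)
    # Both sides derive the same shared key; a and b are then discarded,
    # so a later compromise reveals nothing about this session.
    assert pow(B, a, P) == pow(A, b, P)
    return pow(B, a, P)

print(hex(session_key()))  # a fresh, independent key on every call
```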

Some encryption systems do not provide perfect forward secrecy. In fact, some systems that people have been trusting with their private communications and data for years provide quite deeply flawed forward secrecy. One example of an encryption system that utterly fails in the perfect forward secrecy arena is the Microsoft Windows PRNG, which was reverse engineered by Israeli researchers who demonstrated how easy it is to determine past and future encryption keys produced by the MS Windows integrated encryption systems if you know the current state of the PRNG.

Encryption is one of those technology areas where anyone who knows anything about the field these days knows that there is no such thing as real security through obscurity. Trying to protect your security by hiding your procedures is a losing proposition, because the technology is easily reverse engineered to determine what those procedures are; all it takes is a single leak, or a single case of someone successfully reverse engineering the procedures embodied in your algorithms, to destroy the entire illusion of secrecy. Your best bet for developing and maintaining secure encryption systems is therefore security through visibility: develop a system that does not depend on its inner workings remaining secret to work properly.

It is in part for such reasons that open source encryption systems like OTR and OpenSSH have remained useful, strong protection for your privacy. When you open up your process to peer review, it can be fixed and improved based on the feedback you receive. When you do not, the only people who will be reviewing it are the people who want to circumvent the protections you have put in place — and it is not in their best interest to give you feedback at all.

Unfortunately for organizations whose software development and business model are based on keeping their procedures and processes secret, you have to trust your user base with those procedures and processes to get the kind of feedback you need to improve them. In matters of security, as in other matters, trust is a two-way street. This is not a new principle of encryption security. Auguste Kerckhoffs articulated it in the 19th century, in what came to be known as Kerckhoffs' Principle: a cryptosystem should be secure even if everything about the system, except the key, is public knowledge. Claude Shannon, the father of information theory, later reformulated it as what is generally known as Shannon's Maxim, a rather more alarming statement that has (almost) exactly the same implications:

The enemy knows the system.

With that assumption in mind, you can dispense with the bread and circuses of "security through obscurity" and focus on the matter of trust. Would you rather trust your friends and customers, who are inclined to help you keep your encryption system secure, or would you rather keep them in the dark without any reasonable expectation that the "enemy" will be likewise hampered?

I know which group I would rather have reviewing my security technologies. Which would you prefer?


Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.
