Security

What my grandmother taught me about IT security


The Identity Theft Enforcement and Restitution Act of 2007 passed the Senate by unanimous consent. As is often the case in Congress, the two houses -- the House of Representatives and the Senate -- have been working at roughly redundant purposes, each drafting a very similar bill. The House version, however, has not yet left subcommittee deliberation for consideration by the full House.

The Senate bill, should it be enacted as law, would amend Title 18 of the US Code to address conspiracy to commit what Congress terms "cybercrime", close loopholes in current law against extortion, give victims of identity theft greater ability to seek restitution, and specifically address the phenomenon of botnets. The ITERAct attempts to deal with botnets by making it a crime to "damage" -- whatever that means -- ten or more computers in a single year.

Tim Bennett, president of the Cyber Security Industry Alliance (CSIA), said "This cybercrime bill is an integral part of the cybercrime fight, but it is also imperative that this Congress address through legislation other aspects of the problem, such as data security, to prevent criminals from getting sensitive personal information in the first place."

Security industry vendors don't seem terribly optimistic about the prospects of such a bill passing the House of Representatives before the end of the year, however, considering the way most of the House's time has been diverted by matters related to the war in Iraq and "homeland security". Add to that the return of deliberations over fiscal year budgeting, and it's no wonder Symantec's federal government relations manager Kevin Richards said "prospects for this year don't look so good".

If such a data protection bill passes Congress, its phrasing will bear watching. It could easily go one of several ways. The best possible outcome, in my estimation, would be a true digital privacy bill that reinforces the implications of the Fourth and Fifth Amendments to the US Constitution. Assuming it had real teeth, such an act would go a significant way toward protecting against the types of abuses of power we've seen in the wiretap scandals of recent years, in USA PATRIOT Act provisions, and in the potential for NSA-designed backdoors in common cryptographic standards -- such as the suspected intentional weakness in Dual_EC_DRBG, a random number generator standardized by NIST.

While Bruce Schneier's Wired article on the subject is certainly worth the read, I'll summarize a bit for you:

  1. NIST released a new official standard for the deterministic random bit generators (DRBGs) used in cryptographic software, NIST Special Publication 800-90.
  2. That standard defines a set of four DRBGs approved for government use and recommended for widespread public use.
  3. The NSA championed the elliptic curve based generator, Dual_EC_DRBG, for inclusion in the NIST standard.
  4. Dual_EC_DRBG is slower than pond scum running uphill and contains a small, but measurable, numerical bias -- problems none of the other new NIST standard DRBGs share, which makes one wonder why the NSA bothered to push for its inclusion.
  5. Dual_EC_DRBG contains a mathematical "back door", one that may or may not have been intentional and for which the NSA may or may not have the key. Reverse-engineering the key should be significantly difficult -- perhaps effectively impossible at current technology levels -- but the key could very easily have been known to whoever generated the constants that define the algorithm's elliptic curve. For more information on what that means, I recommend some heavy Googling; it's a subject well beyond the scope of this article.
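To make the shape of that back door concrete, here's a toy Python sketch. It replaces the real elliptic curve arithmetic with multiplication modulo a prime and skips the output truncation the real generator performs, and every constant in it is made up for illustration -- only the algebra of the attack is faithful: because the public constant P is secretly a known multiple d of the public constant Q, a single raw output hands anyone who knows d the generator's internal state.

```python
# Toy model of the alleged Dual_EC_DRBG back door. The real generator works
# over an elliptic curve and truncates each output; this sketch substitutes
# multiplication modulo a prime and skips truncation, so all constants here
# are illustrative.

MOD = 2_147_483_647            # prime modulus standing in for the curve group

d = 123_456_789                # the back-door scalar, known only to its creator
Q = 987_654_321                # public constant Q
P = (d * Q) % MOD              # public constant P, secretly chosen as d*Q

def drbg_step(state):
    """One round: advance the state using P, derive the output using Q."""
    next_state = (state * P) % MOD
    output = (next_state * Q) % MOD
    return next_state, output

# An honest user seeds the generator and draws two "random" outputs.
state = 42
state, r1 = drbg_step(state)
state, r2 = drbg_step(state)

# An attacker who knows d turns the FIRST output into the state used for the
# next round: d*r1 = d*(s*Q) = s*(d*Q) = s*P, which is exactly the internal
# state the generator advances to -- so all future output is predictable.
recovered_state = (d * r1) % MOD
predicted_r2 = (recovered_state * Q) % MOD

assert recovered_state == state   # attacker now tracks the generator's state
assert predicted_r2 == r2         # and correctly predicted the next output
```

In the real generator the attacker also has to guess the handful of truncated bits of each output, but Shumow and Ferguson reportedly showed that a few dozen bytes of output suffice to recover the state, given knowledge of d.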

The backdoor problem was brought to light by Dan Shumow and Niels Ferguson at the CRYPTO 2007 conference. Thank the diligent cryptographers of the world for helping keep you safe, in part by pointing out the flaws in government-recommended encryption systems before you run afoul of them. If we're very, very lucky, we may soon be able to thank Congress for passing a data security law that might make the NSA step a little more carefully when it comes to trying to maintain backdoors into our private communications.

Even if you believe that everyone in the NSA who would have access to the keys to the kingdom is trustworthy, and will not harm you or your business with such tools at their disposal, allowing the US government to maintain backdoors in your encryption software is a monumentally bad idea. Consider that in the last few years, ostensibly security-conscious government agencies have lost laptops storing personally identifying information about large numbers of citizens, had their network security compromised by foreign governments, and issued improperly redacted PDFs from which the parts they did not intend to release were easily recovered (hint: don't just paint black lines over text with Adobe Acrobat). All of this and more has been in the news since 2001.

The lesson is that even if you trust the NSA with the keys to the kingdom, you might want to think about who can get those keys from the NSA, and how common, everyday incompetence can help those keys fall into the wrong hands.

As my grandmother once said, "It's not you I don't trust with my secrets. It's the people you'd tell."

About

Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.

13 comments
GeorgesR

I have read the articles and something bothers me: the NSA is a big organization, with all kinds of "brains" working for it. Yet it pushed an encryption standard whose flaw was easily found by encryption experts. Now that almost everyone knows about it, most likely nobody will use it. It is as if that is what they wanted in the first place. Maybe we should take a much closer look at the other standards.

sboverie

I like what a comedian observed: "Why do police station lockers have locks on them?" If the police can't trust other police, why should we the people trust them? The old question "Do the ends justify the means?" is valid. Is it good for us to hand our government a blank check to protect us, even at the cost of our rights and liberty? Government should be the servant of the people; government is not a good master.

moody.don

I found the name W. Dewey Clower on a memo from the Nixon White House released July 11, 2007. What blew me away was that Clower is a spook in the NSA, Langley. He shows up only in Republican administrations. In 1992 he became President of the National Truckstop Owners Association and consolidated all the truck stops in America. In 2002 he became the C.E.O. of Vonage/Wi Fi, receiving a grant from the Department of Homeland Security for border protection, from the border to Tucson, Arizona. It seems to me this has been planned for quite some time.

dsimp

I think it is not very secure if the security or encryption you use is (to some) not really encrypted, and therefore not secure. I think Britain recently introduced a law restricting the use of programs that put data beyond the reach of "big brother" (read: the government). How many people would have these skeleton keys, and who? I have a friend who will only use an old version of PGP (command line only, from before the GUI was developed), because he thinks even later versions may have been compromised -- given a back door, in other words. All this security is a bit much for me, but the point is: if you want something to be secure, you don't build any backdoors.

apotheon

"Government should be the servant of the people, government is not a good master." -- you

"Government is not reason; it is not eloquent; it is force. Like fire, it is a dangerous servant and a fearful master." -- George Washington

I see some similarities there.

Penguin_me

Just to clarify - the British government don't have a law preventing the use of encryption programs. They *did* however pass a law stating that in a criminal investigation a person must hand over the encryption keys and passwords if asked to, and if they refuse they can be fined and spend up to 2 years in prison. Of course, if you can prove that you don't have / never had the keys then you're safe, but you have to prove it.

dsimp

I didn't delete it before I hit send. :( Sorry.

Michael Kassner

Great post, Chad, thank you. I was curious how this will affect applications like TrueCrypt and others. I think Mr. Schneier recommended TrueCrypt at one time. Does this mean the developers will have to revise their applications to meet the new standard?

apotheon

"[i]Of course, if you can prove that you don't have / never had the keys then you're safe, but you have to prove it.[/i]" How exactly does one prove one does [b]not[/b] have something -- especially when that "something" is infinitely duplicable?

bikingbill

The British Government's track record on security is much worse than the few missing laptops mentioned in the article. Half of the nation's bank and National Insurance records anyone? I wouldn't trust any Government to look after data security. It's not politically interesting until it goes wrong. Governments should limit themselves to requiring "reasonable care" to be taken, and leave the technical stuff needed to achieve this to people who understand it.

jdiswonderful

Which is hardly likely, as you point out. However, let's not overlook the notion that one's freedom there seems to be held captive to one's willingness to waive one's own privacy claims. It's hardly privacy if the government can restrict your freedom should you insist on it and, with at least some degree of irony, place you in a position where they can scrutinize everything you do much more effectively.

ben@channells

Nearly all of these breaches have occurred where the IT service or the platform has been outsourced. This usually costs us taxpayers 30 to 50% MORE while delivering a roughly 20% worse service. I have worked at many government departments with outsourcing in place where cost savings take precedence over government requirements or customer requests. I have heard of contractors charging for secure courier service and secure wiping and NOT actually performing those actions. For those with long memories: Win95, 98, and NT4 had an NSA key as a back door, hence some government departments used Security Enhanced versions. Commercial DES is based on work done at GCHQ, and non-commercial encryption has been used in government departments up to military or battlefield grade. The growing problem for the departments that are secure is that the agencies or developers are being made redundant or retired -- see QinetiQ. I know of one government section with an internet-facing database that survived 10 years of attacks without a single breach or data loss. However, within 9 months of being outsourced and replaced, it was hacked wide open and later taken down and out of action, since the server was unpatched Windows 2000 with unpatched SQL 2000 -- apparently the design and invoice were for Windows 2003 and SQL 2005. :-)

syost16

I definitely agree. As of right now, letting the government handle data security is basically the equivalent of letting a patient in a drug-induced coma control his own drug therapies.