2009 is just like any year in at least one way: too many people buy into too many security myths far too easily.
To help prepare readers for the future, I'll share my thoughts on ten major security myths I have encountered in 2009. Each of these is chosen for its prevalence, its perniciousness, or its publicity this year. They may even have been chosen for other reasons that begin with P.
#1 Myth: Doing something right means you're doing nothing wrong.
Don't strain your shoulder patting yourself on the back every time you do something right with security. Securing your systems is never really finished because (as I said in "When it comes to security, what does it mean to be good enough?") security is not an absolute end result of following good practices; there is always more that can be done to improve it. This is true even if, sometimes, "doing more" means researching whether something else needs to be done, or double checking that there was not a better way to do what you have already done. Measure your successes to see how well they work in practice, consider what more you can do to ensure further success, and never rest on your laurels when it comes to security. In "Perfect vs. Good Enough," I said that security is "more a process than a goal, and more a journey than a destination; it is more a practice than a product." It is a point worth remembering.
Similarly, someone who does something right is not necessarily someone who can do no wrong. This applies as much to others as to yourself, and is worth keeping in mind as you select your security solutions. When Coverity announces that it will provide code audit support for open source projects, one should applaud such an effort -- but also ask, "What will you do tomorrow?" When Google offers RatProxy under an open source license, one should applaud such an effort -- but also ask, "Why don't you respect our privacy?"
#2 Myth: Anonymity and verification are mutually exclusive.
Many people, including many supposed security experts, have a difficult time wrapping their brains around the idea that one can both verify a transaction and allow involved parties to remain effectively anonymous. The two concepts are not so contradictory as one might think, however. One example of how to make anonymity and verification work together is the OTR encryption library, which allows two people to stay in verified contact with each other while denying third parties any later proof of the conversation: nothing in a saved transcript can demonstrate that it was not fabricated after the fact.
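The core trick behind that kind of deniability can be sketched in a few lines. The following is a hypothetical, greatly simplified illustration (not the actual OTR protocol, and the key names are invented for the example): because both parties share the same symmetric MAC key, each can verify the other's messages in real time, yet either party could have forged any message after the fact, so a transcript proves nothing to anyone else.

```python
import hmac
import hashlib

# Sketch of deniable authentication, the idea OTR builds on (assumed,
# simplified setup; real OTR also rotates and publishes old MAC keys).
# A symmetric MAC key shared by both parties lets each verify the
# other's messages during the session, but either party could forge a
# valid tag later, so a saved transcript convinces no third party.

def tag(mac_key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over a message with the shared key."""
    return hmac.new(mac_key, message, hashlib.sha256).digest()

def verify(mac_key: bytes, message: bytes, t: bytes) -> bool:
    """Check a tag in constant time; both parties can do this."""
    return hmac.compare_digest(tag(mac_key, message), t)

# Alice sends a message Bob can verify during the conversation...
session_key = b"shared-session-mac-key"  # assumed agreed via key exchange
msg = b"meet at noon"
t = tag(session_key, msg)
assert verify(session_key, msg, t)

# ...but Bob, holding the same key, can forge an equally valid
# "transcript" afterward, so the tag is worthless as evidence.
forged = b"Alice said something she never said"
forged_tag = tag(session_key, forged)
assert verify(session_key, forged, forged_tag)
```

The asymmetry is the point: a digital signature would prove authorship to the whole world forever, while a shared MAC proves it only to the one person who needs to know, only while it matters.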
Another example is MIT's Scantegrity II, a vote tallying system that uses ballot serial numbers and vote codes to allow voters to verify that their votes were counted correctly without damaging the privacy -- the anonymity -- of their specific votes. Scantegrity II debuted in a Takoma Park, MD election and was by all accounts a great success.
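The confirmation-code idea at the heart of such systems can also be sketched briefly. This is a hypothetical toy model, not the real Scantegrity II protocol (which adds cryptographic commitments and audits); the function and variable names are invented for illustration. Each ballot carries a serial number and a random code per candidate; voting reveals only the chosen candidate's code, and the election publishes (serial, code) pairs so the voter can confirm the vote was recorded without the public record showing which candidate the code maps to.

```python
import secrets

# Toy sketch of code-based vote verification (assumed, simplified;
# not the actual Scantegrity II design). Officials keep the
# code-to-candidate tables secret; only revealed codes are published.

CANDIDATES = ["Alice", "Bob", "Carol"]

def print_ballot(serial: int) -> dict:
    """Generate a ballot: one secret random code per candidate."""
    return {"serial": serial,
            "codes": {c: secrets.token_hex(2) for c in CANDIDATES}}

def cast_vote(ballot: dict, choice: str) -> str:
    """Voting reveals only the code printed next to the chosen candidate."""
    return ballot["codes"][choice]

# Election setup: two ballots, tables held secretly by officials.
ballots = {serial: print_ballot(serial) for serial in (101, 102)}

# A voter with ballot 101 votes for Bob and keeps (serial, code).
receipt = (101, cast_vote(ballots[101], "Bob"))

# Officials publish only (serial, revealed code) pairs.
bulletin_board = {101: receipt[1]}

# The voter checks that their code was recorded...
assert bulletin_board[receipt[0]] == receipt[1]
# ...while the public board alone never names the chosen candidate.
assert "Bob" not in str(bulletin_board)
```

The receipt verifies inclusion without revealing content: anyone inspecting the bulletin board learns that ballot 101 was counted, but only the secret table links its code back to a candidate.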
#3 Myth: The GPL is needed to encourage project success.
Many people in the open source software (or "Free Software") community argue that the GPL is necessary to maximize the success of an open source project. They claim that without copyleft licensing to "force" people to release their source code to the world, many substantial improvements to software would be lost to the public and disappear into the world of closed source corporate anti-competitive practices. The truth is that copyleft licensing also chains software to a set of disincentives to use -- and contribute code to -- that software, disincentives just as weighty as the license's legal incentives to contribute are helpful: some corporations avoid copyleft code for fear it will "infect" the company's own copyrighted code, distributors face onerous source code storage demands, and license interpretation can lead to a bit of a "bait and switch."
Counterexamples include the dominance of Apache in Jack Wallen's "The most successful open source project ever" poll, where it was the only copyfree licensed project; the fact that SQLite may be more widely deployed than all other SQL database engines put together; and the incredible success of OpenSSH, not only the most widely deployed SSH software in the world, but possibly the most widely deployed encrypted tunneling software, and possibly the most secure of secure remote shell tools. All three are distributed under copyfree terms: OpenSSH under the BSD license, SQLite as public domain software, and the Apache Web server, naturally, under the Apache license. As you may already know if you have been reading my articles, there is a strong argument for copyfree licensing when you want to develop secure software -- especially security software.
#4 Myth: Ubuntu Linux is the most secure operating system in the world.
The assertion that Linux is the most secure OS in the world arises from time to time, in different forms. The last couple of years or so have been interesting in that the current form of this claim seems to be specific to the Ubuntu distribution. It is quite de rigueur for people to adopt Ubuntu, divesting themselves of the "shackles" of Microsoft Windows, and within a month proclaim loudly to anyone who will listen that Ubuntu is the "most secure OS" in the world. This particular fallacy has been getting worse over time, and 2009 seems so far to be the home to the most strident such claims.
The truth is that Linux is not the most secure OS in the world, no matter what form that claim takes. In fact, one cannot reasonably make such a sweeping claim that any OS is the most secure in the world, because the relative security of a given OS depends on too many factors, including how it will be used and configured, the sorts of dangers to which it will be subjected, and the specific standards of security that are most appropriate to a given circumstance.
#5 Myth: Security vendors are the experts, and we should do what they say.
All too often, people cite press releases and white papers that are little more than thinly disguised marketing, issued by security software vendors like Symantec and network hardware vendors like Cisco, as if the fact that these companies are (at least nominally) in the business of security were all the proof you need that they are right. The truth of the matter is that everyone is fallible (yes, even your humble author); vendors have tremendous conflicts of interest that should cast some suspicion on what they say (such as Symantec's vested interest in Microsoft's continued policy of ignoring certain types of vulnerabilities when it patches security issues); and sometimes security theory in a given area is not nearly as settled as someone might make it sound.
The truth is that we can never entirely substitute the judgment of a security "expert" for our own if we want to do the best job possible of securing systems for which we are responsible. Only we know our specific circumstances and needs; far-away CEOs of anti-malware software peddlers are hardly in a position to make any but the most broadly generalized statements about what constitutes good security practice, and they usually just parrot what someone else told them anyway.
#6 Myth: Encryption is the biggest online security problem.
Encryption is a very, very important issue for online security, but there is a "secret" problem that most security pundits don't address -- an even bigger problem than flawed or circumvented encryption: the guy at the other end of your encrypted connection. What good is strong encryption if the person (or corporation) at the other end of the connection sells data to "partner" vendors, posts it on Facebook, or even just has employees that are not 100% trustworthy?
Make sure you use encryption. Its importance is difficult to overstate (though claiming it is always the single most important thing would manage it), and I wish I knew a magical incantation that would finally make encryption popular. Still, any amount of encryption is nearly useless if you are using it to securely deliver your sensitive data into the hands of someone who will abuse it as readily as any malicious security cracker eavesdropping on unencrypted communications. Don't forget about the guy at your end of the connection, either: if your computer is infected with rootkits and keyloggers, your encryption software may not do you any good at all. This is one reason I choose FreeBSD as my primary OS, rather than MS Windows, MacOS X, or any other OS that doesn't trust the user.
#7 Myth: Fewer reported vulnerabilities means better security.
As I have pointed out before -- in "There's more to security than counting vulnerabilities" and "Vulnerability counting revisited," among other cases -- one piece of software having fewer publicly reported vulnerabilities than another does not necessarily mean it is more secure than the other. I am not the only person saying so, though those of us who are cognizant of this fact seem few and far between. For instance, in "Firefox, Adobe top buggiest-software list," Elinor Mills explained:
However, the high number of Firefox vulnerabilities doesn't necessarily mean the Web browser actually has the most bugs; it just means it has the most reported holes. Because the software is open source, all holes are publicly disclosed, whereas proprietary software makers, like Adobe and Microsoft, typically only publicly disclose holes that were found by researchers outside the company, and not ones discovered internally, Qualys Chief Technology Officer Wolfgang Kandek said late on Wednesday.
#8 Myth: Obscurity is responsibility.
No matter how many times people -- people like me, and people like Bruce Schneier, the Chuck Norris of security -- explain in great detail that obscurity is not security, people keep making the mistake of thinking that it is. One such case that is particularly prevalent in 2009 is "responsible disclosure." Microsoft has been trying to punish anyone who tries to warn users that its software has vulnerabilities for a long time, especially when such warnings come when Microsoft has ignored a known vulnerability for months or even years. This year, though, Microsoft's claims have taken on an additional air of legitimacy in the trade press, largely because when Microsoft said it was going to focus more on security, everybody started taking anything Microsoft said about security as divine revelation -- even if it was exactly the same thing Microsoft was saying the previous year.
The claim is that by talking about vulnerabilities, security researchers expose users to greater risk by letting malicious security crackers know about these security issues, and that we should let vendors work in secrecy until they are ready to deploy a fix. Of course, the same people seem to think that waiting months or even years is perfectly acceptable, exposing users to risk by simply not solving security problems quickly enough to actually protect anyone.
If security researchers can be made to shut up about vulnerabilities they find, vendors have the leisure to wait as long as they like, whereas letting customers know puts additional pressure on vendors to fix the problem. In truth, such secrecy helps nobody but the vendor in most -- if not all -- cases. After all, if the "good guys" found a vulnerability, the "bad guys" certainly can as well. Obscurity is no guarantee of our safety, and while something remains unpatched, those of us who can employ a temporary work-around (or who just want to stop using a given piece of software because of its vulnerabilities) should have the chance, and the requisite warning, to do so.
Meanwhile, Microsoft Windows' fastest documented time between a vulnerability report and a patch deployment was longer than a typical time for the Linux kernel.
#9 Myth: Microsoft is the year's security MVP.
Speaking of the recent trade press love affair with Microsoft security: the rate at which pundits and network administrators declare that Microsoft has, to quote at least half a dozen such cases directly, "gone from a security laughingstock to a security trendsetter" is exceeded only by the continuing growth rate of Microsoft vulnerabilities and security incidents. Microsoft may indeed have set security trends in 2009 -- by being central to the Conficker worm epidemic and leading the charge to punish researchers for "irresponsible" vulnerability disclosure (meaning: disclosure to customers that they are in danger when Microsoft is far too slow to move on a vulnerability) -- but not in so positive a manner as such statements suggest.
It may only take a few press releases from Microsoft's public relations people to convince most of the trade press and CEOs of the nation's technology-dependent corporations that Microsoft has suddenly become the vendor of the world's most secure software. Microsoft may even be greatly improving internal secure development policy. Microsoft still clearly has a very long way to go, though, and it takes a lot more than press releases to convince many of the rest of us, including security experts for whom "security" means more than blindly following the advice of marketing experts and reading about "security features" in nicely bullet-pointed full-page advertisements in Business Week.
#10 Myth: Santa will come down your chimney to deliver presents.
The sad truth is that chimneys that can admit even an undernourished eight-year-old child into one's home are almost nonexistent in modern homes. Changes in the design of the modern fireplace and chimney over recent decades ensure that even snow cannot easily find its way inside, let alone a jolly old elf in a red suit carrying a sack large enough to hold the world's Christmas gift supply. The problem of fitting through a chimney's narrow confines begins with the design of the chimney top and continues through the entire structure, which is designed to keep inclement weather and small animals from intruding on the fireplace.
Despite this fact, I simply cannot condone leaving doors unlocked, windows open, or keys under doormats to let the man into your home. We do not live in Mayberry any longer, and security demands that we lock up our homes so that the malicious outsiders of the world will not find their way in. Smart parents may arrange a secure key exchange to ensure that Santa Claus has what he needs to get inside and distribute gifts to good little boys and girls. On the other hand, if the children have been bad, parents who fail to plan ahead may at least spare them the disappointment of finding coal in their stockings.

Whatever holiday you celebrate at this time of year, I hope you all have a good one -- good, safe, and secure.