Trying to dominate development of a certain type of software by preventing others from benefiting from your work is a great way to ensure your competitors advance the state of the art more than you do.
Perhaps the most obvious example is the case of U.S. cryptographic export controls. During much of the 1990s, exporting strong cryptographic technology ("strong crypto") from the U.S. was, on paper at least, in some respects even more tightly controlled than the export of actual military ordnance. Crypto export controls have been loosened somewhat since then, but the law on such matters is still far from ideal.
The intent of such legislation was obvious enough. Since the Second World War, the United States government may well have been the leading developer of cryptographic technology in the world, and when a civilian crypto research community finally took off, it centered largely on U.S. academia. Given that, it is hard to avoid the conclusion that Congress' intent in passing strong crypto export restrictions was to keep the U.S. government ahead of the world's other governments, and the U.S. civilian research community ahead of the world's other research communities (in part because that community is where U.S. government cryptographers often get their start), by keeping people in other countries from getting their hands on U.S. crypto technology.
Unfortunately for the micromanagers in government, and for those of us in the U.S. with an interest in security software research and development, the actual result of such policy was to force much of the most interesting security work out of the country. It was during that Dark Age of U.S. information security policy that forking the OpenBSD project from NetBSD required printing reams of source code to hardcopy and transporting it across the border to legally route around export controls. The code then had to be laboriously re-entered by hand into computers in Canada, so that the project founder could run the project from somewhere it was legal to develop and maintain an open source, security-focused project with strong crypto support online.
Many security researchers who would otherwise have come to (or stayed in) the U.S. to study and work began going elsewhere, notably to Canada and the UK. Innovation in information security technology shifted away from the U.S. as a consequence, and the end result may well have been that the rest of the world caught up with U.S. encryption expertise far more quickly than it otherwise would have. Many of the best researchers specifically preferred to work outside the U.S., because importing strong crypto into the U.S. was perfectly legal, while developing it in the U.S. and sharing it with someone overseas was far more legally problematic.
As I already indicated, U.S. law on strong crypto export has gotten somewhat better, but it still isn't ideal. As if to make up for the relaxed export rules, however, the U.S. government is now much stricter about granting student visas to people who want to study information security related subjects. Where driving away expertise was once merely an unintended effect of trying to keep technology from leaving the country, the government now seems to be directly preventing meaningful growth of domestic information security research. Just as things improve for those of us already here who are interested in information security research and development, the U.S. government invents a new way to cheat itself out of local expertise.
Counter-examples in government are less dramatic. Canada's relatively permissive laws attract people who would otherwise have come to the U.S. to work on security software, of course, but that attraction is more a side effect of U.S. policy next door than of uniquely Canadian conditions that are attractive to security researchers in their own right. Better counter-examples are more easily found in business than in government.
My choice for the example of the day is Google. As indicated not only by my articles of recent months about Google software development (Google opens up RatProxy, Keyczar: another open source security tool from Google, and What are the security implications for Google Chrome) but also by posts on the Google Online Security Blog, the company's security strategy seems to be to attract development expertise in part by making useful, high-quality security software as widely available as possible. In fact, in contrast with many other companies (some of which probably use the GPL more for its popularity than from any meaningful analysis of the comparative benefits of open source licenses), Google releases such tools under copyfree licenses, which make it even easier for nearly anyone to use the software. That, in turn, encourages better security awareness and advancement worldwide than would otherwise be likely, as I pointed out in Choose the right licensing model for security software.
More to the point, it enlists willing contributors to the cause of improving Google's software, and it serves as excellent marketing for Google among the security aware. That marketing not only helps convince people to use Google's services, expanding the customer base, but also encourages developers to compete with one another for employment at Google. The end result is that giving away new in-house security technology increases the quality of the security technology development resources at Google's disposal.
I suppose it should be no surprise that:
- Congress, made up largely of people who don't even understand what the Internet actually is, would fail so utterly to anticipate the unintended consequences of its actions.
- Google, a corporation whose incredible growth was predicated on harnessing certain emergent properties of a complex system (the Internet), would prove a capable cultivator of the emergent properties of other complex systems (such as the software developer employment market).
In the end, a single lesson can be drawn from both of the above examples of policies on sharing security technology. The lesson is this:
Open development attracts more development talent, which leads to faster development. Closed development just drives that talent elsewhere.
Closed development is akin to a locked room. You may keep the security technology you possess locked away, but while you are busy protecting it, the rest of the world, which outnumbers your people by at least hundreds of millions to one, will be busily surpassing it. Locking the room doesn't just keep your technology development locked in; it keeps the rest of the world's technology development locked out.
Put another way, a closed fist doesn't hold water anywhere near as well as cupped hands.
Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.