I remember when the Internet got started; my former college roommate worked for a little place called BBN, and, to be honest, I'm old enough to recall when the ARPANET itself started. Back then, the "Net" was not only limited to academics and a few people in government, it was considered experimental in exactly the same way the Altair 8800 was when it made the cover of Popular Electronics in January 1975.
A lot of people today simply weren't around then, and they don't realize that today's Internet is little more than an academic experiment that grew too fast and too uncontrollably. Those who now rely on it as the most mission-critical of business technologies don't understand why it keeps failing, or why it will, by necessity, continue to fail to provide a secure working environment.
Take Microsoft Windows, for example. Sure, the code is sloppy, and security was, until recently, treated as an afterthought, but stop and think: just how many of today's Windows vulnerabilities stem from the fact that Windows was designed when the Internet, and the World Wide Web in particular, were still treated by many institutions as a grand experiment?
Is it any wonder that Windows is insecure? It was built on MS-DOS, whose main security challenge was avoiding malware that arrived on 5¼-inch floppy disks. The Internet, by contrast, exposes every connected machine to constant, automated attack, something Windows was never designed to withstand. Is it any wonder that it fails to defeat every attacker?
Today we have spam, phishing, and a constant litany of new attack vectors, but all of them trace back to the fact that every popular operating system was created before the WWW became a mission-critical part of business and government operations. The Internet and the WWW were simply too useful for business to ignore, but they were far from ready for prime time when business co-opted them into corporate service.
Today, the National Science Foundation wants to recreate the Internet through an experimental research network termed the Global Environment for Network Innovations (GENI), while the EU is backing Future Internet Research and Experimentation (FIRE). Major universities around the world, including CMU (100x100) and Princeton (PlanetLab) here in the United States, are working on extensions of, or replacements for, today's Internet. And those are just a few of the major projects now under way or being proposed.
But I don't think there is much room for real debate over whether the Internet needs an overhaul. As with global warming, only by burying your head in the sand can you fail to see that the network is broken.
I think the real question facing all of us today is whether we are willing to keep betting our businesses and our personal privacy on a patchwork network that will be extended with some of the better ideas taken from these new projects. Or is it time to do a cold restart and junk the Internet as it exists today (or, at most, keep it around as a legacy network just for kids to play with)?
For 20 years we've lived with a patchwork system. Do we really want to put up with it for the next 50? Or is it time to bite the bullet and rebuild a universal network from scratch, complete with every security protocol we can think of?