The future is already here — it's just not very evenly distributed.
- William Gibson
As the future we can sense creeping up on us approaches, some of it arriving much faster than we expect and some frustratingly more slowly, some uncomfortable likelihoods come to light. They are not uncomfortable for everyone, but for some, they are intensely disturbing. Many of those people are in denial about the inevitability of the future. These are the people whose conception of "security" is rapidly becoming obsolete. They are also, however, some of the most influential people in the world.
Much of this conflict between people who want to hold on to old ideas of security and a future that will push their concerns aside with irresistible indifference is centered around the difference between privacy and secrecy. As becomes increasingly obvious with the passage of time, and with the advancement of digital communication (and thus copying) technologies, privacy is security, and secrecy is not.
The essential difference between secrecy and privacy as security concepts is that secrecy attempts to hide from others information that can be gleaned through simple observation and analysis, while privacy attempts to keep communications between people from being intercepted. The two are easily conflated at times because the security technologies of privacy — including access control, encryption, and verification — are the very technologies employed in the pursuit of secrecy. Because of the fundamental inefficiencies of secrecy, however, such technologies are constantly subject to failure, and that failure often has nothing to do with the technologies used.
An excellent example of this truth in action is the WikiLeaks scandals of recent months. In particular, so-called Cablegate demonstrates a dramatic failure of the security policies of secrecy, though not a failure that should be surprising to anyone who understands basic principles of security. The upshot is that one person in the middle of one of the largest secrecy operations in the history of the world, the US government, managed to leak more than 250,000 embassy cables to a website whose sole effective purpose is to publicly display information people try to keep secret. It is not the only site that does this. There is, in essence, an entire industry growing around this concept, encouraging whistleblowers of just about any stripe to smuggle secret data into the public awareness.
Perhaps the most amazing thing about all this noise over the matter is that WikiLeaks is such a vulnerable, unreliable avenue for distributing such leaks. The US government's campaign to shut down WikiLeaks not only betrays the culture of secrecy in government to the public at large, undermining any claims to value transparency; it also showcases the simple fact that government officials just do not get it. WikiLeaks is not the cause of the "problem" for secretive government officials. It is merely a superficial indicator of much deeper problems — of a deeply flawed security model.
That security model maintains long-term storage of private communications, presumably for accountability purposes. It attempts to maintain the secrecy of these archives — not their privacy. The need for a means of ensuring accountability requires that people have access to the stored data, but the desire for secrecy requires prohibiting such access. A basic conflict of goals arises, and secrecy is the goal that loses out because secrecy (as opposed to privacy) is essentially untenable. The same thing happens when DRM is cracked, for the same reason: the conflicting goals of giving people convenient access to protected content while preventing them from accessing that content in the manner of their choosing.
Phase Leap writer/editor Marcelo Rinesi points out the ultimate absurdity of obsessing over the "danger" to secrecy represented by WikiLeaks in a provocative short essay, The Backwardness of WikiLeaks:
I suspect their underlying mental model is that of TV stations or printing presses, which can be taken over or destroyed when needed. Very few in politics or media seem to understand that, unlike TV sets and TV transmitters, all networked computers are essentially the same. Private citizens might not be able to quickly replace a shelled TV station, or Google's search infrastructure for that matter, but a cheap smartphone is perfectly capable of storing and distributing gigabytes of sensitive information.
The implications of this state of affairs are profound. We carry in our pockets the tools of mass distribution, with an ease and cheapness never known in history. It is difficult to imagine a greater ease of distribution in this world than what technology already provides us, short of what is today the utterly fantastical: species-wide mass telepathy.
In case the implications have not yet fully hit home, though, consider this statement from Rinesi's next paragraph:
The only thing that WikiLeaks provided, their unique value, lies in their well-earned ability to gather the attention of politicians and the press. The documents might have just as easily been given to, say, 4chan, who more likely than not would have proven to be even more resilient to government pressure than WikiLeaks. Or, for probably far less than the cost of hosting WikiLeaks traditionally, a botnet could have been rented to literally spam people with fragments of the documents.
Just as with the matter of copyright enforcement, the systemic failures of secrecy in government are being band-aided by short-sighted legalisms. In the case of the major copyright-based industries, the modern era of such legalism began in the United States with the passage of the DMCA in 1998, during the Clinton administration. Such laws have been used to return copyright law more forcefully to its roots in seventeenth-century England as a system of censorship, as when DMCA takedown notices are sent to sites where people have published the content of emails sent to them by rapacious corporations that do not want mistreated customers telling others about their experiences in dealing with said corporations.
More frightening for those of us who value the accessibility of modern communication technologies are the noises that have been made in Congress of late regarding an Internet "kill switch" (Editor's Note: See Egypt). Such a capability, assuming it is effective, would represent the single most powerful tool of censorship this country has ever known.
Even this would not be a terribly effective protector of secrecy, however. While the ease of speedy widespread distribution of any data would be greatly diminished by an Internet "kill switch", it would by no means be eliminated. Pocket-sized devices capable of storing gigabytes of data and sharing them via wireless networking technologies will see to that. Even if decades of computing technologies could be erased, physical distribution of hardcopies of "secret" data would still be possible. Privacy can sometimes be effectively perfect, but secrecy is never effectively perfect once one can no longer account for the motives, security practices, and privacy technology utilization of every single individual who has access to the data. In short, once distribution is widespread enough within a context such as a government, a corporation, or an economic market, the game is over; secrecy simply is not a reasonable expectation.
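The point about widespread access can be made concrete with a toy model (my own illustration, not from the article, and the per-person leak rate is an arbitrary assumption): if each person with access to a secret has even a tiny independent chance of leaking it, the probability that the secret survives falls off exponentially with the number of people cleared to see it.

```python
def leak_probability(n_people: int, p_leak: float = 0.001) -> float:
    """Probability that at least one of n_people leaks a secret,
    assuming each leaks independently with probability p_leak.
    (A hypothetical model for illustration only.)"""
    return 1 - (1 - p_leak) ** n_people

# Reports after Cablegate suggested hundreds of thousands of people
# could access the cables; at that scale, a leak is a near-certainty
# even with a very small per-person leak rate.
for n in (10, 1_000, 100_000):
    print(f"{n:>7} people with access -> leak probability {leak_probability(n):.4f}")
```

The exact numbers are beside the point; what matters is the shape of the curve, which is why secrecy fails as a policy once access is broad enough for the organization to function.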
The key to maintaining security under these conditions is to reorient one's perspective on security. Protect the right things — privacy, for instance — and you can maintain reasonable security. Protect the wrong things, like secrecy, and you are doomed before you begin. The shelf life of a secret, especially in large organizations, is increasingly minuscule, and effectively limited only by the quickness with which modern technology can be leveraged to distribute such secrets beyond the set of people authorized to access those secrets.
WikiLeaks is an advertisement for transparency, a gigantic billboard whose message is written in six-foot-tall boldface block letters. It is telling us not only that transparency is good for the people, but that it is good for security — because any data that cannot withstand public scrutiny in broad daylight cannot be effectively kept secret. Once it moves beyond the realm of privacy, such data is in severe danger of becoming transparent to the world, whether you like it or not.
Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.