The privacy covenant is an illusion: How to regain control

Agreeing to share information in exchange for protection or services has become an irrelevant formality. Wouldn't it be nice if we were in a strong enough position to make the choice to give or withhold consent a meaningful decision?

Free speech suffuses the American technologist's conception of the Internet. Startup founders grow into megacorporate information-era tycoons who operate on the basic premise that their job is to facilitate communication, generating revenue on the backs of their sites' users by selling those users' information to advertisers. At the root of all this, if you press them hard enough, is always something akin to the concept of free speech: whatever they observe, they can share freely. The end result is that privacy, for these people, is not real; it is a fantasy, an illusion, utterly obsolete in the information age. While Facebook founder Mark Zuckerberg is in many ways the poster boy for this attitude, Google is more fundamentally its embodiment on the Internet.

This approach stands in marked contrast to the typical governmental attitude toward privacy. While agents of government policy may express sentiments in private conversation quite similar to Zuckerberg's famous dismissals of privacy, their approach to privacy is typically much more insidious and unnerving. Where Google expands its influence so that it can cast its net ever more widely to catch whatever crosses its path and catalog it for future use, government regularly engages in activities that border on the outright illegal, and sometimes even cross that border, as in the case of the Bush administration's warrantless NSA wiretapping scandal.

The justifications for these two different approaches to sweeping privacy aside in pursuit of information-gathering goals are not immediately similar. Google, Facebook, and others provide terms of use and privacy statements to people signing up for memberships on their sites, designed to get people to agree to let these corporations do whatever they want with information provided to them. Beyond that, they use the argument that people have voluntarily and willfully provided information as justification for their actions. By contrast, government tends to simply wave all that away; consent is irrelevant, because the urgency and importance of its activities override any privacy concerns that might restrict its actions. Quite often, such activity is in service of the War on Terror, the War on Drugs, the War on Piracy, or some other "War" that, according to the agents of government policy, "requires sacrifice" from the citizenry.

Underlying both of these justifications for gathering and using information in ways that might shock and offend the people concerned is the presumption of a covenant between the two parties.

In the case of corporate information gathering, there is held to be an implicit — sometimes explicit — agreement that in exchange for providing information to be used as the gatherer sees fit, certain services will be provided that benefit those whose information is used. In fact, providing that information one way or another is often a critical part of the business model, a necessary part of providing some of those services.

In the case of governmental information gathering, there is held to be an implicit "agreement" of sorts, never explicit, usually not voluntary or willful in any sense at all, but held to exist by virtue of a collective social contract. That agreement is irrevocable and presumed to exist from the moment of birth: government is the parent, we are the children, and prying into our private lives is something mommy does for our own good. Once again, an exchange is assumed to occur, but the connection between what the citizenry gives up (or, more accurately, what is taken from the citizenry) and what is supposedly provided in return (security, freedom, and social welfare) is far more tenuous and suspect. This murkier return on an involuntary investment is counterbalanced by the public perception of the dire necessity and rightful authority of government.

The covenant is key in either case. Without the assumption of that covenant, no one would stand for the circumstances that have evolved. Even with the presumption of agreement in place, many individuals object.

The result is an opt-out world. We are essentially born into a world where we are assumed to agree to everything, initially, and must explicitly retract that agreement to escape its consequences. Of course, by that time some damage has already been done. There is, as they say, little point in crying over spilled milk; the question we face is what to do now.

The cypherpunks of the 1990s had it right. The proliferation of privacy technologies that lie in the sole control of their users is the only guarantee anyone has of privacy, apart from the unreasonable options of self-imprisonment or self-exile. Great strides were made toward that proliferation in the '90s, and a key event was Philip Zimmermann's creation and legally risky distribution of an encryption tool called Pretty Good Privacy, abbreviated PGP. The legal risk at the time stemmed from the US government's classification of strong cryptography technologies as "munitions" for export purposes.

Since then, encryption has become ubiquitous, and necessarily so. Banks, IRC servers, mail servers, and even Wikipedia, "the free encyclopedia that anyone can edit," offer encrypted connections for authentication and communication. A number of factors have played into this, not least of which is the growing need for basic security as people conduct more of their lives online. We have still not reached anything like the cryptopia the cypherpunks of the 1990s envisioned, however. HTTPS, the dominant Web encryption protocol of our time, is subject to compromise at the whim of the "certificate authorities" who sign the digital certificates that supposedly protect against eavesdropping on encrypted connections; most people never use the OpenPGP and S/MIME protocols to keep their digital communications private; and many of the privacy technologies provided in the most popular software applications in the world are considered laughably weak by cryptographers and other security experts.
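
To make that concrete, here is a minimal sketch of what user-side OpenPGP encryption can look like, using the open source python-gnupg bindings. It is an illustration rather than a recommendation of any particular tool, and it assumes GnuPG is installed, that the local keyring already holds the recipient's public key, and that "alice@example.com" is a placeholder address.

    # A sketch only: encrypt a message to a recipient's OpenPGP public key.
    # Assumes the python-gnupg package, a working GnuPG installation, and a
    # local keyring that already holds the recipient's key; the address below
    # is a placeholder.
    import gnupg

    gpg = gnupg.GPG()  # uses the default GnuPG home directory and keyring

    message = "Meet at the usual place at noon."
    encrypted = gpg.encrypt(
        message,
        recipients=["alice@example.com"],
        always_trust=True,  # skip the keyring trust check for this illustration
    )

    if encrypted.ok:
        print(str(encrypted))  # ASCII-armored ciphertext, safe to paste into an email
    else:
        print("Encryption failed:", encrypted.status)

The particular library is beside the point; what matters is that the keys live in the user's own keyring, with no certificate authority or service provider standing between sender and recipient.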

The key to establishing and maintaining any kind of real privacy is the deployment and use of privacy technologies that users control. This requires educating the public so that people start actually caring about their privacy. It also requires developing such technologies so that they not only work but are usable enough that people will adopt them. It additionally requires the forbearance of organizations that have both a vested interest in preventing the proliferation of privacy technologies and the power to enact their will; government, for instance, must back off the anti-privacy laws that make it nigh-impossible for people to get away with effectively protecting their own privacy. The "guilty until proven innocent" approach government takes to deciding when it is justified in using the law as a truncheon against those who just want to keep their mouths shut about their personal business has a chilling effect on how people view their own privacy. Laws against the free development and distribution of such technologies may be even harder for society as a whole to overcome.

The privacy covenant in the information age — the supposed agreement that we only give up privacy to the extent we desire, and only in exchange for something of equal or greater value — is indeed an illusion. The truth of the matter is that anyone in a position to gather information is more likely than not to find a way to (self-)justify getting it, by hook or by crook, whether we want that person to have it or not. Our only hope for regaining any control over our privacy is to enforce it with the only law we have at our disposal: technology that the user controls. More often than not, that means open source encryption software as a key part of the package.
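
For a sense of what "technology that the user controls" means at the smallest scale, here is a hedged sketch using the open source cryptography package for Python. The key is generated and stored locally, so the data is readable only by whoever holds that key; the sample message and the notion of handing the ciphertext to a third party are illustrative assumptions, not a prescription.

    # A sketch only: user-controlled encryption in miniature, using the open
    # source "cryptography" package. The key is generated locally and never
    # has to be shared with any service provider.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # keep this somewhere only you control
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"my private notes")  # what a third party would see
    plaintext = cipher.decrypt(ciphertext)            # recoverable only with the key

    assert plaintext == b"my private notes"

Whoever holds the key holds the privacy; that is the whole argument in a few lines of code.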

It also means that the single most important thing we can do to change government policy so that our privacy may be protected is not to establish restrictions on searches and seizures, such as limitations on wiretap power. It is to simply get government to back off its regulation of privacy technologies. If you are a pro-privacy activist, the lion's share of your efforts should undeniably be directed at encouraging development of privacy technologies under the most open licenses possible and at eliminating any law on the books that restricts the creation and distribution of such technologies.

If we can do that, we can start thinking about how to get everybody on board with the idea of actually taking reasonable steps to protect their own privacy. Once that is underway, we can finally re-establish a privacy covenant with those whose best interests are served by learning everything about us, this time with the tools in hand to enforce it.

About

Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.
