Recognize the name Eric Arthur Blair? Consider this: he sold more books than any other 20th-century author, and the term Orwellian was coined because of him. I remember reading both of his oft-quoted books in high school, and the uncomfortable feeling that followed when the teacher pointed out that 1984 was just 18 years away.

I’m fortunate to know (interview) Cory Doctorow, a modern-day science-fiction writer and privacy advocate. His predictions about our digital future have rekindled the same unsettling feeling George Orwell evoked so long ago.

Cory’s future

Cory sees a war coming, but it’s not what you think. The turf being fought over is our digital freedom and how it relates to general-purpose computing. He put it this way:

Computers are everywhere. They are now something we put our whole bodies into — airplanes, cars — and something we put into our bodies — pacemakers, cochlear implants. They have to be trustworthy.

Cory likes to reference the film 2001: A Space Odyssey. Remember when HAL, the computer controlling just about everything on the spaceship, said, “I’m sorry, Dave. I’m afraid I can’t do that”? Not a good thing when you’re in deep space and the computer won’t let you back aboard the ship.

The “who overrides whom” line of thought is prevalent in Cory’s science-fiction writing, and even more so when he is in advocacy mode. In his presentation for Authors at Google, Cory refers to a few examples not involving HAL.

One interesting example is a GPS device with a few additional features. Say you wanted to find a seafood restaurant using the GPS. What if a steakhouse chain paid the GPS company to ensure the route you took passed by several of its establishments?

A more Orwellian example revolves around cochlear implants (courtesy of Wikipedia):

A surgically implanted electronic device that provides a sense of sound to a person who is profoundly deaf or severely hard of hearing. The cochlear implant is often referred to as a bionic ear.

Is it too far-fetched to see governments requiring the ability to monitor every conversation picked up by the implant? In his talk, Cory described (Yahoo News report) how the Canadian government tried, unsuccessfully, to wire Canadian airports with listening devices specifically to record passenger conversations. It’s not much of a stretch to get from there to personal listening devices.

Real examples

In his paper, “Lockdown,” Cory goes into detail describing real-life examples. I’m betting everyone recalls the Sony fiasco, where the company covertly installed rootkits on six million audio CDs. Another comes from our educational system: remember the Lower Merion School District and its webcam spying scandal?

Not their fault

Many blame the current state of affairs on politicians and government officials. I thought Cory was headed in that direction, but he surprised me:

It’s tempting to stop the story here and conclude that the problem is that lawmakers are either clueless or evil, or possibly evilly clueless. This is not a very satisfying place to go, because it’s fundamentally a counsel of despair; it suggests that our problems cannot be solved for so long as stupidity and evilness are present in the halls of power, which is to say they will never be solved. But I have another theory about what’s happened.

Other than the politicians themselves, Cory might be the only person who feels that way. He went on to explain:

It’s not that regulators don’t understand information technology, because it should be possible to be a non-expert and still make a good law. MPs, Congress, and so on are elected to represent districts and people, not disciplines and issues.

Cory continues:

We don’t have a Member of Parliament for biochemistry, and we don’t have a Senator from the great state of urban planning. And yet those people who are experts in policy and politics — not technical disciplines — still manage to pass rules that make sense. That’s because government relies on heuristics: rules of thumb about how to balance expert input from different sides of an issue.

But:

Information technology confounds these heuristics — it kicks the crap out of them — in one important way.

The important tests of whether or not a regulation is fit for a purpose are first whether it will work, and second whether or not it will, in the course of doing its work, have effects on everything else.

Cory goes on to build his case in the rest of his talk. If you’re not convinced, please watch the video. Right now, I’d like to introduce someone who has a possible solution.

A way out

Bruce Schneier, who needs no introduction, is well aware of what Cory is referring to. In his latest Crypto-Gram newsletter, Bruce had this to say:

This problem isn’t unique to computer security, or even security in general. But this misperception about security matters now more than it ever has. We’re no longer asking people to make security choices only for themselves and their businesses; we need them to make security choices as a matter of public policy. And getting it wrong has increasingly bad consequences.

Bruce provides these examples:

  • The entertainment industry wants to enforce copyright.
  • Internet companies want to continue freely spying on users.
  • Law enforcement wants its own laws imposed on the Internet.
  • Militaries want laws regarding cyber weapons, enabling wholesale surveillance, and an Internet kill switch.

Bruce then points out what he feels is the disconnect:

Elected officials will be expected to understand security implications, both good and bad, and will make laws based on that understanding. And if they aren’t able to understand security engineering, or even accept that there is such a thing, the result will be ineffective and harmful policies.

Sound familiar? What I found encouraging is that Bruce offers a possible solution. He feels we need to establish “security engineering” as a valid profession in the minds of the public and policy makers:

  • This position is less about certifications and (heaven forbid) licensing, and more about perception — and cultivating a security mindset.
  • We also need to engage with real-world security problems, and apply our expertise to the variety of technical and socio-technical systems that affect broader society.
  • Perhaps most importantly, we need to learn how to talk about security engineering to a non-technical audience.

Why engineering?

As someone with a background in engineering, I immediately understood Bruce’s reasoning. Engineering by definition is:

The science, skill, and profession of acquiring and applying scientific, economic, social, and practical knowledge in order to design and build structures, machines, devices, systems, materials, and processes.

Security by definition is:

The degree of protection to safeguard a nation, union of nations, or people against danger, damage, loss, and crime.

Doesn’t that sound like a discipline capable of building bridges between users and policy makers? Bruce is also adamant that practicing security engineers must have a say:

We need to convince policy makers to follow a logical approach instead of an emotional one — an approach that includes threat modeling, failure analysis, searching for unintended consequences, and everything else in an engineer’s approach to design.

A last word

I wanted to give both Cory and Bruce one last word on this ultra-important subject. First, Cory:

Freedom in the future will require us to have the capacity to monitor our devices and set meaningful policies for them; to examine and terminate the software processes that run on them; and to maintain them as honest servants to our will, not as traitors and spies working for criminals, thugs, and control freaks.

Now Bruce:

Everything involves computers, and almost everything involves the Internet. More and more, computer security IS security. Powerful lobbying forces are attempting to force security policies on society, largely for non-security reasons, and sometimes in secret. We need to stand up for security.

Final thoughts

“Computer security is security” says it all. We are finally at that point: even if you don’t own a single digital device, computers control your life. Science fiction or science fact, it’s our choice.