Security engineering: A needed profession

We know what we want when it comes to our digital well-being. Policy makers know what they want. The problem is they're not close to being the same thing. Michael Kassner interviews two advocates who are trying to fix that.

Recognize the name Eric Arthur Blair? Consider this: better known by his pen name, George Orwell, he was one of the most widely read authors of the 20th century, and because of him the term Orwellian was coined. I remember reading both of his oft-quoted books in high school, and the ensuing uncomfortable feeling when the teacher pointed out that 1984 was just 18 years away.

I'm fortunate to know (and interview) Cory Doctorow, a modern-day science-fiction writer and privacy advocate. His predictions about our digital future have rekindled the same unsettling feeling George Orwell did so long ago.

Cory's future

Cory sees a war coming, but it's not what you think. The turf being fought over is our digital freedom and how it relates to general-purpose computing. He put it this way:

Computers are everywhere. They are now something we put our whole bodies into -- airplanes, cars -- and something we put into our bodies -- pacemakers, cochlear implants. They have to be trustworthy.

Cory likes to reference the film 2001: A Space Odyssey. Remember when HAL, the computer controlling just about everything aboard the spaceship, said, "I'm sorry, Dave. I'm afraid I can't do that"? Not a good thing when you're in deep space and the computer will not let you back on board.

The "who overrides whom" line of thought is prevalent in Cory's science-fiction writing, and even more so when he is in advocacy mode. In his presentation for Authors at Google, Cory refers to a few examples not involving HAL.

One interesting example is a GPS device with a few additional features. Say you wanted to find a seafood restaurant using the GPS. What if a steakhouse chain paid the GPS company to ensure the route you took passed by several of its establishments?

A more Orwellian example revolves around cochlear implants (courtesy of Wikipedia):

A surgically implanted electronic device that provides a sense of sound to a person who is profoundly deaf or severely hard of hearing. Cochlear implants are often referred to as a bionic ear.

Is it too far-fetched to see governments requiring the ability to monitor all conversations intercepted by the implant? In his talk, Cory described (Yahoo News report) how the Canadian government tried -- unsuccessfully -- to wire Canadian airports with listening devices specifically to record passenger conversations. It's not much of a stretch to get from there to personal listening devices.

Real examples

In his paper "Lockdown," Cory describes real-life examples in detail. I'm betting everyone recalls the Sony fiasco, in which the company covertly installed rootkits on six million audio CDs. Another, from our educational system: remember the Lower Merion School District and its webcam spy scandal?

Not their fault

Many blame the current state of affairs on politicians and government officials. I thought Cory was headed in that direction, but he surprised me:

It's tempting to stop the story here and conclude that the problem is that lawmakers are either clueless or evil, or possibly evilly clueless. This is not a very satisfying place to go, because it's fundamentally a counsel of despair; it suggests that our problems cannot be solved for so long as stupidity and evilness are present in the halls of power, which is to say they will never be solved. But I have another theory about what's happened.

Other than the politicians themselves, Cory might be the only person who feels that way. He offered to explain:

It's not that regulators don't understand information technology, because it should be possible to be a non-expert and still make a good law. MPs, Congress, and so on are elected to represent districts and people, not disciplines and issues.

Cory continues:

We don't have a Member of Parliament for biochemistry, and we don't have a Senator from the great state of urban planning. And yet those people who are experts in policy and politics -- not technical disciplines -- still manage to pass rules that make sense. That's because government relies on heuristics: rules of thumb about how to balance expert input from different sides of an issue.


Information technology confounds these heuristics -- it kicks the crap out of them -- in one important way.

The important tests of whether or not a regulation is fit for a purpose are first whether it will work, and second whether or not it will, in the course of doing its work, have effects on everything else.

Cory goes on to build his case in the rest of his talk. If you're not convinced, please watch the video. Right now, I'd like to introduce someone who has a possible solution.

A way out

Bruce Schneier, who needs no introduction, is well aware of what Cory is referring to. In his latest Crypto-Gram newsletter, Bruce had this to say:

This problem isn't unique to computer security, or even security in general. But this misperception about security matters now more than it ever has. We're no longer asking people to make security choices only for themselves and their businesses; we need them to make security choices as a matter of public policy. And getting it wrong has increasingly bad consequences.

Bruce provides these examples:

  • The entertainment industry wants to enforce copyright.
  • Internet companies want to continue freely spying on users.
  • Law enforcement wants its own laws imposed on the Internet.
  • Militaries want laws regarding cyber weapons, enabling wholesale surveillance, and an Internet kill switch.

Bruce then points out what he feels is the disconnect:

Elected officials will be expected to understand security implications, both good and bad, and will make laws based on that understanding. And if they aren't able to understand security engineering, or even accept that there is such a thing, the result will be ineffective and harmful policies.

Sound familiar? What I found encouraging is Bruce offering a possible solution. Bruce feels we need to establish "security engineering" as a valid profession in the minds of the public and policy makers:

  • This position is less about certifications and (heaven forbid) licensing, and more about perception -- and cultivating a security mindset.
  • We also need to engage with real-world security problems, and apply our expertise to the variety of technical and socio-technical systems that affect broader society.
  • Perhaps most importantly, we need to learn how to talk about security engineering to a non-technical audience.

Why engineering?

As one with a background in engineering, I immediately understood Bruce's reasoning. Engineering by definition is:

The science, skill, and profession of acquiring and applying scientific, economic, social, and practical knowledge; in order to design and build structures, machines, devices, systems, materials and processes.

Security by definition is:

The degree of protection to safeguard a nation, union of nations, or people against danger, damage, loss, and crime.

Doesn't that sound like a discipline capable of building bridges between users and policy makers? Bruce is also adamant that practicing security engineers must have a say:

We need to convince policy makers to follow a logical approach instead of an emotional one -- an approach that includes threat modeling, failure analysis, searching for unintended consequences, and everything else in an engineer's approach to design.
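To make that engineering mindset concrete, here is a minimal, hypothetical sketch of the kind of quantitative thinking Bruce describes: scoring threats by estimated likelihood and impact so scarce engineering effort targets the biggest risks first. The threat names and numbers below are invented purely for illustration, not drawn from any real assessment or from Bruce's own methodology.

```python
# Illustrative only: a toy threat-ranking pass over hypothetical threats.
# All names, likelihoods, and impact scores are invented for demonstration.
from dataclasses import dataclass


@dataclass
class Threat:
    name: str
    likelihood: float  # 0.0-1.0, estimated probability of occurrence
    impact: float      # 0.0-10.0, estimated severity if it occurs

    @property
    def risk(self) -> float:
        # A classic risk-assessment heuristic: risk = likelihood x impact
        return self.likelihood * self.impact


threats = [
    Threat("covert rootkit on consumer media", 0.3, 8.0),
    Threat("remote webcam activation", 0.1, 9.0),
    Threat("GPS route manipulation by advertisers", 0.5, 3.0),
]

# Rank threats so mitigation effort goes to the biggest risks first.
for t in sorted(threats, key=lambda t: t.risk, reverse=True):
    print(f"{t.name}: risk {t.risk:.1f}")
```

The point is not the arithmetic, which is trivial, but the discipline: every mitigation decision traces back to an explicit, reviewable model rather than to whichever threat sounds scariest.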

A last word

I wanted to give both Cory and Bruce one last word on this ultra-important subject. First, Cory:

Freedom in the future will require us to have the capacity to monitor our devices and set meaningful policies for them; to examine and terminate the software processes that run on them; and to maintain them as honest servants to our will, not as traitors and spies working for criminals, thugs, and control freaks.

Now Bruce:

Everything involves computers, and almost everything involves the Internet. More and more, computer security IS security. Powerful lobbying forces are attempting to force security policies on society, largely for non-security reasons, and sometimes in secret. We need to stand up for security.

Final thoughts

"Computer security is security" says it all. We are finally at that point: even if you don't own a single digital device, computers control your life. Science fiction or science, it's our choice.




"And yet those people who are experts in policy and politics — not technical disciplines — still manage to pass rules that make sense" LOL yea right, when? lol They pass stuff that makes them feel good or to get elected not what makes sense. Health care law, can anyone make sense of it? lol Nancy even admitted she could not lol


Nice to hear something positive about the need for Security Engineering professionals. I'm on a Security Engineering team, and it takes a bit of effort to get activities accomplished because there are so few of us. Most security professionals are focused on the Security Operations side of things, and there is a distinct difference between the two areas. Unfortunately, Security Operations gets more focus because they are "on the front lines" defending the perimeters of the organizations they are assigned to protect. Too bad that management doesn't realize that if you build security in (where Security Engineering comes into play), you just might be able to alleviate some of the stress in Security Operations.

There are published resources out there for Security Engineering. These include the following:

1. SSE-CMM: Systems Security Engineering - Capability Maturity Model. Located at . The website hasn't been updated in a while, which leads to the next resource.
2. ISO/IEC 21827:2008, Information technology - Security techniques - Systems Security Engineering - Capability Maturity Model (SSE-CMM). This standard officially standardized the SSE-CMM. Information available at
3. International Council on Systems Engineering (INCOSE), Systems Security Engineering Working Group. Information on this working group is available at
4. Software Engineering Institute, Carnegie Mellon, CERT, Cyber Security Engineering.
5. NIST SP 800-27 Rev A, Engineering Principles for Information Technology Security (A Baseline for Achieving Security), Revision A. Available at
6. SABSA (Sherwood Applied Business Security Architecture). Enterprise Security Architecture seems to be a missing topic when discussing anything related to information security or cybersecurity. See


The subject disciplines have been practiced by security professionals for the past decade plus. The problem is that technology advances so quickly that one is constantly in need of training and refreshers in order to stay on the leading edge. Resting on one's laurels quickly gets one into obsolescence and out of the mainstream. It's a difficult pursuit, requiring a dedicated professional and a dedicated expenditure of resources. The ARMS process attempts to quantify security and risk by assigning numerical values to threats and exposures, as well as to the efficiency of countermeasures, methods, and means. This tool alone can save governments and businesses considerable funds when applied to enterprise security. Again, it's an engineering process requiring constant updates and refinements, as well as considerable intelligence about the applicable threats to a given process, facility, procedure, or whatever you have to protect. Let's hope recognition enables a cadre of trained System Security Engineers to work this needed discipline...


"Orwellian" refers to George Orwell, the author of 1984 (as well as several other important books). Orson Welles was an amazing actor, writer, and producer (radio's "War of the Worlds" and the movie "Citizen Kane") of a huge body (pun intended) of work. I can't believe that you have conjoined these two completely separate persons.

Michael Kassner

I never thought about the difference between engineering and operations. That is yet another distinction that makes a real difference.

Michael Kassner

I agree with you that the pieces are there. What Cory and Bruce are advocating is the professional needs "street-cred" and to be recognized as a purveyor of "how it is" rather than someone you listen to then dismiss as an alarmist.

Michael Kassner

I apologize for the mistake. And, I appreciate you pointing it out. I do happen to know who they both are, not sure why I slipped.


I had that same thought. The paragraph started out talking about George Orwell and even gets Orwellian correctly credited but then takes a left turn to Orson Welles. Huh??????


all the time. Especially since Orson Welles did "War of the Worlds" by H.G. Wells; oy-VAY!! I get 'em mixed up! :p
