Getting paid to break into things: How vulnerability assessors work at Argonne National Lab

Let's face it. Deterrents such as "keep out" or "do not open" are powerful magnets to us techies. Now, imagine getting paid to ignore those warnings.

I remember my first lock-picking experience. ...It was a dark and stormy night. Just kidding. While celebrating my ninth birthday at a fancy restaurant with my parents, I needed to attend to something. Making sure to excuse myself, I headed for the men's room.

Checking out the graffiti while waiting for things to happen, I glanced at the toilet-paper dispenser and noticed it was locked. With nothing better to do at that moment, I pulled out my prized possession, a totally cool Swiss Army knife, and got to work.

After an inordinate amount of time, my father came in to check on me. He asked, "Is there a problem?" With a smile, I said, "At first, but I figured it out."

Of similar mind: Vulnerability Assessments Team

I'd like to introduce you to a group of people who would have the dispenser lock open in no time. They are members of the Vulnerability Assessments Team (VAT) at Argonne National Laboratory. They're the ones who break into stuff that supposedly cannot be broken into.

I learned about VAT indirectly. Steve Gibson, in one of his Security Now podcasts, mentioned a list of Security Maxims written by Dr. Roger G. Johnston. Dr. Johnston described where his words of wisdom came from:

"Being a vulnerability assessor makes one pretty cynical. Or maybe you need to be cynical to see security problems. Or maybe both are true. Anyway, these maxims were developed partially out of frustration at seeing the same kinds of problems over and over again."

The doctor's list of maxims really resonated with me. So much so, I wrote an article about his aphorisms. Here are a couple of my favorites:

  • Too Good Maxim: If a given security product, technology, vendor, or techniques sounds too good to be true, it is. And it probably sucks big time.
  • Scapegoat Maxim: The main purpose of an official inquiry after a serious security incident is to find somebody to blame, not to fix the problems.

While quizzing Dr. Johnston about his maxims, I kept hearing about a group called VAT and the incredible things they were doing. Being more of a tortoise than a hare, it took me a while. But, I finally realized the need for another article.

The experts at VAT

After several phone calls and emails, I finally got the scoop on Dr. Johnston and the VAT. What I learned is impressive. Consider the mission:

"The VAT works extensively in the areas of product anti-counterfeiting, tamper and intrusion detection, cargo security, nuclear safeguards, and the human factors associated with security using the tools of industrial and organizational psychology."

If that isn't enough:

"The VAT also runs a one-stop microprocessor shop where Argonne scientists and researchers can have a microprocessor solution - hardware and software - for analog or digital measurements in about a week."

This VAT fact sheet describes some of the team's recent accomplishments. Among the more notable: how to detect a sticky bomb (very cool), how to determine where biometrics and access-control devices are vulnerable, and how to secure a secret key.

It's a little late for me, but I was compelled to ask Dr. Johnston: How does one get a job like his? Also, I was anxious to learn how good the team members were at picking locks. Here is what he had to say:

How does one prepare for a job like yours? Johnston: I have no idea! Like a lot of people in the security business, I stumbled into the field when somebody retired and they needed a replacement. The "tools" I think one has to develop to be a good vulnerability assessor are mostly mental. They include:
  • Skepticism: wanting to check things out for oneself, as opposed to automatically believing the canonical view.
  • A strong BS meter, intuition, and creativity: genuinely wanting to find security problems and solutions (rather than reassuring yourself that everything is fine).
  • Not being afraid to rock the boat.
  • The ability to think like a bad guy, which requires some degree of intrinsic evil.
  • A hacker's mentality: constantly trying to devise ways to defeat things.

It is helpful that many team members, including myself, have a physics background. Besides the applicable technical knowledge, physicists tend to believe even intricate systems operate under simple, understandable principles. This is a good mindset when facing complex security applications.

Engineering is not typically a good background for this type of work. Engineers often have the wrong mindset for thinking like the bad guys. This may be why most devices and systems have poor security.

Kassner: I know my friends who are engineers are going to be irritated. So, I asked Dr. Johnston to explain:

"There is an old saying: When you're holding a hammer, everything looks like a nail! Engineers and computer scientists look at security from a completely different perspective than the people trying to break in."

If you think about it, creating something and trying to destroy it are on opposite sides of the spectrum.

What is a day at the VAT labs like? Johnston: It sure beats working for a living. Every day is different. On any given day we may:
  • Work on the bench, testing hardware or microprocessor circuits.
  • Work in the field, testing attacks and investigating security programs.
  • Reverse engineer software, including microprocessor code.
  • Produce videos and training materials that demonstrate attacks, potential countermeasures, product redesigns, and suggested security protocols for our sponsors and security professionals.
  • Write research papers. I serve as editor of the Journal of Physical Security.

I also meet with government officials, private security managers, and give talks at conferences, trying to raise awareness of security issues. This tends to be an uphill battle because Security Theater, cognitive dissonance, and denial are difficult to compete against.

What is the easiest and most difficult part of your job? Johnston: Figuring out vulnerabilities is always the easy part. There are so many of them. And the same security blunders keep cropping up across a wide range of security devices and systems (locks, tags, seals, biometrics, and other access control devices).

Next, determining practical, cost-effective countermeasures is a bit harder, but fairly straightforward.

The hard part of being a vulnerability assessor is figuring out how to deliver the "bad news". Generally, we start out discussing what is good about the security because:

  • We want the dialogue to continue.
  • The good security features might have been an accident and we want them to continue.
  • We found this helps the customer be better prepared to hear about problems.

Kassner: This struck me as odd. Why would an organization that asked for help have problems with what VAT found? It's like asking for advice, then arguing with the person who gave it. So, I quizzed Dr. Johnston about this:

"It can be a political hot potato, especially if the product we are testing is already in service. Telling a company that one of their products has issues is something they do not want to hear. That is why we prefer working with companies when the product is in the design stage."

The project you thought would be hard but wasn't? Johnston: I thought nuclear safeguards would be a challenging area to find vulnerabilities. But, it seems, the amount of careful thought devoted to security is not proportional to the importance of an application. The same kind of dumb mistakes can be found in nuclear safeguards as in other security applications.

"Thugs versus nerds"?

Dr. Johnston used the expression "thugs versus nerds" several times during our conversations. I laughed when I heard it, but did not truly understand what it meant. You might find Dr. Johnston's explanation interesting:

"People concerned with physical security have a completely different mindset than those dealing with cyber security. For example, it only takes a few break-ins to get a lock maker concerned. Yet, on any given day, security administrators expect that several computers will get broken into.

That difference in threshold makes things interesting when you get both types together. I have been in meetings where a company wants to consolidate physical and cyber security into one group. That's not possible, unless the person in charge understands the different mentalities in play."

More physical than cyber

While trying to make sense of my notes, I was struck by one thing. Though Dr. Johnston and the VAT members work with sophisticated electronics, they usually end up breaking in using simple physical methods.

That's not what I expected so I thought maybe I was mistaken. I called Dr. Johnston and asked for clarification. He said, "Yep, why bust your butt trying to break in electronically, when it is easier and faster using physical means." Dr. Johnston went on, "Cyber security is in a pretty good place. It's physical security that needs help."

My notes were right, but the hands-on approach did not seem easier to me. He explained:

"Any break-in to a high-profile target requires significant preparation. You do not want to get caught and you want the exploit to work. A cyber break-in may work, but remember that's where the attack is expected and where the most defenses will be placed.

We prefer the physical approach. All we need is 15 seconds with a piece of equipment, say a router, to install some hardware programmed to create, for instance, a man-in-the-middle attack."

I next asked how they would get their hands on the router. Dr. Johnston replied, "Ever hear the term 'chain of custody'?" I mentioned I had, and he continued:

"For the most part, keeping careful track of devices is not commonly done. During any one of the stops from the manufacturer to the data center, we could bribe or social engineer someone to get our hardware installed. Or we could just install it ourselves during transport, while it is sitting on a loading dock somewhere, or even after it is installed, since physical security for cyber hardware is often not sufficient."
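To see why 15 seconds with an in-path device is so powerful, here is a toy Python sketch of the man-in-the-middle idea Dr. Johnston describes. Everything here is hypothetical and illustrative (the function and variable names are my own, not VAT tooling): an implant sitting in the traffic path can read, log, or silently rewrite anything that is not protected end to end.

```python
# Toy illustration (not real attack tooling): an in-path implant relays
# traffic between two endpoints and may silently rewrite it in transit.

def implant_relay(message: bytes, tamper=None) -> bytes:
    """Forward a message through the implant.

    The implant sees the full content of any traffic that is not
    end-to-end encrypted and authenticated; if a tamper function is
    supplied, the message is rewritten before being passed along.
    """
    intercepted = message          # the implant can log or inspect this
    if tamper is not None:
        intercepted = tamper(intercepted)
    return intercepted             # delivered onward; endpoints see nothing odd

# Example: silently rewrite a destination account number in transit.
original = b"PAY 100 TO ACCT 1111"
delivered = implant_relay(original, tamper=lambda m: m.replace(b"1111", b"9999"))
```

Neither endpoint notices the change unless the traffic is cryptographically authenticated end to end, which is exactly why a few seconds of physical access to a router in the chain of custody can undermine an otherwise well-defended network.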

If you aren't convinced, I suggest you watch this video. Dr. Johnston and VAT members used a similar approach to compromise electronic voting machines.

Final thoughts

Dr. Johnston and VAT members have a serious task and, dare I say, lots of work ahead of them. Thankfully they are up to it. I would like to thank Dr. Johnston for explaining the ins and outs of being a vulnerability assessor.

On a more humorous note, Dr. Johnston has managed to acquire so many security maxims that he has written a new book, Security Sound Bites: Important Ideas About Security from Smart-Ass, Dumb-Ass, & Kick-Ass Quotations.

Almost forgot, I wanted to thank the good Doctor for reviewing my resume and his sincerity when mentioning I should keep my day job as an engineer.