I woke up this week to a news report about the debut of GE Security's “revolutionary” new shoe scanner, which apparently turned out to be anything but revolutionary. The USA Today cover story declared: “New scanner gets off on wrong foot.” The article then detailed how 52 percent of the hundreds of passengers who passed through the machine had to remove their shoes anyway, because the scanner either detected metal in their shoes or could not complete an electronic scan.
Reading this, you have to wonder whether any field testing was done with this $200,000 piece of equipment. A GE spokesman said the company is attempting to upgrade the device so it can allow passengers to wear shoes that contain harmless metal. Did I read that right? Upgrade a brand-new device that's just been deployed?
This incident is the perfect illustration of risk assessment gone bad, or perhaps risk assessment never performed. High-profile IT projects are usually very expensive and have the potential to substantially change the way an organization does business, which is why thorough risk assessment is crucial. These projects are sometimes the crowning success of an IT manager, but often a ticket out the door. I know of more than one CIO who went down with his Enterprise Resource Planning (ERP) implementation.
The takeaway of this story is not to avoid high-profile projects, of course, but to realize that major project planning requires considerable effort in an area that many people don't like and aren't very good at.
At this point, let’s make a list of those involved with the shoe scanner implementation who now have mud on their faces: GE, the TSA (Transportation Security Administration), the Orlando Airport Authority, and Verified Identity Pass (the company that operates the device). Some people in those organizations are going to lose their jobs or see their careers stunted because of this very public failure. No organization likes to have its boo-boos on Headline News and on the cover of USA Today.
You have to wonder who was responsible for the blunder, and why it happened. I’m sure GE has intensive quality control and testing procedures, which makes this gaffe even harder to understand. Did a GE salesperson oversell the product? Did the buyer really understand the product’s capabilities? What kind of testing went on? I am sure there are plenty of sordid details and plenty of blame to pass around. One thing is for sure: there was probably a lot of pressure, from a variety of stakeholders both within and outside of GE, to put the product into place in a certain time frame.
IT projects often share this time pressure and are frequently delivered “half-baked” to the customer. Risk assessment is the part of a project's risk management process in which we attempt to measure risk and then come up with strategies to deal with it. Many IT professionals just want to get on with the project and consider the risk assessment phase a waste of time, or worse, have trouble identifying the risks associated with a project in the first place.
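One minimal way to make "measure risk, then come up with strategies" concrete is a simple risk register that scores each risk as likelihood times impact and ranks the results, so response planning starts with the biggest exposures. The risks, probabilities, and dollar figures below are invented for illustration; they are not from the article, and real projects use richer scales and qualitative factors.

```python
# Toy risk register: exposure = probability x impact, ranked worst-first.
# All entries and numbers are hypothetical examples, not real project data.

def exposure(risk):
    """Expected loss: probability (0-1) times impact (dollars)."""
    return risk["probability"] * risk["impact"]

risks = [
    {"name": "device fails under field conditions", "probability": 0.4, "impact": 200_000},
    {"name": "schedule pressure skips testing",     "probability": 0.6, "impact": 150_000},
    {"name": "vendor overstates capabilities",      "probability": 0.3, "impact": 100_000},
]

# Rank highest exposure first; each top risk should then get a named
# response strategy (avoid, mitigate, transfer, or accept) and an owner.
ranked = sorted(risks, key=exposure, reverse=True)
for r in ranked:
    print(f"{r['name']}: expected loss ${exposure(r):,.0f}")
```

The point of even a crude register like this is not the arithmetic but the ritual: it forces the team to write risks down, argue about the numbers, and assign someone to each response before the schedule pressure hits.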
It's important to point out that realizing risk assessment matters, and knowing how to do it on paper, does not necessarily make you good at actually doing it. Often, your familiarity with a project can work against you, and project teams can fall into a groupthink trap, convincing themselves that they have all the bases covered.
That is why every team should have a designated "naysayer"—someone who is skilled at seeing both the forest AND the trees and who can quickly identify the negative unintended consequences of our decisions. I have found this to be a rare quality and more often find it in people outside of IT, usually in the business units themselves (which is just another reason why your customers should be represented on the project team).
Risk assessment is a continual process throughout a project. Risk can emerge at any point, particularly at decision points. Bad decisions are going to be made, of course, and sometimes they will be made even in the face of identified negative consequences, but it is at those points that the decision maker has to bear the consequences of his or her actions. At least a strategy will have been identified to deal with the risk, even if it turns out to be firing the decision maker.