The security control nobody used...

Not every security control succeeds, particularly those that are not transparent to business users. This is the story of a failed attempt to encrypt email, and the lessons learned.
The Story

Once upon a time, a large enterprise decided it was time to encrypt sensitive information sent via email. It was regulated under both HIPAA and SOX, so getting executive management on board was no problem. The security team performed due diligence, reviewing multiple offerings, speaking with Gartner analysts, and discussing technical challenges with engineering. (Security and engineering are hereafter referred to as the "technical team.") The proposal was completed, submitted, and approved.

The solution implemented included the following:

  1. Automatic outbound message encryption. Messages were encrypted when the content filtering engine, referencing HIPAA and PII lexicons, calculated a score greater than a configured threshold. (A minimal sketch of this kind of scoring appears just after this list.)
  2. Encrypted messages could be delivered to recipients as password-protected attachments. The solution also supported sending affected messages to an online mailbox in the enterprise data center, forwarding a notification to the recipient, and requiring the recipient to log in to the online mailbox to retrieve the message. Remote access was via SSL. However, senior management thought this was too much trouble for vendors, customers, and other outside recipients, so they directed the technical team to go the attachment route.
  3. Manual encryption was possible by marking the message "confidential."
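
For the curious, here's a minimal sketch of how lexicon-based scoring like this can work. The terms, weights, and threshold below are illustrative assumptions, not the vendor's actual configuration:

```python
# Hypothetical lexicon-based scoring: weighted keyword hits are summed,
# and messages scoring above a threshold are encrypted. All terms,
# weights, and the threshold are illustrative assumptions.

HIPAA_LEXICON = {"diagnosis": 3, "patient": 2, "prescription": 3}
PII_LEXICON = {"ssn": 5, "social security": 5, "date of birth": 4}

ENCRYPTION_THRESHOLD = 8  # messages scoring above this are encrypted


def score_message(body: str) -> int:
    """Sum the weights of every lexicon term found in the message body."""
    text = body.lower()
    return sum(
        weight
        for lexicon in (HIPAA_LEXICON, PII_LEXICON)
        for term, weight in lexicon.items()
        if term in text
    )


def should_encrypt(body: str, marked_confidential: bool = False) -> bool:
    """Encrypt when marked confidential or when the score crosses the threshold."""
    return marked_confidential or score_message(body) > ENCRYPTION_THRESHOLD
```

Every weight, and the threshold itself, is a tuning knob, which is exactly where the trouble started.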

There were other features as well, but they're not important to our story.

The pilot began, involving IS personnel only. Everyone thought it was a great product, except that too many messages were being encrypted, which was inconvenient for the recipients. So the technical team adjusted the lexicon scores and the overall message score threshold, trying to balance security with convenience. By the time they were done, the balance had shifted: quite a few messages that should have been encrypted were now passing in plain text.
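
The trade-off the team was tuning is easy to demonstrate. Here's a toy example, reusing the scoring idea sketched above with a made-up lexicon and hand-labeled sample messages (all assumptions, not the pilot's real data), showing how moving the threshold swaps nuisance encryption for silent leaks:

```python
# Toy illustration of the tuning trade-off. The lexicon, weights, and
# sample messages are made up; the labels mark which messages actually
# contain sensitive information.

LEXICON = {"diagnosis": 3, "patient": 2, "prescription": 3,
           "ssn": 5, "date of birth": 4}


def score(body: str) -> int:
    text = body.lower()
    return sum(weight for term, weight in LEXICON.items() if term in text)


# (message body, actually sensitive?)
samples = [
    ("Patient diagnosis attached, SSN on file", True),
    ("Lunch meeting moved to noon", False),
    ("Prescription renewal for patient Smith", True),
    ("HR: please verify your date of birth on record", False),
]

for threshold in (1, 4, 8, 12):
    missed = sum(1 for body, sensitive in samples
                 if sensitive and score(body) <= threshold)
    nuisance = sum(1 for body, sensitive in samples
                   if not sensitive and score(body) > threshold)
    print(f"threshold={threshold:>2}: missed={missed} sensitive, "
          f"nuisance-encrypted={nuisance} benign")
```

Set the threshold low and harmless mail gets encrypted, annoying recipients; set it high and sensitive mail slips through in plain text. The pilot team slid toward the latter.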

The pilot finished with IS satisfied this was the right solution.  Training videos had been distributed to all email users, with accompanying quick-reference cards.  All was ready.  So on a bright, sunny morning, they flipped the switch.  Everyone now had encrypted email, and the fun began.

The first wave of complaints came from executive management. Getting their intended recipients to understand how to receive and open the protected messages was too inconvenient. Further, certain senior managers didn't believe their email should be subject to auto-encryption. The technical team responded by turning off auto-encryption for executives, leaving them to decide whether or not to encrypt each message.

This was followed by a groundswell of frustration across the enterprise, as users rebelled against the oppressive tyranny of auto-encryption. So, without management support for auto-protection, the technical team turned it off entirely, relying on users to encrypt when they thought it necessary to protect sensitive information.

Several quiet months passed. Then one day the Legal department called. It seemed they hadn't been using the encrypted mail system, and they were concerned about possible ePHI compromise. Further investigation by the technical team revealed that the password-protected attachment carrying the message was being stripped by most receiving email systems: since it couldn't be scanned for malware, their gateways simply removed it. Go figure.

Checking with other departments, the technical team discovered that, with the exception of a few people in IS, no one was actually using the system.  New employees didn't know it existed, management didn't enforce compliance, and many outside entities wouldn't accept encrypted attachments.  The end.

The Moral

This story has a lesson... well, maybe more than one.

  1. A security control that is not transparent to business users, and that significantly affects the way they work, should actually be piloted by business users. Restricting testing of such a solution to IS personnel is a big mistake.
  2. Management must fully understand the business impact and be willing to enforce use of the control.  This means the technical team should help management understand how the solution works, from the users' perspective, before purchase and implementation.  Expecting executive enforcement of a solution that results in a series of unwanted surprises is unreasonable.

There were probably other mistakes made, but these two were enough to render the message encryption solution a failure, a control nobody used.

About

Tom is a security researcher for the InfoSec Institute and an IT professional with over 30 years of experience. He has written three books: Just Enough Security, Microsoft Virtualization, and Enterprise Security: A Practitioner's Guide (to be published).

2 comments
mford66215

Who hasn't? ANY initiative that doesn't positively impact the user experience requires executive-mandated compliance to be used. And then it will only be used until the staff figures out a way around it. Not only does the solution have to have informed support from on high, it should be tailored to improve the users' experience. That's not always possible, but it should be a goal for all IT initiatives - no matter where the directive came from.

Jeff Dickey

Fatal Flaw #3: Since this whole Rube Goldberg scenario was to build a CYA legal framework for the company, the legal department should have been actively involved from the beginning. The fact that they weren't, or that other people with an understanding of the details of SOX and HIPAA weren't involved, actively prevented the process from succeeding under any circumstance.

Fatal Flaw #4: Changes to the implemented process, especially relaxing the encryption to the degree that it was, should have been documented, with top management and Legal getting traceable copies. "If it's not written down, it never happened," as the saying goes. At least it never happened in a way that's guaranteed to be perfectly reproducible or auditable.

Fatal Flaw #5: As user complaints started coming in, especially from top managers, IT and Legal should have had regular interaction with these users so that everybody understood what the system was for and why it was the way it was, and ideas could be put together to improve/simplify the process while still enforcing regulatory/legal compliance. The fact that this was not done in a transparent, documented manner guaranteed the failure of the system.
