id="info"

The security control nobody used...

Not every security control succeeds, particularly those that are not transparent to business users. This is the story of a failed attempt to encrypt email and the lessons learned.

The Story

Once upon a time, a large enterprise decided it was time to encrypt sensitive information sent via email.  The enterprise was subject to both HIPAA and SOX, so getting executive management on board was no problem.  The security team exercised due diligence, reviewing multiple offerings, speaking with Gartner analysts, and discussing technical challenges with engineering.  (Security and engineering are hereafter referred to as the "technical team.")  The proposal was completed, submitted, and approved.

The solution implemented included the following:

  1. Automatic outbound message encryption.  Messages were encrypted when the content filtering engine, referencing HIPAA and PII lexicons, calculated a score greater than the configured threshold (a rough sketch of this kind of scoring appears after this list).
  2. Encrypted messages could be delivered to recipients as password-protected attachments.  The solution also supported sending affected messages to an online mailbox in the enterprise data center, notifying the recipient, and requiring the recipient to log in to the online mailbox over SSL to retrieve the message.  However, senior management thought this was too much trouble for vendors, customers, and other outside recipients, so they directed the technical team to go the attachment route.
  3. Manual encryption was possible by marking the message "confidential."
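
To make item 1 concrete, here is a minimal Python sketch of lexicon-based scoring against a threshold.  The lexicon terms, weights, function names, and default threshold are invented for illustration; the article does not describe the vendor's actual engine or term lists.

    # Hypothetical lexicons: each term contributes a weight to the message score.
    # Terms and weights are made up for this sketch.
    HIPAA_LEXICON = {"diagnosis": 3, "patient": 2, "prescription": 3, "medical record": 4}
    PII_LEXICON = {"ssn": 5, "social security": 5, "date of birth": 3, "account number": 4}

    def score_message(body, lexicons=(HIPAA_LEXICON, PII_LEXICON)):
        """Sum the weights of every lexicon term found in the message body."""
        text = body.lower()
        return sum(weight
                   for lexicon in lexicons
                   for term, weight in lexicon.items()
                   if term in text)

    def should_encrypt(body, threshold=6):
        """Auto-encrypt when the score exceeds the configured threshold."""
        return score_message(body) > threshold

With these made-up weights, a message mentioning "patient" and "date of birth" scores 5 and passes in plain text at a threshold of 6; add "ssn" and it scores 10 and is encrypted automatically.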

There were other features as well, but they're not important to our story.

The pilot began, involving IS personnel only.  Everyone thought it was a great product, except that too many messages were being encrypted, which was inconvenient for the recipients.  So the technical team adjusted the lexicon scores and the overall message score threshold, trying to balance security with convenience.  By the time they were done, the balance had shifted: quite a few messages that should have been encrypted now passed in plain text.
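
The earlier sketch shows why this kind of tuning is dangerous: raising the threshold to reduce "inconvenient" encryption also lets genuinely sensitive mail through.  (Again, the weights and threshold values are hypothetical.)

    msg = "Please update the patient record with the new date of birth."

    print(score_message(msg))                # 5 with the made-up weights above
    print(should_encrypt(msg, threshold=4))  # True  -- the stricter setting encrypts it
    print(should_encrypt(msg, threshold=6))  # False -- the relaxed setting sends it in plain text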

The pilot finished with IS satisfied this was the right solution.  Training videos had been distributed to all email users, with accompanying quick-reference cards.  All was ready.  So on a bright, sunny morning, they flipped the switch.  Everyone now had encrypted email, and the fun began.

The first wave of complaints came from executive management.  It was too inconvenient to walk their intended recipients through how to open the encrypted messages.  Further, certain senior managers didn't believe their email should be subject to auto-encryption.  The technical team responded by turning off auto-encryption for executives, leaving them to decide whether or not to encrypt each message.

This was followed by a groundswell of frustration across the enterprise, as users rebelled against the oppressive tyranny of auto-encryption.  So, without management support for auto-protection, the technical team turned it off entirely, relying on users to encrypt when they thought it necessary to protect sensitive information.

Several quiet months passed.  Then one day the Legal department called.  It seemed they hadn't been using the encrypted mail system, and they were concerned about possible ePHI compromise.  Further investigation by the technical team revealed that the encrypted attachment containing the protected message was being stripped by most receiving email systems because it couldn't be scanned for malware.  Go figure.

Checking with other departments, the technical team discovered that, with the exception of a few people in IS, no one was actually using the system.  New employees didn't know it existed, management didn't enforce compliance, and many outside entities wouldn't accept encrypted attachments.  The end.

The Moral

This story has a lesson... well, maybe more than one.

  1. A security control that is not transparent to business users, and that significantly affects the way they work, should actually be piloted by business users.  Restricting testing of such a solution to IS personnel is a big mistake.
  2. Management must fully understand the business impact and be willing to enforce use of the control.  This means the technical team should help management understand how the solution works, from the users' perspective, before purchase and implementation.  Expecting executive enforcement of a solution that results in a series of unwanted surprises is unreasonable.

There were probably other mistakes made, but these two were enough to render the message encryption solution a failure, a control nobody used.

About

Tom is a security researcher for the InfoSec Institute and an IT professional with over 30 years of experience. He has written three books, Just Enough Security, Microsoft Virtualization, and Enterprise Security: A Practitioner's Guide (to be publish...
