In today’s world of increasing threats against our data and a growing body of government-imposed regulation, I believed all major software vendors had gotten the message: practice due diligence in making your software secure. However, I was recently disillusioned.
Several months ago, we selected an application for one of our lines of business. The selection was based on both functional and technical specifications, including security. We were told at the time that users wouldn’t have direct access to the database, which would contain HIPAA- and SOX-regulated information. Direct access would be a violation of what I call the Integrity Rule. The vendor team who sold us the product was wrong.
The Integrity Rule includes a requirement that read and write access be allowed only via an application interface. This places a layer of controls between the user and the data and provides full transaction-tracking capability. Practically, this means that only database administrators or data engineers should be able to modify tables without going through the application. Enter our application vendor.
The application, which will remain nameless, supports two methods of authentication: Active Directory (AD) pass-through and SQL Server authentication. Pass-through authentication allows a user to automatically authenticate to the application using her Active Directory credentials. In this scenario, every user has public access to all tables the moment she establishes a connection to the database server. Nice security… every user with access to the application has full access to the database tables outside the application.
With SQL authentication, a single service account logs in to the database and provides the communication channel between the application and the data. So far, so good. The problem arises when we examine the implementation of the database. All tables are still open for public access.
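You can verify this kind of exposure yourself. The following query, a minimal sketch against SQL Server's standard catalog views, lists every object-level permission granted to the `public` role in the current database; in a properly locked-down application database, it should return little or nothing:

```sql
-- List permissions granted to the public database role.
-- On the vendor's database, this returned SELECT/INSERT/UPDATE/DELETE
-- grants across the application tables.
SELECT pr.name              AS grantee,
       pe.permission_name,
       OBJECT_NAME(pe.major_id) AS object_name
FROM sys.database_permissions AS pe
JOIN sys.database_principals  AS pr
  ON pe.grantee_principal_id = pr.principal_id
WHERE pr.name = 'public'
  AND pe.state IN ('G', 'W');   -- granted, or granted with grant option
```

The comment about what the query returned on the vendor's database reflects the situation described above, not output I can reproduce here.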
Placing the database server within a tightly controlled VLAN, with only the DBAs having direct access to it, seemed to be a good workaround. However, we need to provide read-only access to the application support team, and allowing them any access to the database gives them read/write access. Our auditors would have a field day with this.
We took these issues to the vendor. The first response was that none of their other customers were complaining—implying that we were just whiners. After we escalated beyond this very helpful individual, we had a conversation with one of the vendor’s engineers. After spending about 15 minutes trying to explain why their approaches to security were a bad idea, we asked if we couldn’t simply restrict table access to the service account in the SQL authentication scenario. This would allow us to create separate, read-only accounts for the application support team. At least, we thought this was a good idea.
The engineer pushed back a little and told us that he’d have to check with the developers to see if this would work. He wasn’t familiar with this approach.
I believe this will work itself out. However, I’m concerned that a company that markets software to publicly traded healthcare companies isn’t more in tune with SOX and HIPAA requirements. It’s also a lesson for us that just because a vendor account team smiles, nods, and affirms HIPAA and SOX compliance, they might actually have no idea what we’re talking about.
Tom is a security researcher for the InfoSec Institute and an IT professional with over 30 years of experience. He has written three books, Just Enough Security, Microsoft Virtualization, and Enterprise Security: A Practitioner's Guide (to be published in Q1/2013). Before joining the private sector, he served 10 years in the United States Army Military Police with four years as a military police investigator. He has an MBA and CISSP certification. He is also an online instructor for the University of Phoenix.