One of the most painful activities associated with releasing software is the certification and accreditation (C&A) process, but it doesn't have to be. By gathering the appropriate information beforehand (application details, schemas, architectural diagrams, and other development artifacts), program managers can greatly reduce the time C&A requires. Preparation also helps security analysts gain a better understanding of your application, improves the accuracy of the accreditation effort, and signals that you have done your homework and understand the security requirements associated with your application. Figure A gives a general overview of a simple security architecture.
Figure A: A security design
At a minimum, be prepared to provide the following:
- The impact to the users or mission if the system fails
- The type of data the system processes
- An architectural drawing depicting a logical view of the system
- A list of system components as well as version numbers of commercial software
- Entity relationship diagrams and process models
- A list of user roles and privileges associated with these roles
- Functional test scripts for any security mechanism built into the application
- Application guides—such as user, system administrator, and implementation guides
I am amazed at how often development teams do not understand the criticality of the system they are building, even after development is complete. While criticality should be determined during the design phase, it is more often identified only after C&A begins. If you've already built the bulk of the application (or worse, completed it) without knowing the criticality level, don't let the security analyst derive it. Sit down with the customer and determine the potential impact to the customer's mission should the application fail. Is there the potential for loss of life if the system's security mechanisms fail? Air traffic control, law enforcement reporting, and ambulance dispatch systems are good examples of systems with high confidentiality, integrity, and availability (CIA) requirements because of the potential for loss of life.
Financial systems also have high CIA requirements; while there is rarely a potential for loss of life, there is the potential for severe financial loss. As you begin to determine the criticality of the system, view it from three different angles:
- What is the impact if the system becomes unavailable?
- What is the impact should the data be inaccurate?
- What is the impact if data is disclosed to unauthorized individuals?
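The three questions above map directly onto the confidentiality, integrity, and availability axes, and can be captured in a simple worksheet. The following sketch is purely illustrative: the three-level impact scale and the high-water-mark rule for overall criticality are common conventions, not requirements of any particular C&A standard, and the example system is hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class Impact(Enum):
    LOW = 1
    MODERATE = 2
    HIGH = 3

@dataclass
class CriticalityAssessment:
    """Records the mission impact of a failure along each CIA axis."""
    system: str
    confidentiality: Impact  # impact if data is disclosed to unauthorized individuals
    integrity: Impact        # impact if the data is inaccurate
    availability: Impact     # impact if the system becomes unavailable

    def overall(self) -> Impact:
        # High-water-mark convention: the system's criticality is the
        # highest of its three impact ratings.
        return max(self.confidentiality, self.integrity, self.availability,
                   key=lambda i: i.value)

# A hypothetical assessment for an ambulance dispatch system.
dispatch = CriticalityAssessment(
    "ambulance dispatch", Impact.MODERATE, Impact.HIGH, Impact.HIGH)
print(dispatch.overall().name)  # HIGH
```

Writing the answers down in this form, with the customer in the room, leaves the security analyst with nothing to derive.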
Know what type of data your application processes and the rules governing the handling of that data. A good example is Privacy Act data. While this is commonly thought of as nothing more than a means of protecting Social Security numbers, the privacy requirement actually extends to any unique identifier that can tie an individual to specific attributes; for example, a student ID number and the corresponding grades.
Certain types of data require special handling, such as marking all output (e.g., screens and reports) with a Privacy Act banner. Other special types of data include healthcare information subject to the Health Insurance Portability and Accountability Act (HIPAA), financial data subject to CFO requirements, and company proprietary data subject to corporate policy. There are also government classifications (For Official Use Only (FOUO), Unclassified Sensitive, Secret, and so on), all of which require special handling and protection as well. Be prepared to explain how your system protects the data. You should also be prepared to detail how output is marked and who can override automatic markings, especially if you're using an ad hoc query tool such as Oracle Discoverer.
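Automatic output marking of the kind described above is straightforward to centralize in a report-rendering routine. This is a minimal sketch; the banner wording and the `render_report` function are hypothetical, and real policy will dictate the exact banner text and placement.

```python
# Hypothetical banner text; actual wording comes from your data-handling policy.
PRIVACY_BANNER = "*** PRIVACY ACT DATA - PROTECT PER APPLICABLE POLICY ***"

def render_report(title: str, rows: list[str], *, privacy_act: bool) -> str:
    """Assemble a plain-text report, bracketing it with a Privacy Act
    banner when it contains protected data."""
    lines: list[str] = []
    if privacy_act:
        lines.append(PRIVACY_BANNER)
    lines.append(title)
    lines.extend(rows)
    if privacy_act:
        lines.append(PRIVACY_BANNER)
    return "\n".join(lines)

# A student ID tied to a grade is Privacy Act data, so the banner is applied.
report = render_report("Student Grades", ["12345  A-"], privacy_act=True)
print(report)
```

Routing every report and screen through one marking function also makes it easy to answer the analyst's follow-up question: who, if anyone, can override the automatic marking.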
A logical view
A logical view of the application depicting data flow is worth its weight in gold, especially if it identifies which elements participate in which parts of the security process. For example, a logical diagram depicting which component handles user identification and authentication (how the user logs in to the system), which mediates access control (via permissions or roles), and where auditing occurs is incredibly valuable to a security analyst. This provides one method of presenting your system’s security architecture.
There are often components outside the application's configuration control. For example, if an outside Internet Service Provider (ISP) hosts your application, or if your application runs on a shared host, you may not be able to control certain aspects of its configuration. Even if you cannot control a component's configuration, if it is part of your system, be sure to include it in the diagram.
A physical view
A physical view of the application is also important. It differs from the logical view by depicting specifics about hardware and software. For example, the physical view might state: the Web server runs IIS 5.0 on the Windows 2000 Server operating system, the database uses the MSDE 2000 runtime engine, and the client is browser-based, requiring a 128-bit-encryption-capable browser such as Internet Explorer 5.5 or Netscape 7.0. Physical views are straightforward; they are essentially nothing more than a graphic representation of the required components.
Entity relationship diagrams (ERDs) and business process models are also valuable. They help determine relationships between critical data elements, identify how data is stored, and equip security analysts with an in-depth understanding of how your application operates. A database schema is also helpful and should go hand in hand with the ERD. If you store passwords in your database, be sure to annotate how they are protected.
Passwords should be stored in an encrypted form. Be prepared to explain how they are encrypted and decrypted, as well as where they exist in decrypted form (for example, in public or global variables at run time). Also be prepared to explain how the encryption key is protected; as a rule, never store the key together with the encrypted string. Document the encryption/decryption process and password usage rather than leaving them open to interpretation.
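One well-documented alternative to reversible encryption, and one that sidesteps the key-management questions above entirely, is storing only a salted one-way hash of each password, since the application never needs the plaintext back. This is a sketch of that approach using PBKDF2 from Python's standard library, not a description of any particular application's mechanism; the iteration count shown is illustrative.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; tune to your environment

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted one-way hash; only the salt and hash are stored."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the hash from the candidate password and compare in
    constant time, so nothing is ever decrypted."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)

salt, digest = hash_password("s3cret!")
print(verify_password("s3cret!", salt, digest))   # True
print(verify_password("wrong", salt, digest))     # False
```

Whichever scheme you use, documenting it at this level of detail is exactly what lets the security analyst verify it without reverse-engineering your code.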
There are two basic types of roles: trusted officials and functional users. Most applications, however, require various levels of access to be assigned to different types or classes of users. Clearly delineate all roles, along with the capabilities or privileges assigned to each type of user or role. For example, in a video store application, a supervisor would be able to override a checkout price, but a checkout clerk probably would not.
One development team gave me a list of roles and capabilities in an Excel spreadsheet. This shaved quite a bit of time off the C&A effort because I did not have to derive role and privilege information from the database configuration and a source code review.
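A spreadsheet works well; the same role/privilege matrix can also live directly in code or configuration, where the application can enforce it and the analyst can read it. The roles and privilege names below are hypothetical, following the video store example:

```python
# Hypothetical role/privilege matrix for the video store example.
ROLE_PRIVILEGES: dict[str, set[str]] = {
    "checkout_clerk": {"scan_item", "process_payment"},
    "supervisor":     {"scan_item", "process_payment", "override_price"},
}

def is_permitted(role: str, privilege: str) -> bool:
    """Return True if the given role carries the given privilege."""
    return privilege in ROLE_PRIVILEGES.get(role, set())

print(is_permitted("supervisor", "override_price"))      # True
print(is_permitted("checkout_clerk", "override_price"))  # False
```

Handing the analyst this one table answers both questions at once: what the roles are, and exactly which capabilities each one has.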
If you have built security mechanisms into your application, chances are that you’ve already written test scripts or scenarios to test them. Be sure to provide these test scripts and the results to the security analyst. This can also reduce C&A time by eliminating duplicate testing.
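A functional security test script need not be elaborate to be worth handing over. As a hedged illustration, the fragment below tests the hypothetical price-override rule from the video store example; `can_override_price` stands in for whatever access-control check your application actually implements.

```python
# Hypothetical access-control rule standing in for the application's real one:
# only supervisors may override a checkout price.
def can_override_price(role: str) -> bool:
    return role == "supervisor"

def test_supervisor_may_override():
    assert can_override_price("supervisor")

def test_clerk_may_not_override():
    assert not can_override_price("checkout_clerk")

# Run the checks and report results, as a simple hand-rolled script might.
for test in (test_supervisor_may_override, test_clerk_may_not_override):
    test()
    print(f"{test.__name__}: PASS")
```

Scripts like this, together with their recorded results, give the analyst both the test procedure and evidence that it was executed.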
One of the most time-consuming tasks within the C&A effort is the creation of relevant security test scripts. To ensure security mechanisms are fully tested, the security analyst often must spend an exorbitant amount of time studying the application: its design, components, processes, and data relationships. Only then can the analyst create test scripts that verify the system's ability to sufficiently protect its resources and data.
Because security testing requires an in-depth understanding of the application, providing user manuals, administrator guides, and installation instructions helps expedite the learning process. All too often, security analysts spend much of the time allocated in the schedule waiting for information, most of which can be gleaned from these documents.
The key to streamlining the C&A process is preparation. Collecting relevant development artifacts beforehand significantly reduces the time required to complete a C&A package, substantially reduces the potential for failure, and lowers the frustration level for security analyst and program manager alike. Shortening the completion time and reducing the pain ends the C&A effort on a positive note.