Do you trust your developers to write hacker-proof code? Or do you depend on your network administrator to keep your software assets secure? Read what these software developers think about security audits and add your two cents to the discussion.
Who has the ultimate responsibility for the security of a business application: the developer or the network administrator? On one hand, you might assume that applications developed for "internal use only" are safe from intruders as long as network access is adequately guarded. On the other hand, security for Web applications and other programs with public access requires that developers pay particular attention to security issues.
How can you be certain your systems are secure? Hiring a security specialist to test your systems is one way to make sure your developers are putting "bulletproof" applications into production.
Security at the network level
Ed Scott is director of software engineering for a provider of voice-enabled notification services. Although the information managed by his company is highly confidential, Scott has never worked with an internal security auditor or a third-party security specialist.
Scott says his company doesn't assume its developers write hacker-proof code. "Most of our focus on security is at the network and system level rather than at the software development level," he said.
A change in the database platform, however, has directly affected the way Scott's team of developers writes code. "Now that we are using Oracle rather than Pervasive, we have to be careful about handling database passwords," Scott said. "With Pervasive, the databases were protected at the operating system and file level, and we did not use database passwords."
Scott says he has an idea what a security specialist might uncover in an audit of his shop. "There have been requests from some of our customers to build more advanced security features into our products," he said. "One example would be autoexpiration of passwords after a defined amount of time. Most of our applications do not force passwords to be changed at regular intervals."
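The auto-expiration feature Scott describes amounts to comparing a password's age against a policy limit. A minimal sketch in Python, assuming a hypothetical 90-day policy and a stored last-changed timestamp:

```python
from datetime import datetime, timedelta

# Hypothetical policy: force a password change every 90 days.
MAX_PASSWORD_AGE = timedelta(days=90)

def password_expired(last_changed, now=None):
    """Return True if the password is older than the allowed maximum age."""
    now = now or datetime.utcnow()
    return now - last_changed > MAX_PASSWORD_AGE
```

At login, an application would call this check and, if it returns True, refuse the old password until the user sets a new one.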
Password protection is one of the most serious security issues relating to software development. In his book Information Security Policies Made Easy, 8th Edition (PentaSafe Security Technologies, 2001), Charles Cresson Wood writes that "passwords must never be hard-coded (incorporated) into software developed by or modified by Company X workers." According to Wood, storing passwords "in system tables or some other nonsoftware location" is preferred because it allows the organization to modify system access parameters in response to security threats.
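One common way to keep credentials out of source code, along the lines Wood describes, is to read them from the runtime environment. A minimal sketch, assuming hypothetical variable names DB_USER and DB_PASSWORD; rotating a password then means updating the environment or config store, not rebuilding the application:

```python
import os

def load_db_credentials():
    """Read database credentials from the environment, not from source code.

    DB_USER and DB_PASSWORD are illustrative names; any deployment would
    define its own. Failing loudly when they are missing beats silently
    connecting with an empty or default password.
    """
    user = os.environ.get("DB_USER")
    password = os.environ.get("DB_PASSWORD")
    if not user or not password:
        raise RuntimeError("Database credentials are not configured")
    return user, password
```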
Rob Valliere is a consultant who manages Web development and Web site production projects on Windows and Linux platforms in Thailand and Southeast Asia. Asked about security issues that exist at the code level, Valliere said, "The software must be catalogued, documented, and backed up to secure media." Valliere noted that he is often hired by companies to replace missing source code.
In Valliere's opinion, developers should rule out hard-coding server names or passwords into their scripts and applications. "Critical names and passwords should be stored away from code in secure config or database files, so they can then be changed easily and globally.
"With Web applications, server script files like ASP and PHP files are generally safe from Web or FTP users," Valliere said. "I put these critical variables in a single server script file and include this file in application files. Sometimes, for added security, the script file is not stored on the Web root or is stored in a secure SQL database."
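The single-include pattern Valliere describes can be sketched in Python, assuming a hypothetical config file at /etc/myapp/db.ini kept outside the web root so it can never be served to Web or FTP users; every application script reads the same file, so a change takes effect everywhere at once:

```python
import configparser
from pathlib import Path

# Hypothetical location outside the web root, never reachable over HTTP/FTP.
CONFIG_PATH = Path("/etc/myapp/db.ini")

def read_db_config(path=CONFIG_PATH):
    """Load connection settings from the one config file all scripts share."""
    parser = configparser.ConfigParser()
    parser.read(path)
    section = parser["database"]
    return {
        "host": section["host"],
        "user": section["user"],
        "password": section["password"],
    }
```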
Hiring a hacker
Valliere doesn't think every company that develops software requires a security audit, just "larger companies with the budget and with security concerns." In most shops, Valliere said, "developers or network/Web administrators end up doing this function."
So should you hire a hacker to test the security of your applications? Bryan Schardein, a software engineer who works for Ed Scott, says, "If you're creating some type of Internet Web or file server, you'll want peer review amongst your developers to find bugs that might compromise security, as well as to discourage deliberate exploits. But you'll also want to hire external agents like hackers, but only those that have a well-developed reputation in the professional software development community, not your son's friend from high school."
Schardein added another skill set to the wish list: "The security analyst should have not only a strong background in development, but also [in] networking protocols, operating system, and file system design and architecture.
"Whether this background stems from formal education or on-the-job training isn't particularly relevant," Schardein said. "The important thing is that they are familiar with the types of attacks that malevolent entities are likely to attempt, how they exploit weaknesses in the network, operating system, or application software, and how to detect such weaknesses."
The role and rights of the security analyst
Once you make the decision to hire a specialist to evaluate potential security risks in your system, what then? According to Schardein, "It would not be uncommon to start by giving the auditor just as much information about the system as would be available to the general public. Once a preliminary external audit was completed, the auditor could be given full access to all of the information that would be available to the developers themselves, and the audit [could be] performed again." According to Schardein, that approach allows the auditor to evaluate threats from agents both external and internal to the company.
Valliere would prefer limiting access to a test system. "But in the real world," he said, "if the trust is there, access to production systems is given, but access is cancelled after the audit is finished."
In terms of code-level audits, Valliere said, "Unit testing with documentation is a must. Then system documentation and testing are required. Finally, auditing of valid source code for the production versions is needed."
What does the security report say?
If you hire a security specialist, be sure to put in writing the kind of report you expect at the end of the auditing period. According to Valliere, the report should include, at a minimum, the objectives of the audit, verification of the methods used, and specific recommendations.
"Using a Web project as an example," Valliere said, "the report should include ways to fix or improve access control, ways to prevent disasters such as lost source-code files and insecure code files, and ways to recover from disasters."
After the audit, then what?
If you spend the money to hire a security specialist to audit your systems, don't wait forever to act on the results. "An audit is only useful if management reviews and implements recommendations," Valliere said. And that review should happen promptly.
In other words, don't wait until the horse is gone to close the barn door on your applications.