David A. Wheeler described three necessities for developing secure software. Read about them, along with ten challenges facing developers of closed source software who try to satisfy those necessities.
According to David A. Wheeler's Secure Programming for Linux and Unix HOWTO, the three core requirements for developing secure software are as follows:
- First, people have to actually review the code.
- Second, at least some of the people developing and reviewing the code must know how to write secure programs.
- Third, once found, problems need to be fixed quickly and their fixes distributed.
Wheeler's HOWTO is one of the best online resources for anyone who wants to start learning the technical side of writing secure software, and these three principles are non-negotiable necessities for widely distributed, truly secure software design.
Wheeler presents these principles as part of his explanation of why open source software has more potential for security than its closed source counterparts, but they apply to closed source software just as much as to open source software. There's no reason these three principles cannot be properly employed to ensure secure software development in a closed source shop, too.
There are some challenges, though:
- Independent code review tends to be extremely expensive when you require nondisclosure agreements.
- Free code review tends to be scarce with "source available" licensing, because people typically feel they're giving you something for nothing, whereas open source software is in many ways its own reward.
- The more code review, the better -- and even if you get some reasonable amount of review for closed source or "source available" code, you are unlikely to get as much as you could for open source code.
- Most of the best secure code writers understand that open code is the best way to get secure code (see Kerckhoffs' principle), which might make it difficult to hire them if you plan to keep your code closed.
- Hiring and retention decisions in corporate development shops tend to ultimately rest in the hands of people who wouldn't know secure code if it bit them on their noses.
- Corporate responsibility lies with shareholder profits -- not the actual quality of software. This means that any time the ability to generate revenue or reduce costs conflicts with secure coding goals, the secure coding goals are likely to suffer. Given the market value of good programmers who know how to write secure code, it is normal for the human resources department to be authorized only a strictly limited budget for new hire salaries, which in turn limits the ability to hire the best programmers.
- Comprehensive software management systems found in open source Unix-like OSes, such as APT on Debian GNU/Linux and the ports system on FreeBSD, don't carry closed source software anywhere near as often as open source software, and the software management systems for closed source OSes like MS Windows usually don't handle third-party software at all. This means that rapid fix distribution for closed source software usually relies on end users hunting down news of fixes, then acquiring and installing patched versions of the software themselves.
- Corporate responsibility is an important factor for patch distribution too; it is usually in a corporate vendor's best short term financial interest to downplay security vulnerabilities, which often involves deferring development of security patches (sometimes indefinitely) and even hindering the ability of users to find reliable information about vulnerabilities and fixes.
- Identification, development, and testing of security fixes depends on the availability of developers and testers. Releasing software under the terms of an open source license tends to increase the number of available developers and testers significantly.
- As David Wheeler pointed out in the Secure Programming for Linux and Unix HOWTO, you cannot easily make changes to closed source software on the spot the way you can with open source software, provided you have the necessary in-house expertise.
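Kerckhoffs' principle, mentioned above, holds that a cryptosystem should stay secure even when everything about it except the key is public -- the design can be reviewed by anyone, and only the key is secret. A minimal sketch of that idea (a toy XOR cipher for illustration only, not real cryptography):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key.

    The algorithm itself is completely public; per Kerckhoffs'
    principle, any security must rest entirely in the secrecy of
    the key, never in the secrecy of this code.
    """
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(16)                 # the only secret
ciphertext = xor_cipher(b"patch notes", key)  # encrypt
plaintext = xor_cipher(ciphertext, key)       # XOR again to decrypt
assert plaintext == b"patch notes"
```

The point of the sketch is that publishing `xor_cipher` costs the defender nothing; open review of the algorithm is exactly what lets experts judge whether it is any good -- which is why many of the best secure code writers favor open code.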
None of these disadvantages for closed source software is inflexible or absolute. There's no reason closed source software developed by a corporate vendor can't be as secure as an open source equivalent. Still, it should be pretty obvious that, all else being equal, circumstances tend to favor the security of open source software -- at least as far as these principles of software security are concerned.