Tech Tip: Can you learn security in the classroom?

Find out if you can learn security in the classroom.

By Jonathan Yarden

Writing reliable and secure software is a process that's generally easier to talk about than to accomplish. And while only a fraction of software flaws result in exploitable vulnerabilities, finding serious defects in popular software is still cause for concern.

Commercial programmers are notorious for Band-Aid coding to fix obvious software design flaws. However, this compounds security problems rather than fixing them. Most of the time, it's not even the programmers or software testers who find the flaw—it's the person who writes the exploits.
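
To make the distinction concrete, here's a minimal, hypothetical C sketch (not drawn from any real product) of the difference between a Band-Aid patch and an actual fix for an unbounded string copy. The first version simply enlarges the buffer after a crash report; the second removes the overflow itself.

#include <stdio.h>
#include <string.h>

/* Band-Aid "fix": the buffer was enlarged after a crash report,
   but any input longer than 255 bytes still overflows it. */
void greet_user_bandaid(const char *input)
{
    char name[256];          /* was 64; size bumped, design flaw untouched */
    strcpy(name, input);     /* unbounded copy remains exploitable */
    printf("Hello, %s\n", name);
}

/* Real fix: check the length and reject oversized input outright. */
void greet_user_fixed(const char *input)
{
    char name[256];
    if (strlen(input) >= sizeof(name)) {
        fprintf(stderr, "name too long, rejected\n");
        return;
    }
    strcpy(name, input);     /* now provably within bounds */
    printf("Hello, %s\n", name);
}

int main(void)
{
    greet_user_fixed("Jonathan");
    return 0;
}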

The majority of software defects are the result of economic pressure and inexperienced programmers. Add the "rapid application development" culture behind tools such as Visual Basic, which promotes speed over reliability, and you have a recipe for disaster. Designing reliable and secure software requires a different way of thinking.

In an attempt to create a solution, Microsoft funds university classes that train students how to identify potential vulnerabilities in software. Don't make me laugh. It's true that better code comes from better programmers, but we're talking about a company known for many vulnerabilities and security holes that's training people how to find them.

A general computer science curriculum should already address proper programming and debugging techniques; we shouldn't relegate such an important topic to a specialty class. But even then, I'm skeptical that integrating safe coding practices and debugging skills into existing computer science and programming classes leads to better software. Only a rare few students will truly understand how to apply those skills when writing and testing code.
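
For the sake of illustration, this is the kind of contrast a safe-coding module would drill into students (a hypothetical C fragment, not tied to any actual curriculum). The habit of seeing at a glance why the first call is exploitable is exactly what only a handful of them ever internalize.

#include <stdio.h>

/* Classic classroom example: the format-string bug. */
void log_message(const char *user_supplied)
{
    /* Dangerous: user input becomes the format string, so "%x" or "%n"
       sequences let an attacker read from or write to memory. */
    printf(user_supplied);

    /* Safe: the format string is a constant and the input is only data. */
    printf("%s", user_supplied);
}

int main(void)
{
    /* The unsafe call above leaks whatever happens to be on the stack
       or in the argument registers. */
    log_message("hello %x %x");
    return 0;
}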

Furthermore, university computing courses don't provide the kind of training that comes only from years of hands-on programming experience. In fact, the best programmers have a keen interest in computing that goes far beyond the strict computer science approach taken at most universities.

As I see it, great computer programmers have more in common with artists than scientists. And if reliable, secure programming is artwork, then debugging is magic. People who find vulnerabilities and use abnormal program failures to write malicious code exploits are usually self-taught rather than formally educated. They're hard-wired to be code artists, long before they attend computer science classes.

While I applaud Microsoft for its support of higher education, its efforts will do little to improve software security in general. The real goal should be to design applications around security, not to improve the security of programs that are already full of holes.

Jonathan Yarden is the senior UNIX system administrator, network security manager, and senior software architect for a regional ISP.
