WhiteHat Security recently published its 2012 Website Statistics Report (PDF), which gives a broad view of the state of website security. I spoke with Jerry Hoff, WhiteHat Security's VP of Static Code Analysis, to get an idea of how bad the problems with website security are, why we still have these problems, and what software developers can do to remedy them. Sadly, despite much improvement from previous years, there is little cause for celebration.
The report shows a dramatic decline in the number of serious vulnerabilities on the average website, from 1,111 in 2007 to 230 in 2010, and now "only" 79 in 2011. While that is a fantastic drop in percentage terms, the fact that the average website still has 79 serious vulnerabilities is troubling. Even more bothersome are the types of vulnerabilities found in 2011 (in parentheses: the percentage of sites exhibiting each): Cross-Site Scripting (55%), Cross-Site Request Forgery (19%), and the most dangerous (in my opinion) of the bunch, SQL Injection (11%). While this is only a slice of the vulnerabilities found, these particular issues irk me to no end, because they should not exist in 2012; proper Web frameworks, tools, and libraries do a great job of mitigating these problems. When I see an application or website with these problems, I know there is a developer behind the scenes hand-coding things, usually for no good reason, and clearly without the proper skills or knowledge to be hand-coding this kind of stuff. I find SQL Injection vulnerabilities particularly offensive, because in this day and age, it is substantially more work to create code that is vulnerable to them than code that is protected.
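To make that last point concrete, here is a minimal sketch in Python using the standard library's sqlite3 module (the table, names, and attack payload are illustrative, not from the report). The hand-concatenated query takes extra string-formatting effort and is wide open; the parameterized query is less code and safe, because the driver treats the inputs strictly as data:

```python
import sqlite3

# Throwaway in-memory database with an illustrative schema
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(name, password):
    # Hand-built SQL: attacker-controlled input becomes part of the SQL text
    query = ("SELECT COUNT(*) FROM users WHERE name = '%s' AND password = '%s'"
             % (name, password))
    return conn.execute(query).fetchone()[0] > 0

def login_safe(name, password):
    # Parameterized query: placeholders keep data out of the SQL text
    query = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone()[0] > 0

# The classic "' OR '1'='1" payload bypasses the hand-built query,
# but fails against the parameterized one.
print(login_vulnerable("alice", "' OR '1'='1"))  # True: logged in with no password
print(login_safe("alice", "' OR '1'='1"))        # False: payload treated as a literal
```

The placeholder syntax varies by driver (`?`, `%s`, named parameters), but every mainstream database library has one, which is exactly why string-built SQL takes more work than the protected version.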
Mr. Hoff and I talked quite a bit about the underlying reasons for these problems. Something we definitely agreed on is that a lot of developers do not know how to address security concerns. There is really nothing new here; we've been seeing this problem for a long time and have just thrown more and more technology at it to compensate. For example, in the WhiteHat Security report, 71% of the found vulnerabilities can be plugged with an application firewall. But think about that for a moment... 71% of the vulnerabilities are so commonplace that an automated, rules-based tool can zap them. This tells me that, right off the bat, 71% of the problems should not exist in the first place.
The heart of the problem is that the amount of knowledge needed to be a "competent software developer" continues to increase, while the useful lifetime of that knowledge continues to decrease. There are simply not enough hours in the day for someone to work a full-time job as a software developer and get the continuing education needed to stay competent, unless they are willing to sacrifice an awful lot of their personal time or work for one of the rare companies that truly supports learning. To make it even worse, being a good software developer requires at least 5 to 10 years of experience (depending on what you are doing and the environment you are doing it in) on top of the initial foundational learning. So just as developers truly hit their stride, they are in prime territory for life events like marriage, children, and a general slowing of the pace of their lives. I can tell you from personal experience that in the six years since I started writing for TechRepublic, I went from "single, childless adult with nothing to do after work except play video games and do more programming" to "married adult with two children who has watched two films in the last year that were not suitable for a child."
Several weeks ago, Dr. Dobb's released a fantastic salary survey. Skip ahead to page 9 for the worst kick in the head that you'll get all week. The chart shows that the average IT worker's salary peaks between the ages of 36 and 55 and decreases from there. In addition, the average salaries for the 36 - 45 band and the 46 - 55 band are identical. In other words, there is a massive 20-year career period during which someone's value does not increase. In comparison, IT managers show a curve more like you would expect, where salary continues to increase until the 46 - 55 band, and then drops off.
There is no coincidence here. How do I know? Page 10 is the second worst kick in the head of the week: the breakout of IT workers by age. We see a decline from 35% of IT workers in the 46 - 55 age bracket to 14% in the 55+ age bracket. It is clear that developers have a shelf life, and the expiration date is around 55.
All of the education in the world is not worth a hill of beans unless you can internalize it, and that requires a lot of hard real-world lessons, the kind that only those with a lot of experience have learned. My overall level of "in-demand skills" (i.e., the kind that you'd talk about in a job interview or put on a resume) has not gone up in the last five years (if anything, it has gone down), but my base of foundational knowledge has continued to grow. The Justin James of today knows an awful lot more about safe, sane coding practices than the Justin James of 2005 when I started writing for TechRepublic, or the Justin James of 2000 when I graduated college, or the Justin James before that. In 2000, I knew a lot more than the typical Web developer, because I had started in 1995, when Perl was just becoming the hot tech for making Web pages interactive and dynamic. By 2001, I had implemented significant parts of the HTTP protocol by hand and from scratch, and in the process I learned a lot about how Web security works.
This brings me back to the security issue. Without a doubt, if I didn't have nearly 20 years of experience writing software, I would be churning out code full of holes. I know this, because I've seen too many less experienced developers make, over and over again, the same classes of mistakes that I was making in 1997 or 2003, even when I "knew better." I know why I made those mistakes: it was either the hubris of "I can roll my own better than off-the-shelf," or the idea that slapping something together quickly would be fine "for now" and I would pay the technical debt off later. I was wrong on both counts, every single time.
What I've learned since then, when it comes to security, is the following (hardly exhaustive, and in no particular order):
- Sanitize EVERYTHING, including data pulled from databases, files, Web services, or anything else that is not a literal expression within your code.
- Use pre-made tools, frameworks, libraries, code components, etc. to take the burden off your shoulders. It is better to re-work your business logic to use a pre-tested piece of code than to roll your own and be stuck trying to get the security kinks worked out.
- Do not box yourself into a corner with integrations so that a base system is "upgrade proof" or even "upgrade resistant." Applications get critical security patches all the time; if you cannot do what you need or want to do to a system without preventing the base system from being patched or upgraded, either you need to re-evaluate your needs or you need to use a different system. End of story.
- Never generate SQL statements within an application. If you insist on writing your own SQL (hint: you probably can't write it any better than a code generator, unless the task is unusual or you are a SQL wizard, and most tasks are not that unusual and you probably aren't a SQL wizard), put it in stored procedures. If I ran the world, compilers would flag inline SQL as an error and refuse to compile.
- Use the various automated tools out there to perform periodic security audits.
- Treat security fixes as your most important features.
- Approach your code with the mindset of an attacker. If you wanted to vandalize the system, destroy data, or implant little bombs for users, what windows are wide open and which doors are unlocked?
- Keep learning, keep working, keep studying, and keep talking to people. Get exposure to different ideas. Find out how other people are solving problems, and investigate them!
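The "sanitize everything" advice at the top of this list has an output-side twin: escape untrusted text whenever it lands in HTML, or you hand attackers the Cross-Site Scripting hole from the WhiteHat report. A minimal Python sketch (the function and payload are illustrative, not from the article), using the standard library's html.escape:

```python
import html

def render_comment(comment):
    # Escape untrusted text at output time: html.escape converts
    # <, >, &, and quotes into HTML entities, so the browser renders
    # the attacker's input as inert text instead of executing it.
    return "<p>%s</p>" % html.escape(comment)

payload = "<script>alert('xss')</script>"
rendered = render_comment(payload)

print(rendered)
# The literal "<script>" tag no longer appears in the output
print("<script>" in rendered)  # False
```

Modern template engines do this escaping by default, which is exactly the "use pre-made tools" point: you only create this hole when you hand-build HTML strings yourself.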
Justin James is an OutSystems MVP, architect, and developer with expertise in SaaS applications and enterprise applications.