
It’s not that developers don’t care about security–it’s that security, historically, hasn’t cared about developers. Security technologies like LDAP, JWT, and OAuth weren’t built with developers in mind; they make it harder, not easier, for developers to implement strong security in the applications they build. As analyst Stephen O’Grady has written, implementing security is often seen as “merely unsexy work in a best case scenario and more likely to be both tedious and miserable.”

Compounding the problem, developers are already stretched thin by other demands on their time: last-minute feature requests (on top of myriad planned ones), critical bug fixes, performance tuning, and more.

And yet the future of developer-driven security finally looks bright, as software vendors increasingly offer tools that cater to developers. From HashiCorp to Snyk to oso, we’re finally seeing security embrace the developer class, and it couldn’t have come at a more opportune time.

A winning formula for security

Security was already hard, and the coronavirus pandemic has made it harder. The perimeter defense IT has traditionally applied to keeping enterprise data safe was already breaking, but it became porous beyond repair once employees shifted to work-from-home arrangements en masse. From insecure network access via personal laptops to significant increases in phishing attempts, “WFH” has become a euphemism for “welcome for hackers.”


Pair this with a world that has traditionally treated security either as a bolt-on afterthought or as “something my information security team worries about,” and it’s not hard to divine why enterprise security remains a bit of a mess. Add the growing expectation that developers will shoulder more responsibility for security (fiddling with VPC settings and futzing with security groups in their cloud infrastructure, for example), and the mess has the potential to get much messier.

For this reason, O’Grady’s comments, written with programming languages in mind, help to frame what needs to happen–and increasingly is happening–in security products:

[S]ecurity is generally viewed as…unsexy work in a best case scenario and more likely to be both tedious and miserable. As enterprises have shifted more and more of their business to digitally based models, however, the importance of security from an application development perspective skyrockets.

Its importance notwithstanding, security is never going to be something that a majority of developers get excited about and want to spend cycles on. If a language is able to build in some security, however, if only to protect developers from some particular subset of vulnerabilities, that is an increasingly attractive feature – particularly if it doesn’t require too many compromises.

Built-in security with few compromises…that sounds like a winning formula.

The best developer-centric security products

It is, and that movement is growing. Honeycomb cofounder and CTO Charity Majors expressed it this way: “There is a small but hardy wave of startups applying product and design sense to terrible backend problems. Building products for engineers…like they’re human.” Such “consumer-quality developer tools,” she went on to say, acknowledge both the severity of the problem and the humanity (and time) of the people deploying the software:

I love vim. We all love our power tools. But there are steep learning curves and cognitive limits, and when you are trying to work the last thing you need is to be fishing for flags and syntax and magic options. The next wave of developer tooling will be tools that get out of your way and help you solve business problems with your whole brain, without having to dedicate your life to learning it first.

The unsexy world of security tooling just got sexy, in other words. Or it can be, done correctly.


So who is doing this right? While not a comprehensive list, some of the best companies that make it relatively simple for developers to implement strong security include:

  • oso: An open source policy engine for authorization that represents security as code, so developers can express authorization logic as a natural extension of their applications;

  • HashiCorp Vault: Secures, stores, and controls access to tokens, passwords, certificates, and encryption keys for protecting secrets and other sensitive data, via a UI, CLI, or HTTP API;

  • Snyk: Lets developers build security into their continuous development process, with vulnerability scanning for open source dependencies and containers;

  • Sqreen: Application security with automation (Sqreen is covered in this TechRepublic article);

  • Smallstep: An end-to-end workflow for single sign-on SSH; and

  • Semmle (acquired by GitHub): A code analysis platform that helps teams find zero-days and automate variant analysis.
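To make “security as code” concrete, here is a minimal sketch of the pattern these tools enable: authorization rules written as ordinary application code, reviewable and testable like any other feature. The classes and rules below are hypothetical illustrations of the idea, not oso’s actual API.

```python
# Hypothetical sketch of "security as code": authorization rules live in
# the application itself rather than in an external security system.
# (Illustrative only -- not any vendor's real API.)
from dataclasses import dataclass

@dataclass
class User:
    name: str
    role: str

@dataclass
class Document:
    owner: str

def is_allowed(user: User, action: str, doc: Document) -> bool:
    """Return True if the user may perform the action on the document."""
    if user.role == "admin":
        return True  # rule 1: admins may do anything
    if user.name == doc.owner and action in ("read", "edit"):
        return True  # rule 2: owners may read and edit their own documents
    return False     # default deny

alice = User("alice", "member")
report = Document(owner="alice")
print(is_allowed(alice, "edit", report))                  # True: alice owns it
print(is_allowed(User("bob", "member"), "edit", report))  # False: default deny
```

Because the policy is ordinary code, it can be unit-tested and code-reviewed alongside the features it protects, which is exactly the workflow fit described below.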

For these and other developer-oriented security products to work, they need to fit into the developer’s natural workflow (i.e., their preferred toolchain, among other things). They need great documentation, a consumer-grade experience (per Majors’ advice), robust APIs, and, often, an open source core (not always, but developers continue to love open source even in our cloud era). If this sounds like too much, it’s not. It simply requires security to be considered part of the application development process–that is, part of the developer’s job–and not an afterthought that “someone else” will handle.

It also means that selling such products involves, well, less selling. You don’t sell developer security on the golf course. You “sell” it in the docs and in the (PowerPoint-free) demos. You sell it, in other words, by simply making security a natural, built-in part of the development process; something that developers love to include. Going forward, this is what good security practice looks like.

Disclosure: I work for AWS, but these views are mine and may not reflect those of my employer.