Software makers should face legal action if sloppy coding leads to hackers emptying users' bank accounts, argues a Cambridge academic.
If you're poisoned by a burger you can sue the restaurant that sold it - so why can't you take a software developer to court if their negligent coding lets hackers empty your bank account?
That's the question asked by University of Cambridge security researcher Dr Richard Clayton - who is calling for software makers to be made liable for damage resulting from avoidable security flaws in their apps.
Today software generally comes with an End-User License Agreement (EULA) that requires the user to sign away the right to sue the developer if the app contains security flaws that leave the user's computer open to attack by malware.
Clayton is arguing for regulations that remove the developer's right to waive responsibility for security flaws in their software. The argument has already won support from officials across Europe: a House of Lords committee recommended such a measure in 2007, and European Commissioners argued for the requirement in 2009. However, no agreement to that effect has been passed.
"It's remarkable that of all the things that you could buy as a consumer, software is the one where you're expected to make up your mind whether it's dangerous," Clayton says.
"We've been saying for some years that what is required is to make people [developers] responsible for when they damage other people. If you went down to the corner of your street and started selling hamburgers to passers-by they can sue you [for any damage you cause]."
Clayton thinks that developers should be held accountable in cases where avoidable security holes in their software are exploited to infect a user with malware, and that user suffers some form of material loss - for instance the theft of money.
The source of the infection would be established in court with the help of subject-matter experts, and the definition of which flaws are avoidable would be informed by legal precedent from earlier cases, Clayton says.
"The question is 'Are they being negligent?'. The usual test is 'Are they applying contemporary standards to the quality of their work?'," he says, adding that known flaws can be exposed by running code through commonly available security tools and validation suites.
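A minimal sketch of the kind of "known flaw" those tools flag - the names and the SQL-injection scenario here are illustrative assumptions, not examples cited by Clayton:

```python
import sqlite3

# A classic avoidable flaw: building SQL by string interpolation lets an
# attacker's input rewrite the query (SQL injection). Widely available
# static-analysis tools flag this pattern automatically.

def find_user_unsafe(conn, name):
    # Vulnerable: user input is pasted directly into the SQL text.
    return conn.execute(
        f"SELECT id FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(conn, name):
    # Fixed: a parameterised query treats the input as data, not SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

payload = "x' OR '1'='1"  # classic injection input
print(find_user_unsafe(conn, payload))  # the flaw exposes every row
print(find_user_safe(conn, payload))    # the fixed query returns nothing
```

Under Clayton's "contemporary standards" test, shipping the first function when tooling that catches it is commonly available is the sort of thing a court could weigh as negligence.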
The argument for developer liability also goes beyond increased consumer protection, to providing developers with additional incentives to minimise security holes in their software.
Clayton argues that in order for any agreement on developer liability to succeed it will have to be internationally binding.
"It's not going to be easy. There's going to be a lot of moaning from everybody inside [the industry] and we're not going to do it as one country, we're going to have to do it on a global basis and over many years."
There are cases where end-users have launched legal action against IT vendors over alleged security flaws in their software. Stewart James, partner with law firm DLA Piper, said that were several such cases to succeed, they could set a precedent and improve the prospects of subsequent actions. However, even successful actions would not guarantee greater liability for software makers, as developers could respond by changing the conditions of their EULAs.
James said he was sceptical about how successful any new regulations making software makers liable for damage resulting from coding flaws would be, given the number of ways that developers could shift blame to the end-user: for instance by claiming the end-user failed to follow accepted IT security practices.
"There are lots of get-outs that a software developer would look to use to defend against a claim, for example, 'Has the user updated to the latest version of software that may have closed off some of those vulnerabilities?'," he said.
Clayton and other supporters of developer liability are facing powerful opposition. Given the potential size of the liability - estimates of annual malware-related losses typically run into the billions of dollars - the software industry is likely to lobby hard against any such measure.
Perhaps unsurprisingly, the software lobby argues that its members already make their software as secure as they can, given the complexity of the code underlying applications. When the matter was debated in the House of Lords in 2007, software vendors argued against it by analogy: when a home is burgled, the victim doesn't usually ask the maker of the door or window to compensate them.
Another argument against liability put forward by some developers is that it would stifle innovation and interoperability between apps, as software makers would stop their apps from interacting with third-party code to guard against undesirable results.
There is also the question of who is liable for flaws in open source software, where there is often no clear individual or group responsible for its development. When the Lords debated the matter, it was argued there should be an exemption for individuals who voluntarily contribute to such projects.