Should you change your passwords often? What’s the risk if you don’t? Little did I know that listening to one podcast would make me rethink how I’d answer those questions.
—————————————————————————————
I now understand why my friend insisted I listen to Episode 229 of the Security Now series. He wanted to introduce me to Cormac Herley, Principal Researcher at Microsoft, and his paper, “So Long, and No Thanks for the Externalities: The Rational Rejection of Security Advice by Users.”
Dr. Herley introduced the paper this past September at the New Security Paradigms Workshop, a fitting venue. See if you agree after reading the group’s mandate:
“NSPW’s focus is on work that challenges the dominant approaches and perspectives in computer security. In the past, such challenges have taken the form of critiques of existing practice as well as novel, sometimes controversial, and often immature approaches to defending computer systems.
By providing a forum for important security research that isn’t suitable for mainstream security venues, NSPW aims to foster paradigm shifts in information security.”
Herley’s paper is of special interest to the group. Not only does it meet NSPW’s tenet of being outside the mainstream, it also forces a rethink of what’s important when it comes to computer security.
Radical thinking
To get an idea of what the paper is about, here’s a quote from the introduction:
“We argue that users’ rejection of the security advice they receive is entirely rational from an economic perspective. The advice offers to shield them from the direct costs of attacks, but burdens them with far greater indirect costs in the form of effort. Looking at various examples of security advice we find that the advice is complex and growing, but the benefit is largely speculative or moot.”
The above diagram (courtesy of Cormac Herley) shows what he considers direct and indirect costs. So, is Herley saying that heeding advice about computer security is not worth it? Let’s find out.
Who’s right
Researchers have different ideas as to why people fail to use security measures. Some feel that regardless of what happens, users will only do the minimum required. Others believe security tasks are rejected because users consider them to be a pain. A third group maintains user education is not working.
Herley offers a different viewpoint. He contends that user rejection of security advice is based entirely on the economics of the process, and he offers the following reasons why:
- Users understand that there is no assurance heeding advice will protect them from attacks.
- Users also know that each additional security measure adds cost.
- Users perceive attacks to be rare. Not so with security advice; it is a constant burden, and thus costs more than an actual attack would.
To explain
As I read the paper, I sensed Herley was coaxing me to stop thinking like an IT professional and start thinking like a mainstream user. That way, I would understand the following:
- The sheer volume of advice is overwhelming. There is no way to keep up with it. Besides that, the advice is fluid; what’s right one day may not be the next. I agree: this link is to US-CERT security bulletins for just the week of March 1, 2010.
- The typical user does not always see benefit from heeding security advice. I once again agree. Try explaining to someone whose password was stolen by a keylogger why a strong password is important.
- The benefit of heeding security advice is speculative. I checked and could not find significant data on the number and severity of attacks users encounter, let alone data quantifying the benefit users get from following security advice.
Cost versus benefit
I wasn’t making the connection between cost-benefit trade-offs and IT security. My son, an astute business-type, had to explain that costs and benefits do not always directly refer to financial gains or losses. After hearing that, things started making sense. One such cost analysis was described by Steve Gibson in the podcast.
Gibson simply asked: how often do you require passwords to be changed? I asked several system administrators what time frame they used; most responded once a month. Using Herley’s logic, that means an attacker who steals a password potentially has a whole month to use it.
So, is the cost of having users struggle with a new password every month worth the benefit? Before you answer, you may also want to think about the bad practices users adopt because of the frequent-change policy:
- By the time a user is comfortable with a password, it’s time to change it. So, users opt to write passwords down. That’s a whole other debate; ask Bruce Schneier.
- Users know how many old passwords the system remembers and cycle through that many, which allows them to keep using the same one.
Is anything truly gained by having passwords changed often? The only benefit I see is if the attacker sits on a stolen password long enough that it gets changed before they use it. What’s your opinion? Is changing passwords monthly a benefit or a cost?
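To make the cycling workaround in that second bullet concrete, here is a minimal sketch of a “remember the last N passwords” policy and how a user defeats it. This is my own illustration, not anything from Herley’s paper or Gibson’s podcast; the history size and the names are arbitrary assumptions.

```python
# Minimal sketch of a "remember the last N passwords" policy and how users
# defeat it. The names and the history size are illustrative assumptions.

HISTORY_SIZE = 5  # hypothetical: the system remembers the last 5 passwords


class PasswordHistory:
    def __init__(self):
        self.recent = []  # most recent passwords, newest last

    def change_password(self, new_password):
        if new_password in self.recent:
            raise ValueError("reuse of a remembered password is not allowed")
        self.recent.append(new_password)
        # Keep only the last HISTORY_SIZE entries.
        self.recent = self.recent[-HISTORY_SIZE:]


# A user who wants to keep "Spring2010!" simply burns through HISTORY_SIZE
# throwaway passwords in one sitting, then sets the old one again.
history = PasswordHistory()
history.change_password("Spring2010!")

for i in range(HISTORY_SIZE):
    history.change_password(f"throwaway-{i}")

history.change_password("Spring2010!")  # accepted: the original has aged out
print(history.recent)
```

Five throwaway changes and the user is right back to the password they started with, which says something about what the policy actually buys.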
Dr. Herley does an in-depth cost-benefit analysis in three specific areas: password rules, phishing URLs, and SSL certificate errors. I would like to spend some time with each.
Password rules
Password rules place the entire burden on the user, so users feel the full cost of having to abide by rules like the following (a rough sketch of what such a policy looks like in code appears just after the list):
- Length
- Composition (e.g. digits, special characters)
- Non-dictionary words (in any language)
- Don’t write it down
- Don’t share it with anyone
- Change it often
- Don’t re-use passwords across sites
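Here is that sketch. It is purely illustrative, my own code rather than anything from the paper; the length threshold, the regular expressions, and the tiny stand-in word list are all assumptions.

```python
import re

# Illustrative policy thresholds; real systems vary.
MIN_LENGTH = 8
DICTIONARY = {"password", "letmein", "welcome"}  # stand-in for a real word list


def check_password(candidate: str) -> list[str]:
    """Return the list of composition rules the candidate password violates."""
    problems = []
    if len(candidate) < MIN_LENGTH:
        problems.append(f"shorter than {MIN_LENGTH} characters")
    if not re.search(r"\d", candidate):
        problems.append("no digit")
    if not re.search(r"[^A-Za-z0-9]", candidate):
        problems.append("no special character")
    if candidate.lower() in DICTIONARY:
        problems.append("dictionary word")
    return problems


print(check_password("welcome"))        # fails several rules
print(check_password("T0ugh-2-Guess"))  # passes the composition checks
```

Each rule in the list above adds another branch like these, and every branch is effort the user, not the system, absorbs.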
The report proceeds to explain how each rule is not really helpful. For example, the first three rules are not as important as they seem, because most applications and Web sites have a lockout rule that restricts access after so many failed tries. I already touched on why “Change it often” is not considered helpful.
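To see why a lockout rule blunts the value of the composition rules, here is a minimal sketch, again my own and with an arbitrary threshold: once online guessing is capped at a few attempts, the attacker gets the same tiny number of tries whether the password is weak or strong.

```python
MAX_ATTEMPTS = 3  # hypothetical lockout threshold


class Account:
    def __init__(self, password):
        self._password = password
        self._failed_attempts = 0
        self.locked = False

    def try_login(self, guess):
        if self.locked:
            return "account locked; contact the help desk"
        if guess == self._password:
            self._failed_attempts = 0
            return "logged in"
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            self.locked = True
        return "wrong password"


# Even a weak password survives online guessing when the attacker only
# gets MAX_ATTEMPTS tries before the account locks.
account = Account("kitten1")
for guess in ("123456", "password", "qwerty", "kitten1"):
    print(account.try_login(guess))
```

The sketch only covers online guessing, which is the scenario the lockout argument applies to.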
When all is said and done, users know that strictly observing the above rules is no guarantee of being safe from exploits. That makes it difficult for them to justify the additional effort and associated cost.
Phishing URLs
Trying to explain URL spoofing to users is complicated. Besides, by the time you get through half of the possible variations, most users have stopped listening. For example, the following slide (courtesy of Cormac Herley) lists some spoofed URLs for PayPal:
To reduce the cost to users, Herley wants to turn this around. He explains that users need a way to know when a URL is good, not just when it is bad:
“The main difficulty in teaching users to read URLs is that in certain cases this allows users to know when something is bad, but it never gives a guarantee that something is good. Thus the advice cannot be exhaustive and is full of exceptions.”
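To get a feel for why that is, here is a small sketch, my own illustration with made-up spoof URLs, contrasting the advice users often hear (“look for paypal.com in the address”) with a stricter check on where the host name actually ends.

```python
from urllib.parse import urlparse

LEGITIMATE_DOMAIN = "paypal.com"


def naive_check(url: str) -> bool:
    """The advice users often hear: 'look for paypal.com in the address'."""
    return LEGITIMATE_DOMAIN in url


def stricter_check(url: str) -> bool:
    """Accept only hosts that are paypal.com or end in .paypal.com."""
    host = urlparse(url).hostname or ""
    return host == LEGITIMATE_DOMAIN or host.endswith("." + LEGITIMATE_DOMAIN)


# Made-up spoofs of the kind shown on Herley's slide.
urls = [
    "https://www.paypal.com/signin",             # genuine
    "http://www.paypal.com.example.net/signin",  # spoof: wrong registrable domain
    "http://example.net/paypal.com/signin",      # spoof: brand name in the path
]

for url in urls:
    print(naive_check(url), stricter_check(url), url)
```

The naive check passes all three URLs; the stricter check passes only the genuine one. And even the stricter check is no guarantee that a site is good; it says nothing about look-alike domains or a compromised legitimate server, which is exactly the exhaustiveness problem Herley describes.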
Certificate errors
For the most part, people understand SSL, the significance of https, and are willing to put up with the additional burden to keep their personal and financial information safe. Certificate errors are a different matter. Users do not understand their significance and for the most part ignore them.
I’m as guilty as the next person when it comes to certificate warnings. I feel like I’m taking a chance, yet what other options are available? After reading the report, I am not as concerned. Why? According to the report, virtually all certificate errors are false positives.
The report also points out the irony of assuming that ignored certificate warnings will lead to problems. Typically, the bad guys do not use SSL on their phishing sites, and if they do, they make sure their certificates work; they don’t want to bring any undue attention to their exploit. Herley states it this way:
“Even if 100% of certificate errors are false positives it does not mean that we can dispense with certificates. However, it does mean that for users the idea that certificate errors are a useful tool in protecting them from harm is entirely abstract and not evidence-based. The effort we ask of them is real, while the harm we warn them of is theoretical.”
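For anyone curious what a certificate error looks like outside the browser, here is a sketch using Python’s standard ssl module. The badssl.com test host in the second call deliberately serves a self-signed certificate (assuming that test service is still available); the point is simply that expired, self-signed, and mismatched certificates, the benign misconfigurations behind most warnings, all surface as the same verification failure.

```python
import socket
import ssl


def check_certificate(host: str, port: int = 443) -> None:
    """Attempt a verified TLS handshake and report any certificate problem."""
    context = ssl.create_default_context()  # verifies the chain and host name
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                print(f"{host}: certificate checks out")
    except ssl.SSLCertVerificationError as err:
        # Expired, self-signed, and mismatched certificates all land here.
        print(f"{host}: certificate error: {err.verify_message}")
    except OSError as err:
        print(f"{host}: could not connect: {err}")


check_certificate("www.paypal.com")          # expected to verify cleanly
check_certificate("self-signed.badssl.com")  # deliberately broken test host
```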
Outside the box
There you have it. Is that radical-enough thinking for you? It is for me. That said, Dr. Herley offers the following advice:
“We do not wish to give the impression that all security advice is counter-productive. In fact, we believe our conclusions are encouraging rather than discouraging. We have argued that the cost-benefit trade off for most security advice is simply unfavorable: users are offered too little benefit for too much cost.
Better advice might produce a different outcome. This is better than the alternative hypothesis that users are irrational. This suggests that security advice that has compelling cost-benefit trade off has real chance of user adoption. However, the costs and benefits have to be those the user cares about, not those we think the user ought to care about.”
Herley offers the following advice to help us get out of this mess:
- We need an estimate of the victimization rate for any exploit when designing appropriate security advice. Without this we end up doing worst-case risk analysis.
- User education is a cost borne by the whole population, while offering benefit only to the fraction that fall victim. Thus the cost of any security advice should be in proportion to the victimization rate.
- Retiring advice that is no longer compelling is necessary. Many of the instructions with which we burden users do little to address the current harms that they face.
- We must prioritize advice. In trying to defend everything we end up defending nothing. When we provide long lists of unordered advice we abdicate all opportunity to have influence and abandon users to fend for themselves.
- We must respect users’ time and effort. Viewing the user’s time as worth $2.6 billion an hour is a better starting point than valuing it at zero.
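That $2.6 billion figure sounds less arbitrary once you do the arithmetic. The back-of-envelope below is my own reconstruction rather than a quote from the paper, assuming roughly 180 million US adults online and valuing an hour of their time at twice the 2009 federal minimum wage of $7.25.

```python
# Back-of-envelope reconstruction of the "$2.6 billion an hour" figure.
# Both inputs are assumptions for illustration, not numbers from this article.
ONLINE_US_ADULTS = 180_000_000   # rough figure for 2009
HOURLY_VALUE = 2 * 7.25          # twice the 2009 US federal minimum wage

collective_hour = ONLINE_US_ADULTS * HOURLY_VALUE
print(f"${collective_hour / 1e9:.1f} billion per collective hour of user effort")
# prints: $2.6 billion per collective hour of user effort
```

However Herley actually arrives at the number, the order of magnitude is the point: asking every user for even a little extra effort is enormously expensive in aggregate.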
Final thoughts
The big-picture idea I am taking away from Dr. Herley’s paper is that users have never actually been offered security. All the advice, policies, directives, and whatnot offered in the name of IT security only promise reduced risk. Could changing that be the paradigm shift needed to get information security on track?
I want to thank Dr. Cormac Herley for his thought-provoking paper and e-mail conversation.