Peter Cochrane's Blog: Warped perceptions

Real threats ignored, risks hyped up…

Written at Schiphol Airport in the Netherlands and dispatched via an airline lounge wi-fi service.

Some years ago I was struck by the amount of time people spent at my local supermarket trying to decide which pack of bacon to buy. Since the packs were almost identical, any time spent on deliberation must have been wasted.

My interest turned to fruit, vegetables and other goods. It seemed universal: an inverse relationship exists between time expended and item value. And this phenomenon even extends to electronic goods, furniture, cars and homes.

This caused me to think about security and I drew up the following rules:

  1. Resources are deployed in inverse proportion to actual risk.
  2. Perceived risk never equals actual risk.
  3. Security people are never their own customer.
  4. Cracking systems is 100 times more fun than defending them.
  5. Security standards are an oxymoron.
  6. There is always a threat.
  7. The biggest threat always comes from the direction you're least expecting.
  8. You need two security departments - one to defend and one to attack.
  9. People, irrationally, expect 100 per cent electronic security.
  10. Nothing is 100 per cent secure.
  11. Security and operational requirements are mutually exclusive.
  12. Hackers are smarter and younger than you.
  13. Legislation and management thinking always lag years behind threats.
  14. As life becomes faster everything becomes less secure.
  15. People are the number one risk factor. Machines may seem perverse but they aren't devious - yet.

The big problem is that this is mostly conjecture. But I recently turned my attention to the media and scare stories, where there is a huge amount of data.

Ask anybody and they will tell you terrorism is a big deal. So let's just do a broad-brush comparison based on reported death rates and see what the risk really looks like.

Terrorists kill fewer people per year than:

  • Road deaths or medical malpractice worldwide per day.
  • Falling down stairs, crossing the road or animal attacks per year.
  • DIY, maternity problems, HIV/Aids, infected water supplies per year.
  • Adverse drug reactions per year.
  • Almost any single major natural disaster befalling humanity per year.

Surprised yet? To get a feel for the extent of the reality-skewing that goes on, it is also worth looking at specific disaster reports and media predictions:

On 26 April 1986 the Chernobyl reactor exploded. Initial media reports for short-term deaths ranged from 2,000 to 30,000 individuals. But in 2005 a UN report found the actual number to be fewer than 50 deaths. That is quite an error, spanning 40:1 to 600:1 respectively.

The long-term prognosis was even worse, with total deaths forecast ranging between 150,000 and 3,500,000 individuals. Again, the UN Commission findings of 2005 were in stark contrast at an estimated 4,000 total. This time the error spans 37:1 to 870:1.
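
For anyone who wants to check the arithmetic, the ratios are simply the forecast figures divided by the later UN estimate. Here is a minimal sketch in Python, using only the numbers quoted above - the labels and structure are illustrative, not taken from the original reports:

  # Rough sketch of the error-ratio arithmetic, using the figures quoted in this piece.
  estimates = {
      "short-term deaths": (2_000, 30_000, 50),         # (low forecast, high forecast, UN 2005 figure)
      "long-term deaths": (150_000, 3_500_000, 4_000),
  }

  for label, (low, high, actual) in estimates.items():
      # Error ratio = media forecast divided by the later UN estimate
      print(f"{label}: about {low / actual:.0f}:1 to {high / actual:.0f}:1")

  # Prints roughly 40:1 to 600:1 and 38:1 to 875:1 - in line with the
  # rounded ratios quoted in the text.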

I could cite many more. The examples seem endless: Y2K, the coming ice age, various pandemics, and so on. But there appears to be a consistent story, and gross errors are the norm. And I mean orders of magnitude, not factors of just two or three. What is happening?

In a world that is increasingly connected, with abundant computing power able to model almost all situations and events, this seems to be quite a paradox. Not so.

The reality is our media and political establishments are more or less bereft of any machine-based support - unless, that is, we are talking sporting events and elections.

So in a quirky twist it seems our best computing capabilities are used to track and predict the outcomes of sports and political battles - and predict stock prices, of course. But we don't seem to employ any significant resources when allocating national budgets to society's serious problems.

There is probably another vital driver - the need for news and, better, news that is sensational. The media is fuelled by advertising that demands eyes and ears, so no news ultimately means no money. Yep, I reckon good old-fashioned hype is in there biasing the perception of public and politician alike.

Clearly, the lack of an accurate picture is leading to budget waste across the board. This is perhaps the best example of simple-minded reactionary management - pronounced ignorance - always being very expensive.

If our societies are not to continue wasting billions on non-threats and non-problems we are going to have to get far more professional.

Our only solution is to use our technology to gather accurate information and model the impact on society relative to all other risks and events. Not to do so will result in escalating waste as the number of potential threats will most likely continue to rise.

Of course there is always the argument that if we had done nothing, then it would have been much worse. But that is exactly my point - we don't actually know. Certainly, we model and plan wars well. But if only we could do the same for peace.

The resources and skills required are the same - but it seems we choose not to invest in getting the really important things right.

About Peter Cochrane

Peter Cochrane is an engineer, scientist, entrepreneur, futurist and consultant. He is the former CTO and head of research at BT, with a career in telecoms and IT spanning more than 40 years.
