...this concept of large-scale data collection and analysis. In the Hudson, 98 per cent of the changes in water quality happen only two per cent of the time.
That statistic is an extreme example of the Pareto Principle, more commonly known as the 80-20 rule. But to catch the changes that occur in that two per cent of the time, you need to be sampling data 100 per cent of the time.
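A small simulation makes the sampling point concrete. The numbers below are purely illustrative, not real Beacon Institute data: a day of per-second readings that are flat except for a few short bursts of change, mimicking a signal where nearly all the variation is packed into a tiny slice of time.

```python
# Illustrative sketch: a signal that changes only in rare, short bursts.
# All values are invented for demonstration purposes.
SECONDS = 86_400  # one day of per-second readings
EVENTS = [(3_600, 300), (40_000, 200), (70_000, 400)]  # (start, duration)

# Baseline reading of 10.0, stepping up to 15.0 during each event.
readings = [10.0] * SECONDS
for start, duration in EVENTS:
    for t in range(start, start + duration):
        readings[t] = 15.0

def events_detected(sample_every: int) -> int:
    """Count how many events are caught when sampling once every
    `sample_every` seconds."""
    caught = set()
    for t in range(0, SECONDS, sample_every):
        if readings[t] > 10.0:
            for i, (start, duration) in enumerate(EVENTS):
                if start <= t < start + duration:
                    caught.add(i)
    return len(caught)

print(events_detected(1))      # continuous sampling catches all 3 events
print(events_detected(3_600))  # hourly spot checks catch only 1
```

Continuous sampling sees every burst; an hourly spot check happens to land inside one event and misses the other two entirely. That, in miniature, is why intermittent sampling cannot tell you when the critical two per cent is happening.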
How many data points do you have in the entire supply chain of your business? Or do you rely on spreadsheets full of guesstimates and management judgements made on intuition, rather than data?
It's clear that programmes of data collection and real-time analysis, such as the Beacon Institute work on the Hudson, will apply across the wider corporate world. And that's not just to say we should be exploring more data warehousing and business intelligence. That's only the first step.
Executives are going to start demanding information that allows fact-based decision-making and it will be the IT department or supplier that is called on to find where that information exists.
But there will always be those who think they know better, and such people often block innovation. Cronin has done extensive research into predicting water flows using radar, so when the recent BP oil spill occurred in the Gulf of Mexico, he volunteered his services to the US government.
He estimated it would take four weeks to set up the equipment across the Gulf, but once running it would have provided an extremely accurate picture of where the spilled oil was heading - in real time.
The government said four weeks was too long. More than three months later, it looks as if BP has successfully capped the leak - but who knows where the oil that has already escaped is heading?
Is your organisation thinking about data analytics in the same way as the US government?
Mark Kobayashi-Hillary is the author of Who Moved my Job? and Global Services. He lectures at London South Bank University.