Earlier this year, I was engaged in a supply chain risk avoidance project. One solution we evaluated was a third-party analytics tool that collected GPS, weather, economic, financial, and political data from around the world, and then combined it with details about specific suppliers, such as their financial health and capabilities, and how critical the components they supplied were to enterprise supply chains.
Through a series of analytics algorithms, the software assessed all of this data against key metrics and red-flagged suppliers that were both mission critical to an enterprise's supply chain and considered vulnerable for any of a number of reasons.
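The red-flagging logic can be sketched in a few lines. This is a minimal illustration, not the actual tool: the field names, the vulnerability weighting, and the thresholds are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Supplier:
    """Hypothetical supplier record; fields are illustrative only."""
    name: str
    criticality: float       # 0-1: how essential its components are
    financial_health: float  # 0-1: higher means financially healthier
    regional_risk: float     # 0-1: weather/political/economic exposure

def red_flagged(s: Supplier,
                criticality_min: float = 0.7,
                vulnerability_min: float = 0.6) -> bool:
    """Flag suppliers that are both mission critical and vulnerable.
    The equal weighting of financial and regional risk is an assumption."""
    vulnerability = 0.5 * (1 - s.financial_health) + 0.5 * s.regional_risk
    return s.criticality >= criticality_min and vulnerability >= vulnerability_min

suppliers = [
    Supplier("Acme Castings", 0.9, 0.3, 0.8),     # critical and shaky
    Supplier("Globex Fasteners", 0.9, 0.9, 0.1),  # critical but healthy
    Supplier("Initech Plastics", 0.2, 0.2, 0.9),  # vulnerable but not critical
]
print([s.name for s in suppliers if red_flagged(s)])  # -> ['Acme Castings']
```

The point of the two-threshold design is that neither criticality nor vulnerability alone triggers a flag; only suppliers that score high on both warrant attention.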
The goals for the enterprise were to identify its high-risk, mission-critical suppliers before they became problems, and to look for ways to lower the risk.
This is just one of many projects where organizations have been looking at big data and the power of predictive analytics to lower risks and improve performance.
In another case, the U.S. National Science Foundation (NSF) and the Japan Science and Technology Agency (JST) are collaborating on disaster management research that utilizes big data to predict or anticipate disastrous events. In the process, they are defining systems capable of making sense of large, noisy, and heterogeneous data, with the goal of facilitating timely disaster decision making. These systems require non-linear, associative reasoning for situational analysis and response modeling, in addition to real-time data sensing, visualization, analysis, and prediction. The work is promising. Will we one day be able to predict the next major earthquake, and take measures beforehand that preserve human life and minimize destructive impact?
While people get excited at the prospects, there are also voices and projects warning that big data has its limits. In 1999, for instance, the Mars Climate Orbiter disintegrated in the upper atmosphere of Mars because one engineering team supplied data in imperial units while NASA's navigation software expected metric units.
During Hurricane Sandy, 20 million tweets helped humanitarian organizations route aid to people they didn't even know needed it, and tweets, Facebook posts, and photos assisted aid organizations during the typhoon that struck the central Philippines. Yet a 2013 UN report found that many decisions made by humanitarian professionals during disasters are not based on any kind of empirical data.
Pundits also warn governments and companies against gathering data from skewed or undersized samples that may not give them an accurate picture of what is really going on.
What’s the message?
“There should be no question of the incredible benefits big data has to offer,” said Big Data Insight Group, a big data forum. “It can transform any organization, regardless of its size or sector, and deliver insights which can dramatically improve the process, products and services that are crucial to the success of any business….But just because these benefits are out there…there will always be the need for experience and gut instinct.”
This is also why organizations looking to incorporate big data into their analytics and predictive forecasting are best served by:
- Verifying that the big data samples they collect for analysis are large enough to represent the situations they are trying to assess;
- Treating their analytics systems as being predictive and not conclusive; and
- Maintaining human judgment, experience, and intuition as active evaluative factors, since these are things the systems themselves don't necessarily deliver.
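The first recommendation, verifying that a sample is large enough, has a standard back-of-the-envelope check: the margin-of-error formula for estimating a proportion. A minimal sketch (the formula is textbook statistics; the function name is mine):

```python
import math

def required_sample_size(margin_of_error: float,
                         confidence_z: float = 1.96,
                         p: float = 0.5) -> int:
    """Minimum sample size to estimate a proportion to within the given
    margin of error. z=1.96 corresponds to 95% confidence; p=0.5 is the
    conservative worst case when the true proportion is unknown."""
    n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

# To estimate a proportion within +/-5 points at 95% confidence:
print(required_sample_size(0.05))  # -> 385
```

If the data an organization has collected falls well below this kind of floor, or was drawn from a self-selected population, its predictive output deserves the skepticism the pundits above are urging.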