One of the great and often ignored challenges of Big Data is whether or not historical data is actually relevant to answering a given question. Anyone who has ever glanced at the fine print of their stockbroker’s web page or an investment prospectus has likely noticed the old quip that “Past performance does not guarantee future results.” Despite a century of increasingly complex analytics and reporting, this disclaimer has largely held true; otherwise we’d all happily pick yesterday’s market winner, then begin researching which island to buy as the returns poured in.

While this is common sense for the average investor, in IT we occasionally forget the perils of past performance. With Big Data especially, pundits and vendors imply that if we throw bigger and better data at a faster platform, we’ll eventually be able to predict the future with near certainty.

A crisis of philosophy?

It’s rare that we get to talk about a topic like philosophy when dealing with data and technology management, but discussions around Big Data should pause to consider IT’s, and the larger organization’s, philosophies around data. In IT especially, we are used to solving technical problems that usually have a defined answer. With the right resources, we’re able to overcome most technical challenges, or eventually discover that the cost of those resources is simply not economical. With Big Data, we tend to apply this veneer of certainty to what amounts to predicting the future.

Superficially, Big Data looks like any other technical problem. There are IT resources to be marshaled, plans to be created and executed, development work to be done, and people to be managed. It’s easy to assume there’s a “right” answer to Big Data and to put an unflinching faith in the ability of the system to do what amounts to predicting the future.

Big Data can also put your focus in the wrong place. Most of the events that shape the future are external: a competitor may enter a market or launch a new product, an economic or social calamity may reshape markets, or anything from a war to a demographic shift to an alien sighting could change the future overnight. Tools like Big Data focus on past performance. Even with Big Data’s promise of near real-time analysis, you’re still working with “old” information, even if it is only microseconds old.

Keeping focused

Part of the fervor over Big Data is similar to the latest “surefire” investment scheme: we all want the ability to peer into the future, and any individual or organization that can predict the future will obviously have a dramatic leg up on the competition. However, it’s incumbent on IT leaders to temper some of the enthusiasm around Big Data and ensure that it’s presented as a useful and effective tool for understanding past performance, but one that cannot predict future results. With any new reporting tool, it’s tempting to put faith in the technology and focus inward rather than keep a metaphorical finger in the wind, attempting to spot the next shift. As one of the more rational and technologically inclined groups in most companies, IT leadership is well positioned to provide this organizational gut check.

As your history teacher likely admonished, learning from the past is critical to understanding the future, but wars end, economies change, and demographics shift. Just as we need prognosticators as well as historians, temper your Big Data initiatives by keeping one eye firmly focused on the future.