Hurricane Irene is dominating the news, as people living along the East Coast of the United States keep a close eye on the weather forecast and prepare for severe, possibly devastating, storms. I recently learned that the same technology used to help predict hurricanes could be useful to IT pros.
Weather prediction is based on highly complex mathematical models that combine historical data with very recent events. Moreover, much of the data we actually see is just the "most likely" scenario; hurricanes are the exception, where we get much more descriptive storm-track charts with explicit probabilities of landfall at different locations.
Weather prediction technology relies on statistical models of multivariate regression and correlation. For example, meteorologists know that small barometric changes following certain patterns, in correlation with certain other factors, will lead to a hurricane, a smaller tropical storm, or just a severe thunderstorm. Weather computers can predict air temperature several days in advance based on data gathered on that calendar day in previous years as well as the weather experienced in the few days before. By combining these data points, a fairly accurate estimate of temperature, precipitation, and other factors is attainable.

And it gets better all the time. Twenty years ago, people watched the evening news or opened the morning paper to find out what the weather might be like in the next 24 hours or so (three days, tops); now we can check the Internet, or even a mobile app, and get forecasts seven, 10, even 14 days out with reasonable accuracy. Perhaps if we were given those colorful hurricane probability charts for air temperature, chance of rain, and other common weather patterns, we'd be more accepting of the challenge weather forecasters face.
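To make the idea concrete, here is a minimal sketch of that kind of multivariate regression, using invented data: today's temperature is predicted from the historical average for this calendar day plus yesterday's observed reading, fitted with ordinary least squares. The numbers and the 0.6/0.4 "true" relationship are assumptions for illustration only, not real meteorological coefficients.

```python
import numpy as np

# Hypothetical illustration: predict today's temperature (F) from two
# predictors -- the historical average for this calendar day, and
# yesterday's observed temperature -- via ordinary least squares.
rng = np.random.default_rng(0)

historical_avg = rng.uniform(60, 90, size=200)            # same-day averages from prior years
yesterday = historical_avg + rng.normal(0, 3, size=200)   # recent observations
# Synthetic "true" relationship with noise (an assumption, not real data):
actual = 0.6 * historical_avg + 0.4 * yesterday + rng.normal(0, 1, size=200)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(historical_avg), historical_avg, yesterday])
coeffs, *_ = np.linalg.lstsq(X, actual, rcond=None)

# Forecast for a day whose historical average is 75 F and yesterday hit 78 F
forecast = coeffs @ [1.0, 75.0, 78.0]
print(round(forecast, 1))
```

Real forecast models blend far more variables, but the principle is the same: past patterns plus recent observations yield a best-fit estimate of what comes next.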
So, how can this technology be used in IT? There are software systems on the market that analyze server and network hardware activity, load, and performance. Many of these systems use regression and correlation algorithms (with varying degrees of success) to help IT departments predict what could happen based on key triggers, which are in turn based on historical or immediate data; better systems use both. Netuitive's Behavior Learning Engine uses both kinds of information to learn how your network, servers, and systems perform "normally" and then, according to a whitepaper download available at this page (which requires signing up to read), predict anomalous behavior before a system crashes, is compromised, suffers a denial of service, or experiences any number of other potentially catastrophic incidents.
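Netuitive's actual engine is proprietary, but the general "learn normal, then flag deviations" idea can be sketched very simply. The example below, with made-up CPU-load readings, learns a baseline (mean and spread) from history and flags any reading more than three standard deviations away; the function names and threshold are assumptions for illustration, not the product's API.

```python
import statistics

def learn_baseline(samples):
    """Learn a 'normal' profile (mean and spread) from historical metric samples."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag a reading that deviates more than `threshold` standard deviations."""
    mean, stdev = baseline
    return abs(value - mean) > threshold * stdev

# Hypothetical CPU-load readings gathered during normal operation
history = [0.42, 0.38, 0.45, 0.40, 0.44, 0.39, 0.41, 0.43, 0.37, 0.46]
baseline = learn_baseline(history)

print(is_anomalous(0.43, baseline))  # typical load -> False
print(is_anomalous(0.95, baseline))  # sudden spike worth alerting on -> True
```

Commercial engines model trends, seasonality, and correlations across many metrics at once, but even this crude z-score check shows how a system can raise an alarm before a human would notice anything wrong.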
This type of predictive technology has been tried and tested in many other fields as well, often with greater success than in weather prediction. Common examples are sales forecasting, drug trials, and insurance risk assessment. In these fields, statistical regression and correlation are used to determine how much product to order, how drugs will affect patients with different symptoms or health attributes, and how risky a given individual is to insure as a homeowner or driver, respectively.
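The sales-forecasting case is perhaps the easiest to picture: fit a trend line to past sales and extrapolate it to decide how much to order. A minimal sketch, using invented monthly figures (the numbers and the straight-line assumption are for illustration only):

```python
# Hypothetical sketch: fit a straight-line trend to monthly unit sales and
# extrapolate one month ahead as an ordering estimate.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a simple linear trend."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

months = [1, 2, 3, 4, 5, 6]
units_sold = [100, 110, 125, 130, 145, 150]   # invented sales history

slope, intercept = fit_line(months, units_sold)
next_month_forecast = slope * 7 + intercept
print(round(next_month_forecast))  # -> 163
```

A real forecasting system would layer in seasonality, promotions, and confidence intervals, but the underlying regression is the same tool the weather models use.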
Have you used a predictive analysis system in your company, whether for IT systems or another application? If so, what was your experience? If you haven't used one before, would you consider such a system? Share your thoughts below.