It's no secret that smart companies use Continuous Improvement (CI) to get better; the really smart ones, however, integrate big data into their CI cycles to be the best they can be.
CI is the philosophy of never-ending improvement of a product, service, or process. It has been several decades since Dr. W. Edwards Deming popularized what we know today as the Plan-Do-Study-Act (PDSA) cycle, the best-known methodology for implementing CI. Numerous companies have institutionalized this methodology in various areas of their business, not least their yearly planning cycle, and yet few companies fully leverage the potential of big data in their annual improvement cycles. CI and data are a union worth exploring, so let's dive a bit deeper.
SEE: A better name for DevOps: 'continuous improvement' (ZDNet)
Planning with big data
Planning consists mainly of two activities: setting goals and developing a plan of action. In traditional planning, goals are usually set based on historical performance alone; planning with big data can do much better.
Practitioners of big data analytics can estimate the company's true potential, or "theoretical best." Without the large volume of available data and the sophisticated analytic techniques savvy big data practitioners use, traditional planning is typically reduced to educated guesses. Activities and schedules are also much more accurate when big data is applied.
Harvesting and analyzing operational data provides great benefits, including far more accurate duration estimates, better identification and mitigation of risks, and a much more reasonable allocation of resources. This in turn sets the stage for a successful execution.
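As a rough illustration of how harvested operational data can replace educated guesses, here is a minimal Python sketch; the task and its historical durations are invented for the example:

```python
# Minimal sketch: deriving a plan estimate from historical operational
# data rather than guesswork. The task and durations are hypothetical.
from statistics import mean, quantiles

# Historical cycle times (in days) for a recurring task
historical_durations = [12, 14, 11, 19, 13, 15, 22, 12, 14, 16]

avg = mean(historical_durations)
# The 90th percentile gives a risk-buffered estimate for the plan
p90 = quantiles(historical_durations, n=10)[-1]

print(f"Typical duration (mean): {avg:.1f} days")
print(f"Risk-buffered estimate (p90): {p90:.1f} days")
```

Planning to a high percentile rather than the average builds risk mitigation directly into the schedule, which is exactly the kind of accuracy gain described above.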
Doing with big data
But even with the best plans, execution sometimes gets out of control, which is why big data analytics is critical during the "do" phase of a company's annual improvement cycle. This is where big data's ability to handle the velocity of data produced becomes extremely valuable. Most companies produce a massive amount of operational data at a blinding pace, regardless of their industry or the nature of their products and services. When I worked with PayPal, we were looking at clickstream data that was constantly being generated at a furious pace. Without the right technology and tools, it would be impossible to keep up. But keeping a close eye on all this data in real time is necessary for making sure your plan stays on track. Big data analytics provides both timely alerts for real-time corrections and a wealth of information to study for the next phase.
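The "timely alerts" idea can be sketched in a few lines. This toy monitor flags a streaming metric that drifts outside a simple control limit; the metric, window, and thresholds are all invented for illustration, and real clickstream tooling operates at vastly larger scale:

```python
# Sketch of the kind of real-time check big data tooling automates:
# alert when a streaming metric drifts outside a control limit.
from collections import deque
from statistics import mean, stdev

def monitor(stream, window=5, sigma=2.0):
    """Yield (value, alert) pairs; alert is True when a value falls
    outside +/- sigma standard deviations of the trailing window."""
    recent = deque(maxlen=window)
    for value in stream:
        alert = False
        if len(recent) == window:
            mu, sd = mean(recent), stdev(recent)
            alert = bool(sd) and abs(value - mu) > sigma * sd
        recent.append(value)
        yield value, alert

# Simulated clicks-per-second readings with one anomalous spike
readings = [100, 102, 98, 101, 99, 100, 180, 101]
alerts = [v for v, a in monitor(readings) if a]
print(alerts)  # [180]
```

The point is the shape of the solution: a rolling baseline plus an automated trigger, so corrections happen while the plan can still be saved.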
Studying with big data
At some point, typically around the third fiscal quarter, it's time to analyze this valuable store of operational data. The goal in the "study" phase of CI is to identify problem areas and possible areas of improvement. Big data practitioners have an advantage over their more traditional counterparts when it comes to this phase.
Most companies are limited in their ability to uncover opportunities and must settle for only the most obvious ones. Conversely, data scientists who conduct exploratory data analysis (EDA) often find gems in their operational data that nobody would suspect; these hidden gems might be the difference between an incremental change and a breakthrough for the company.
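A toy example of how EDA surfaces a non-obvious pattern: even a simple group-by over operational records can expose a segment that behaves very differently from the rest. The dataset and field names below are invented:

```python
# Toy exploratory data analysis (EDA) sketch: a simple group-by over
# operational records can surface a pattern nobody suspected.
from collections import defaultdict

records = [
    {"region": "east", "channel": "web",   "returned": False},
    {"region": "east", "channel": "phone", "returned": True},
    {"region": "west", "channel": "web",   "returned": False},
    {"region": "east", "channel": "phone", "returned": True},
    {"region": "west", "channel": "phone", "returned": False},
    {"region": "west", "channel": "web",   "returned": True},
]

totals, returns = defaultdict(int), defaultdict(int)
for r in records:
    key = (r["region"], r["channel"])
    totals[key] += 1
    returns[key] += r["returned"]

# Return rate by (region, channel): the cut that exposes the "gem"
rates = {key: returns[key] / totals[key] for key in totals}
for key in sorted(rates):
    print(key, f"{rates[key]:.0%}")
```

In this fabricated data, every phone order from the east region comes back, a pattern invisible in the region-level or channel-level totals alone.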
Acting with big data
For those who employ big data analytics, the "act" phase means taking any hidden treasure discovered through EDA and analyzing and integrating it further. This is far superior to what is traditionally done in the "act" phase. In fact, many companies misunderstand what this phase is intended to accomplish and simply adjust their plans based on their studies from the previous phase.
This phase is not about replanning and reimplementing—it's about incorporating what was learned to feed the next planning cycle. For big data companies, this means performing quantitative analysis on the qualitative findings from the EDA. That way when the annual cycle completes, and the company enters the planning phase again, it starts on a solid foundation that's supported by data science.
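One way to "quantify a qualitative finding," sketched with the standard library only, is a permutation test: it checks whether an EDA observation (here, that a new process looks faster than the old one) holds up statistically. All the data are invented:

```python
# Hypothetical follow-up to an EDA finding: a permutation test to see
# whether an observed difference between two groups is real or noise.
import random

random.seed(42)  # reproducible illustration

group_a = [14, 15, 13, 16, 15, 14, 17]   # e.g., old process cycle times (days)
group_b = [11, 12, 10, 13, 11, 12, 10]   # e.g., new process cycle times (days)

observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

# Shuffle the labels many times; count how often chance alone
# produces a difference at least as large as the one observed.
pooled = group_a + group_b
trials, count = 10_000, 0
for _ in range(trials):
    random.shuffle(pooled)
    a, b = pooled[:len(group_a)], pooled[len(group_a):]
    if sum(a) / len(a) - sum(b) / len(b) >= observed:
        count += 1

p_value = count / trials
print(f"observed difference: {observed:.2f} days, p ~ {p_value:.4f}")
```

A small p-value here would justify carrying the finding into the next planning cycle as a data-backed input rather than an anecdote.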
SEE: Job description: Data scientist (Tech Pro Research)
I previously defined big data in competitive terms: big data is the massive amount of rapidly moving and freely available data that potentially serves a valuable and unique need in the marketplace, but is extremely expensive and difficult to mine by traditional means. Savvy companies with the analytic prowess to process and analyze this data have the potential to make exponential improvements instead of incremental ones.
Take the time today to bring together your CI program and your data science program. If you're serious about getting better, that's the only way to go.
- Why CEOs must lead big data initiatives (TechRepublic)
- Under pressure: 4 main stressors for big data leaders (TechRepublic)
- 5 ethics principles big data analysts must follow (TechRepublic)
- 6 big data trends to watch in 2017 (TechRepublic)
John Weathington is President and CEO of Excellent Management Systems, Inc., a management consultancy that helps executives turn chaotic information into profitable wisdom.