Why does process improvement fail?

It's clear, both anecdotally and objectively, that process improvement efforts have failed far more often than they have succeeded. This post examines why.

One of the most famous cartoon quotes of all time is Walt Kelly's "We have met the enemy and he is us." It first appeared on a poster for Earth Day 1970 and was later developed by Kelly into a two-panel cartoon. The cartoon's sentiment, while delivered as an environmental message, can be applied to just about any human endeavor. A great deal of effort and resources have been invested in process improvement—whether TQM, CMMI, Six Sigma, ISO 9001, or others—throughout the years. Yet these efforts have failed far more often than they have succeeded, and the enemy, once again, is us.

First the statistics

The Juran Institute has estimated that the cumulative cost of mistakes, rework, and scrap adds up to 25-40% of a business's total expenses. The Standish Group's 2004 "Chaos Report," a biannual study based (to date) on surveys of more than 50,000 Information Technology (IT) projects, estimates that only 29% of all software projects succeed, with 53% of all projects failing to attain their specified cost, schedule, or performance goals. An additional 18% are cancelled prior to completion, or delivered and never used. That's an overall failure rate of 71%.

When applied to process improvement, Pareto's principle—more commonly known as the 80-20 Rule—states that 80% of your effort will only produce 20% of your benefit.

There are those who may object to the union of these three approximations of reality. In particular, for some, applying the findings of IT project performance to process improvement performance may prove particularly galling. There is nothing, however, to suggest that any process improvement project is more complex or difficult than any IT project. Process improvement efforts frequently have a substantive IT component. And more important, both exist within the same environmental and managerial milieu.

To my knowledge, there have been no attempts to quantify process improvement success rates. If nothing else, linking these three statistical assertions together is as good a starting point as any, and perhaps better.

The principle of suboptimization asserts that optimizing each subsystem independently will not, in general, produce an optimal overall system. Worse, the act of subsystem improvement frequently causes the exact opposite of the intended outcome.

In other words, the whole is regularly less than the sum of its parts.

At the heart of suboptimization are some serious operational blind spots:

  • Ignoring the cumulative entropy created by the interaction of the various subsystems with one another.
  • Confusing the maximization of the output of the various subsystems as being synonymous with maximizing the final output of the overall system.
  • Assuming that the final outputs will achieve the targeted goals and/or outcomes.
  • Failing to validate that the targeted goals are actually moving toward the overall organizational vision.

A 1967 NASA report once noted:

"To avoid suboptimization, it is necessary to develop the design criteria logically from the overall system requirements, always keeping the maximum-value goal in mind."

All the engineering mumbo-jumbo aside, the message is: all too often, dependencies between the subsystems are not well understood.

Let's return to the 80-20 Rule. Process improvement has largely been marketed, as well as procured, as an enterprise-wide solution. Mostly, this approach has been a reaction to the serious flaws of IT management in the past, where the organization inadvertently adopted a flawed or incomplete re-engineering strategy, and too little attention was given to the elicitation and validation of requirements. However, the enterprise-wide approach has some serious flaws: it assumes that everything that is planned can be realistically achieved.

Not all processes are appropriate targets for process improvement, because the cost of an improvement may exceed the value of the productivity gains it generates. Moreover, the 80-20 Rule is, in effect, an inefficiency constant. It tells us that an across-the-board enterprise process improvement approach will in all likelihood fail: if 80% of your effort yields only 20% of the benefit, then only the remaining 20% of your effort is left to chase the remaining 80% you seek. You'd find better odds in Las Vegas.

Using the 80-20 Rule as a starting point, we can assert that 20% of a company's processes produce 80% of its waste and rework. Jay Arthur of LifeStar has suggested a different, more focused approach. He points out that the 80-20 Rule can be applied to itself (i.e., 20% of 20%, and 80% of 80%), meaning that 4% of a company's processes produce 64% of its waste and rework. Applying the rule one level further yields the rounded result that 1% of a company's processes produce 50% of its waste and rework.
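The arithmetic behind Arthur's recursive application is simple enough to sketch. Each pass takes 20% of the remaining processes and attributes 80% of the remaining waste to them, so after n applications, 0.2^n of the processes account for 0.8^n of the waste. A minimal illustration (the function name is my own, not Arthur's):

```python
def pareto_levels(n):
    """Share of processes and share of waste/rework after n
    recursive applications of the 80-20 Rule."""
    return 0.2 ** n, 0.8 ** n

for n in range(1, 4):
    procs, waste = pareto_levels(n)
    print(f"{procs:.1%} of processes -> {waste:.1%} of waste/rework")
```

The third level produces 0.8% and 51.2%, which the text rounds to "1% of processes produce 50% of waste and rework."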

In fact, process improvement efforts often trigger increased internal competition for scarcer operating resources, which in turn may act as a catalyst for unexpected personal, business unit, and cultural conflict. It may also produce unexpected bottlenecks in organizational decision-making. These unintended consequences of process improvement implementation must be dealt with by an organization that is already stretched thin trying to change while still carrying out its day-to-day responsibilities.

Yet business executives and senior managers continue to pursue the "heroic" approach to process improvement. Why? There is, I'm afraid, no good answer.
