There are a lot of bad words in IT consulting and management. Process is one of them. Process improvement sounds even worse; just mentioning it around experienced IT people can cause them to blanch and run for the hills. I learned how these perfectly benign words earned their negative reputation after one particularly earnest but unwise effort on my part.
I had just come off a handful of reasonably successful operations upgrade projects. My next assignment looked like a relatively simple network upgrade, with a corresponding upgrade of some application servers. The development team wanted to upgrade their rather heavily modified SAP systems at the same time. Although I'm always leery of changing too much, too fast, it didn't seem like the addition would present a problem.
Three weeks into the project, the development manager pulled me aside. He needed to implement some QA elements in his development cycle and wanted my advice. I suggested a simple multistage approval process, in which the developer submitted the finished code to a QA person, who verified that it worked and passed it up the chain of command for final approval. I diagrammed a few examples on a whiteboard. He thanked me and left.
Four months later, one of my team leaders sat down at my desk. He asked me point-blank if I knew what was going on with the development process. Feeling fairly confident, I said yes. Why then, he questioned, was every single piece of code taking the servers down or wiping out some vital function?
On closer questioning, it turned out that my team leader's group had been covering up the delivery of shoddy code for two months before coming to me. They didn't want to raise a fuss, given the massive amount of work that I had on my plate. I pointed out, somewhat huffily, that problems like this one had a higher priority than synchronizing various schedules. I promised to get to the bottom of whatever was going on.
The next day, I invited myself to one of their QA approval meetings. At the meeting, the programmers presented the QA tech with their approval slips. The QA tech signed them without reviewing or testing the code, then passed them down the table to the waiting managers. When the sixth signature graced the page, the blessed piece of code was officially approved for transport.
What went wrong?
Other than providing a fascinating example of ritual behavior in modern business, what did the meeting accomplish? How could what looked like simple processes go so horribly wrong? Determined to get to the bottom of the problem, I pulled aside the development manager after the meeting.
Although he looked uncomfortable, he looked me straight in the eye when he said, "Look, no one cares about this. It's just important that we get through it so that the auditors leave us alone."
I wanted to strangle him. We had talked about all of the advantages a working QA program brought to the table.
On deeper reflection, I realized that I had made a fundamental error in my initial communications with my counterpart. When I suggested the multitiered model I'd just witnessed in action, I was thinking in terms of raising quality awareness to the highest levels of the organization. My client, on the other hand, just needed to overcome a potential hurdle on his way to getting code out the door. Although he installed the process I suggested, it didn't meet his needs. In fact, it caused a huge amount of resentment among all of the parties forced to participate.
This brought the variability of process utilization to my attention. Although we in the business prefer to think that there are right and wrong ways to use process, reality tends to point out that process is just one more communications tool. Businesses use it to appear to be doing the right things, while continuing to do what they feel will produce the best results. If the process doesn't provide people with something they need, it becomes an empty ritual to be performed if and when time allows.
After bending my personal feelings about process to fit the above statements, I faced the next and even more difficult question: How could I advise my clients to produce workable processes? Not processes that looked good on paper or aligned with some greater methodology, but processes that they would follow and derive value from?
My answer came from the comparison between what I thought my client wanted and what he was truly looking for. In essence, we were measuring different things. I thought he wanted a way to demonstrate the quality of his team's code, a measurement that could later lead to ways of showing improvement. He was looking for a single, simple binary measurement of compliance to avoid later political complications. The eventual implementation matched his needs in function while following my suggested form.
This suggested that I could help to create usable procedures and processes by asking the following questions:
- What does the procedure executor need to measure for further action?
- What does the process monitor need to measure for further action?
- What does the manager need to measure for further action?
By answering these questions, I discovered that I could more accurately create procedures and processes my clients would follow. The results of their activities always produced some kind of information they could take action on, spurring further growth. Furthermore, this method gave me access to a powerful change tool that had previously eluded me: By helping people to come up with more and more progressive measurements, I enabled them to control change in their environment.
A few years later, I was working with a company that wanted to outsource the vast majority of its help desk and operations activities. It had already failed to successfully execute the project with one of our competitors, but still believed in the fundamental power of the business model.
After discussing the engagement with both the departing project team and our client, I realized that for all of their process work, they were still considering procedures and processes from a task management standpoint. Every procedure they created tried to address multiple unspoken communication and measurement needs. This led to a deep confusion about what was really important, which eventually caused the project's collapse.
Rather than repeat our competitor's mistake, I suggested that perhaps we should work together to establish clear measurements and methods to gather them. In two months, I moved on from the project, confident that my iterative design would continue. In six months, the client had successfully transferred their help desk and operations to my company. In a year, we were exceeding their expectations to such a degree that they invited us to participate in a number of lucrative development and deployment projects.