Are you tired of drawn-out discussions putting the development of your next great analytic solution in jeopardy? Don't feel alone. I often speak and write about the perils of the path to perfection. It's an extremely important concept for analytic leaders to understand because anyone who works with analytics inherits this risk. The degree to which analytic minds scrutinize everything is both a blessing and a curse. It's a blessing when this level of scrutiny is warranted—as in the design of a life support system—because they're the only ones capable of thinking at this level. However, it's a curse when heavy scrutiny is not warranted—as in chasing ever-more-accurate predictions.
The truth is, analytic-minded people have a hard time knowing what success looks like because the line that separates success from perfection is imperceptible to them. As a leader, you should have a much easier time seeing this line. However, you might not feel you have the technical breadth to relay this to your team of data scientists. So you trust that they'll make the right decision to remain on the road to success and avoid the path to perfection—but they can't. This is why your product development is taking much longer than it should. Without good operational definitions in place, you're taking huge risks in the development of your next analytic solution.
Operational definition defined
An operational definition is a crystal-clear, unambiguous, detailed description of a characteristic or attribute, like "late," "clean," or "good." In our case, we'll use operational definitions to define success. You need a clear operational definition of what success means so that your data science team knows when to stop thinking about this and move on to that. Without it, your data scientists are prone to stay on the path to perfection until your product development budget is bled dry.
I'm working with a global oil and gas company to help them improve their risk-based inspection capabilities and standardize their practices across all refineries. At the 50,000-foot level, the idea of standardization makes sense; however, we're struggling at the 10,000-foot level because everyone's definition of standard is different. In an attempt to resolve these differences, the well-intentioned, engineering-minded stakeholders have held the effort hostage for months. With operational definitions in place, this would never happen. I'm working frantically with leadership now to get these definitions in place so we can dislodge this roadblock from our important initiative.
Using operational definitions to protect your schedule
Operational definitions can save your product development schedule if you're intentional and proactive in establishing them. For instance, going back to the accuracy of a prediction engine, your data scientists could be working for months on a prediction engine that has already satisfied your criteria for success. They would never intuit that, however, because to them, the more accurate the better. And they'll quickly justify the time spent with rationale like, "It's just not good enough." So define what's good enough in advance, so there's no question.
To do this, you must take a decision-based approach to your product development. Drive your product development based on the sequence of decisions that need to be made. You can even build a decision tree that branches based on events and probabilities. Either way, you should have a clear map of what decision needs to be made and when. Then, create an operational definition of what success looks like for each decision.
Each operational definition has three things: criteria, test, and evaluation. The criteria clearly define what success looks like, the test is the operation performed to determine whether the criteria are met, and the evaluation tells you whether it's okay to move on.
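To make the three parts concrete, here is a minimal sketch in Python. The class name, fields, and the sample numbers are all hypothetical; the point is only that criteria, test, and evaluation each get an explicit, unambiguous home:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class OperationalDefinition:
    """Hypothetical sketch of one operational definition for one decision."""
    criteria: str              # what success looks like, in plain language
    test: Callable[[], float]  # the operation performed to measure the attribute
    threshold: float           # the numeric bar the measurement must clear

    def evaluate(self) -> bool:
        """Evaluation: is it okay to move on?"""
        return self.test() >= self.threshold

# Illustrative only: success is 75% prediction accuracy on a holdout set.
definition = OperationalDefinition(
    criteria="Prediction accuracy of at least 75% on the holdout set",
    test=lambda: 0.78,  # stand-in for a real accuracy measurement
    threshold=0.75,
)
print(definition.evaluate())  # True: criteria met, time to move on
```

Writing the definition down this way, rather than leaving it in people's heads, is what keeps the team from re-litigating "good enough" every sprint.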
Operational definition by example
Let's go back to the level of accuracy needed in our prediction engine. The first question is whether or not it's an effective use of time to even have this discussion. If you're at the very beginning stages of product development and trying to define the opportunity with your target market, you shouldn't be discussing prediction accuracy at all. Data scientists have a habit of latching onto an interesting topic and running with it. Having a clear roadmap of which decisions must be made helps curb these discussions.
Assuming, though, that this is the appropriate time to develop an operational definition of prediction accuracy, it might go like this. To develop your criteria, survey your competition to understand how accurate they are with a similar product. After some benchmarking, you might determine that their prediction accuracy is around 62% with a 95% confidence interval of six points (between 59% and 65%). With that, you decide to strategically position your product at a 75% prediction accuracy with a 95% confidence interval of four points. This is a breakthrough strategy. To complete your operational definition, task your data scientists with devising an automated test that evaluates when your prediction engine hits this point. Once it does, it's time to move on.
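Such an automated gate can be sketched in a few lines. This is an illustration under stated assumptions, not a prescription: the function name is hypothetical, the interval uses a simple normal approximation, and a "four-point interval" is read as a total width of four points (plus or minus two), matching how the six-point interval above spans 59% to 65%:

```python
import math

def accuracy_gate(correct: int, total: int,
                  target: float = 0.75,
                  max_half_width: float = 0.02,  # 4-point interval = +/- 2 points
                  z: float = 1.96) -> bool:
    """Pass when measured accuracy meets the target AND the 95% confidence
    interval (normal approximation) is no wider than the allowed band."""
    p = correct / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return p >= target and half_width <= max_half_width

# e.g., 1,950 correct out of 2,500 holdout predictions -> 78% accuracy
print(accuracy_gate(1950, 2500))  # True: accuracy and interval both satisfied
```

Note that the interval requirement quietly dictates a minimum holdout size; with too few test cases, even a highly accurate engine can't pass the gate, which is exactly the kind of conversation an operational definition forces up front.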
A classic trap that analytic leaders fall into is death by perfection. It manifests as long, enervating discussions about subjects that don't matter. You can avoid this trap by using operational definitions to help your data scientists see the line between success and perfection. Each operational definition starts with a clear picture (as determined by those viewing the picture) of what success looks like. Start building criteria for important operational definitions now, before it's too late. If you don't know what topic your team is spinning around these days, you may already be in trouble.
John Weathington is President and CEO of Excellent Management Systems, Inc., a management consultancy that helps executives turn chaotic information into profitable wisdom.