In 1965, Gordon Moore predicted that the number of transistors on a chip would double every year. That was updated by Moore to two years and updated again by Intel to 18 months. According to Moore, this was a “lucky guess” based on just a few points of data.
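Those doubling periods sound similar but compound very differently. As a rough illustration (my own arithmetic, not from the article), here is a short sketch of how much capacity grows over a decade under each version of the "law":

```python
def growth_factor(years, doubling_period_years):
    """Factor by which capacity grows over `years`,
    doubling once every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Growth over ten years under each stated doubling period
for period in (1.0, 1.5, 2.0):
    factor = growth_factor(10, period)
    print(f"doubling every {period} years -> {factor:,.0f}x in 10 years")
```

Doubling every year yields a 1,024-fold increase in ten years; every 18 months, roughly 100-fold; every two years, 32-fold. The gap between those curves is exactly why the planning horizon matters.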
Given how ubiquitous Moore’s Law is, one would think it would be routinely applied to IT planning. In practice, that is rarely the case. It is quite rare to see IT planning take into account not only the need a technology upgrade meets today, but also the needs it will meet tomorrow, through holistic consideration of that upgrade.
“Almost never do people look at processor power or storage capabilities and cost trade-offs and decide, ‘What does this mean to us in three to five years?'” says Thomas Moran, systems analyst principal at consulting firm Alion Science and Technology Corp. in Annapolis Junction, Md. “How does that impact our technology refresh cycle? How does it impact training and staffing?”
Moran advises state and federal agencies on risk management and disaster recovery. He believes that by factoring the predictability of Moore’s Law into their planning, agencies could better anticipate when it is time to move to newer technologies that are less expensive and perform better. As an example, he recalls a government office that had decided to maintain legacy mainframe operations in multiple data centers. As a result, he says, operational and maintenance costs have mushroomed. “They’re hostage to something that has defied Moore’s Law,” Moran says.
In such situations, Moran says, decision-makers forget to look at the broader picture. “It’s not just that you’ve got more CPU cycles or storage — it’s that [Moore’s Law] has enabled disciplines in other areas that impact you directly.” The question his clients are constantly asking is, “What should I invest in?” Pointing to the mainframe decision, he concludes that “even safe bets often end up being problematic.”
But is it reasonable to assume that the average IT leader is thinking exponentially, as Moore’s Law would suggest? Probably not. According to futurist, inventor and author Ray Kurzweil, humans are linear thinkers.
“Our intelligence is hard-wired to be linear because that served our needs as our brains evolved, when we were walking through the savanna 10,000 years ago,” he explains. “We saw an animal coming at us out of the corner of our eye, and we’d make a prediction of where that animal would be in 20 seconds. That served our needs quite well. We have a linear predictor built into our brain — that’s our intuition.”
Even scientists, Kurzweil says, rely on predictive intuition, which follows a linear path. “They have an idea of what’s been accomplished in the past year,” he says. “And then they think about a task: ‘Well, that’s about 10 times as hard. It’ll take about 10 years.’ That’s the intuition.” As a result, predictions tend to be overly pessimistic or conservative, according to Kurzweil.
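Kurzweil’s point can be made concrete with a toy comparison (my own illustration, not from the article): a linear extrapolation of last year’s absolute gain falls far behind exponential doubling over a decade.

```python
def linear_forecast(current, yearly_gain, years):
    """Linear intuition: add the same absolute gain every year."""
    return current + yearly_gain * years

def exponential_forecast(current, doubling_period_years, years):
    """Moore's-Law-style growth: double every fixed period."""
    return current * 2 ** (years / doubling_period_years)

# Start at 100 units of capacity; last year it gained 100 (i.e., it doubled once).
start = 100
print(linear_forecast(start, 100, 10))     # linear guess: 1,100
print(exponential_forecast(start, 1, 10))  # exponential reality: 102,400
```

The linear forecaster, projecting last year’s gain forward, underestimates the ten-year outcome by roughly a factor of 100 — which is Kurzweil’s explanation for why expert predictions skew conservative.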
According to SanDiego.com CEO Mark Burgess, Moore’s Law is not a good basis for IT planning. “Applying Moore’s Law as a planning tool in IT is a little like comparing aging to gathering wisdom,” he says. “Because technology changes, [it] doesn’t mean the rest of the systems and people around them can, will, should or want to change.” According to Burgess, the fastest way to slow down an office is to upgrade it.
Burgess suggests that businesses pattern their decision making after the ascending-spiral model of history: covering the same ground while implementing small changes that move the business forward. When new technologies arrive, acquire them as soon as anyone reports success with them, so that you can begin figuring out where they fit.
There are as many “best practice” approaches to technology planning as there are companies. What kind of thinking drives your technology decisions? Do you plan on a three-year cycle? Or do you replace as needed and let larger implementations drive the technology decision process?