
Why are we ignoring Moore's Law?

In 1965, Gordon Moore predicted that the number of transistors on a chip would double every year. He later amended that to every two years, and Intel later shortened it to 18 months. But if hardware improvement is this predictable, why isn't Moore's Law driving how we purchase hardware?

Moore himself has called the prediction a "lucky guess" based on just a few points of data.

Because Moore's Law is so well known, one would expect it to be applied routinely to IT planning. That is not the case. In fact, it is quite rare to see IT planning take into account not only the need a technology upgrade meets today, but also the needs it will meet tomorrow when the upgrade is considered holistically.
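To make the arithmetic behind that kind of forward-looking planning concrete, here is a minimal Python sketch that projects how far commodity hardware would advance over a given refresh window if the doubling rule holds. The two-year doubling period and the three- and five-year horizons are illustrative assumptions, not figures from the article.

```python
# Minimal sketch: projecting hardware gains over a refresh window,
# assuming a Moore's-Law-style doubling rule. The doubling period and
# the refresh horizons below are illustrative assumptions.

def projected_relative_performance(years, doubling_period_years=2.0):
    """Relative capability of new hardware after `years`, if
    transistor-driven gains double every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

for horizon in (3, 5):
    gain = projected_relative_performance(horizon)
    print(f"After {horizon} years, new hardware is roughly {gain:.1f}x the "
          f"baseline -- the gap a refresh cycle has to weigh against its cost.")
```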

From ComputerWorld:

"Almost never do people look at processor power or storage capabilities and cost trade-offs and decide, 'What does this mean to us in three to five years?'" says Thomas Moran, systems analyst principal at consulting firm Alion Science and Technology Corp. in Annapolis Junction, Md. "How does that impact our technology refresh cycle? How does it impact training and staffing?"

Moran advises state and federal agencies on risk management and disaster recovery. He believes that by factoring the predictability of Moore's Law into their planning, they could better anticipate when it is time to move to newer technologies that are less expensive and perform better. As an example, he recalls a government office that had decided to maintain legacy mainframe operations in multiple data centers. As a result, he says, operational and maintenance costs have mushroomed. "They're hostage to something that has defied Moore's Law," Moran says.

In such situations, Moran says, decision-makers forget to look at the broader picture. "It's not just that you've got more CPU cycles or storage -- it's that [Moore's Law] has enabled disciplines in other areas that impact you directly." The question his clients are constantly asking is, "What should I invest in?" Pointing to the mainframe decision, he concludes that "even safe bets often end up being problematic."

But is it reasonable to assume that the average IT leader is thinking exponentially, as Moore's Law would suggest? Probably not. According to futurist, inventor and author Ray Kurzweil, humans are linear thinkers.

"Our intelligence is hard-wired to be linear because that served our needs as our brains evolved, when we were walking through the savanna 10,000 years ago," he explains. ""We saw an animal coming at us out of corner of our eye, and we'd make a prediction of where that animal would be in 20 seconds. That served our needs quite well. We have a linear predictor built into our brain -- that's our intuition."

Even scientists, Kurzweil says, rely on predictive intuition, which follows a linear path. "They have an idea of what's been accomplished in the past year," he says. "And then they think about a task: 'Well, that's about 10 times as hard. It'll take about 10 years.' That's the intuition." As a result, predictions tend to be overly pessimistic or conservative, according to Kurzweil.
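To see why linear intuition falls so far short, here is a toy Python comparison of a linear forecast against exponential, Moore's-Law-style growth. The 50%-per-year linear rate, the two-year doubling period, and the ten-year horizon are illustrative assumptions, not numbers from Kurzweil.

```python
# Toy comparison of linear intuition vs. exponential growth, in the
# spirit of Kurzweil's point above. All rates and horizons are
# illustrative assumptions.

def linear_forecast(progress_per_year, years):
    """What a 'linear predictor' expects: the same absolute gain each year."""
    return 1 + progress_per_year * years

def exponential_forecast(doubling_period_years, years):
    """What Moore's-Law-style doubling actually delivers."""
    return 2 ** (years / doubling_period_years)

years = 10
linear = linear_forecast(progress_per_year=0.5, years=years)
exponential = exponential_forecast(doubling_period_years=2, years=years)
print(f"Linear intuition after {years} years: {linear:.0f}x the baseline")
print(f"Exponential growth after {years} years: {exponential:.0f}x the baseline")
# The linear guess (6x) badly undershoots the exponential outcome (32x),
# which is why predictions tend to come out conservative.
```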

According to SanDiego.com CEO Mark Burgess, Moore's Law is not a good basis for IT planning. "Applying Moore's Law as a planning tool in IT is a little like comparing aging to gathering wisdom," he says. "Because technology changes, [it] doesn't mean the rest of the systems and people around them can, will, should or want to change." According to Burgess, the fastest way to slow down an office is to upgrade it.

Burgess suggests that businesses pattern their decision making after the ascending spiral model of history: covering the same ground while implementing small changes that move the business forward. When a new technology hits, get it as soon as anyone reports success with it, so that you can begin figuring out where it fits in.

There are as many "best practice" approaches to technology planning as there are companies. What kinds of thinking drive your technology? Are you planning on a three-year cycle? Or do you just replace as needed and let larger implementations drive the technology decision process?

Ed Cone on the "Dark side of Moore's Law" - Know It All

24 comments
MaeseRalf

It has always surprised me how easily human nature forgets its real dimension. I have little to say about Moore's Law, as it is not an actual scientific law. It is a "saying," like Peter's Principle, Occam's razor or Murphy's law: nothing more than a refrain depicting an intuitive impression of how things look in this strange world. But what shocked me most in this article is the simplistic view that the human mind is "linear." Does that mean something like "mX+n, ergo Y"? Do you think this is how our mind actually reasons? I have observed, through my own introspection and through tests and analysis, that we are normally "multitasking," in a kind of vectorial, interconnected process weighing all the data outside, all the memory switches and all the impressions our thoughts develop, to decide the correct solution for a given problem. That doesn't mean we are "totally conscious," but that we use all the data our brain detects in a given moment. "Intuition is like reasoning, only much faster," said Einstein. That fake statement in the article sounds to me like a kind of "post hoc ergo propter hoc" reasoning: because the movement of an animal has a linear velocity, our thinking must be linear. What a puerility!

Beejer

For the majority of computing purchases, i.e. personal computers, Moore's Law is irrelevant when it comes time to plan upgrades. Most of people's time in front of the screen is spent thinking, not keyboarding, mousing and processing information. Besides, how much productivity difference will a 1 GHz processor speed increase make when you're primarily word processing or answering e-mail? I expect most desktop upgrade decisions are made based on maintenance costs, including the length of vendors' contracts, not on technology capability.

CG IT

It costs too damn much to buy all new equipment every 18 months. IT would end up being a $$ pit, which in many executives' eyes it already is.

JohnMcGrew

For example, the newest Vista PCs are no faster than last year's XP machines, which were barely faster than last decade's 95/98 machines.

Tig2

In all the years I have been in IT, I have seen decisions made that left me shaking my head. Seemingly driven by cost alone, equipment is purchased with little regard for how long it must live functionally in the enterprise in order to reach the ROI estimates that validated the purchase in the first place. Or applications are clung to like grim death because the pain of updating or shifting to a different platform is too great. Then an event occurs that forces a shift, and again little consideration is given to how long the application will be in use or where the deployed system stands in the technology lifecycle. While I am not advocating that all technology purchases fit a Moore's Law model, I do believe it is possible to get more bang for our technology buck. What are your experiences with purchasing new technology in the workplace? Do you think your office is doing it right? Or do you think they are making technology decisions with a "Magic 8 Ball"? Edited because Safari messes up formatting.

MikeGall

That has been the case everywhere I've worked. Essentially the rule was: if it isn't in warranty any more, we get rid of it. The reasons are simple: ever had an old computer you needed to find a driver for? I have, and I spent three hours tracking down a graphics driver for it. My time goes for $40+ an hour, so keeping that 32MB graphics card just cost the price of a new one. Add to that reliability. It is easier to plan on replacing a workstation every 2-3 years than to have them break more frequently and unpredictably. Otherwise you need more IT staff to do the fixing and more time spent issuing replacements to users who can't work. With a workstation costing ~$5k (including software and service) and billable time being what it is, you don't need many lost hours to pay for the new computer.

normhaga

It comes from many things. As one man pointed out, there is a never-ending stream of feature requests. Is this the reason for bloat? IMHO, no. I remember when Lotus 1-2-3 switched from assembly language to C over the course of a year: the spreadsheet grew from one floppy to four with no real added features. Higher-level languages are quicker and easier to develop with, debug, and maintain, but even optimizing compilers add significant bloat. Less code to execute is why older OSes were faster than later generations. In the coming world, because CPU speeds are plateauing and Moore's Law is becoming irrelevant, the greatest gains in speed are to be had from low-level languages. However, the additional time required for development, debugging, and maintenance conflicts with the corporate appetite for ever more profit. Also, there are few assembly programmers today.

MikeGall

They look prettier doing it; isn't that the whole point? :) After all, people can only type and mouse-click so fast. Now, for servers, yes, you want as much performance as you can get, but that is because the decision making has been automated, or at least the pool of users has become much larger, so the number of "clicks" the system has to handle requires you to focus on performance rather than looks or even ease of use.

jack

The machines are doing a lot more behind the scenes than they did before. Code bloat is a direct result of the never-ending feature request cycle. I'm a multimedia developer, and I can't tell you how many times a client with a nickel-and-dime budget has asked me why they can't have a feature they saw on YouTube or National Geographic. We can add it, but they've got to pay for it, and the more we add, the more they want. It's good business, but it's exhausting sometimes. Anyway, that's how a simple checkbook-balancing program turns into Quicken or something like that. True, some code bloat is the result of inefficient programming -- I saw a lot of that in school -- which comes from laziness and being at the edge of your expertise. However, I think it is outweighed by the feature request cycle.

MaeseRalf

This seems to me like a dream from the Elm Street movies. The modern world is a nightmare of possibilities exploited by marketing to lure gullible people into wanting features they will never use. What do you need in a phone? To be able to make calls and speak, isn't it? But I am sure you are carrying a mobile in your pocket that can take photos, play MP3s, and keep a record of your grocery list. You even see commercials where a mobile is presented by its features as recorder-player-camera-notebook, without a word about how it works for phoning. That is how the joint venture of Technology and Marketing is transforming the world: from the "What is the need?" paradigm to the "What is possible?" paradigm. And now we grieve that we are in a crisis! What did we expect?

The 'G-Man.'

As in: the majority, say > 50%, of the ball does not need to cross the line?

svasani

Just an example: we were planning to switch to Vista (which, by the way, also fell within our 4-year cycle). But considering the long and expensive migration from XP to Vista and rumors of an early release date for Windows 7, we decided to just ride it out and wait for Windows 7. If we had gone with our own judgement, ignoring Microsoft's plans, we would have just invited more pain. Bottom line: even though we have a timeline in mind and do budget planning, it makes business sense to plan in conjunction with the major vendors and their release dates.

Four-Eyes

How many I.T. decision makers have actually heard of, much less understand, Moore's Law? I actually know a few, and that's just the tip of the iceberg. Here's another thing: a lot of those folks tend to think this way: "If it ain't broke, don't even think of messing with it."

LocoLobo

Management weighs the cost of upgrading systems, then decides whether that will cut into upgrading their fleet. Guess who loses. Isn't Moore's Law approaching its limits? See this article: http://news.zdnet.com/2100-9584_22-5112061.html The reality is that with each new technology, it takes us some time to really get full use out of it. It always seems to me that by the time I feel like I know what I'm doing, that's when we move to the next level.

magic8ball

I try my best to gauge how long a technology will be viable before introducing it into any network. But when the owner of one of your best clients just 'has to have' an iPhone, what can you do? You smile and integrate it.

w2ktechman

Where I work we have moved to a 'less than support' model as part of a cost-savings plan. Yes, LESS than support!! How, you ask? Easy. The helpdesk does nothing except tell you to figure it out on your own. If it is an ER item, they are more likely to hang up than offer any assistance. And if you do get someone who will actually try to help, more often than not they make the problem worse before telling you to fix it yourself. IT has become THE problem, not a solution finder. In fact, the employee turnover rate in my dept. has been higher in the last year than in the previous 4 years combined. While not solely due to inadequate support, it is pretty high on the list of reasons people leave. Recently, I almost got canned because I said that the support structure sucked. I would be willing to bet that 99% of the employees would agree with me; however, I said it with the wrong person around. Got a call from HR about it... LOL, before the meeting to determine my fate started, I also had my resignation ready to go. So, where I work, the 8 ball is the only solution for support.

A_TECKIE

Because they alone have access to the magic '8' ball, only they can foresee the future and the future needs of the organization (insert screaming and crashing here, etc.).

MaeseRalf

The old-fashioned way of writing programs is still good enough to be taught in modern schools. I imagine that even structured programming is not considered in current curricula. In electronics, the development of FPGAs, controllers, and other MPDs has brought a completely new way of designing circuits. But you don't find Assembler in electronics labs today. You have libraries, C++, LabVIEW... You can even find someone using Mathematica to develop their code. As with language translation, modern techniques are faster and easier, but they can't think like humans, so they repeat structures and statements, with the code growth that entails. Isn't it time to reconsider Prolog or Lisp? "Let the code write the code" is the statement that seems to reign today; so why not go back to logic? Modern hardware is powerful enough to yield giant gains in speed and efficiency, but money and the market don't care about that. What might a modern computer do with DOS?

MaeseRalf

Modern computing architectures are, in fact, parallel processing devices. It seems the IT industry hasn't realized this yet. The reality is that hardware and software live in different worlds. The development efforts of software producers are linearly oriented toward applications. Even more so for OS developers, like Microsoft, who think in terms of systems that take on all the tasks themselves, merely passing data through to the peripheral devices. Meanwhile, hardware makers are placing more and greater processing power in those peripherals. Now you have systolic or vectorial processors in your graphics cards, and some of them have more than one core in their circuitry. USB has brought intelligence even into mice. But the approach from software makers still rests on the old von Neumann concept. So taking advantage of that built-in intelligence is a very complex undertaking that most programmers simply cannot attempt. That's why Assembler is no longer the affordable tool it once was: you simply can't keep track of the different branches a project might take.

So new applications are built from legacy libraries, procedures and functions already defined, pieces of code written before for another use... Modern applications are almost all a kind of suite with a lot of features, not simply the function they are intended for. Hence, even in the OSF field, teams of developers make the final product. Instead of the old "one product, one programmer" paradigm, we are now enjoying the results of the "one product, plenty of features" paradigm. And therefore, given time and cost restrictions, more development effort is answered with more man-hours, which means more people on every project to shorten time to market.

We are in a change-of-paradigm era, yet we are still using the same parameters to analyze requirements. It might be time to reflect and consider whether the "chain production" model is applicable to the IT industry. To take advantage of current hardware capabilities, we must accept that parallel processing is finally affordable for low-profile use, and regain the old Wirth/Linus archetype: one project, multiple tasks, multiple threads, multiple modules, a single programming effort; and, above all, one architect leading the team closely. This is the same concept used in old, reality-based engineering, by which the finest fighters and rockets are still produced today. If it works for our weapon systems, why not for our computing systems? Cost is not the reason: in a war you need fighters more than dollars, and in the business war you need reliable tools more than accountants.

MaeseRalf

Hi Mike! I fear you didn't understand the underlying concept of my words. You are right, and that was my later point. It is my fault, as I wrote "can" instead of "can't" (a slip of the fingers); it is now edited. If we had a new version of CP/M capable of running on modern hardware, computers would be running at "warp speed." Well, in fact they are already faster than we can type or read, so I figure most of the time they should be "sleeping." It is beautiful to see pictures in a 3D-like slideshow, or to open and move windows with a clap of your hands. Nice! But should these be the only considerations when developing new products? I think the industry has lost its mind, and they are doing things just because they can, not because it is right. "Pro-Logic" is the hidden concept in my post. But logic comes from the old Greeks, and it seems to be treated as a matter for the 19th century now. What might A. Turing think about today?

MikeGall

I can think of several apps I use that got slower for no really necessary reason (in my opinion) -- e.g., browsers and BitTorrent clients. Sure, they added features, and some are even nice to have. But I can still open an old browser in a second or so, while the new one takes 5-10 seconds. The problem with the lack of assembly programming out there is that a huge share (perhaps the majority) of developers don't understand how digital electronics works at a low level. Without a good understanding of how things work, your nice developer-friendly scripting language can be spitting out garbage as far as instruction efficiency goes, and you don't notice because your hardware keeps getting faster. Running your website on a new quad core with 8GB of RAM, when the previous version ran just as fast on a 10-year-old server, should be a warning sign. Sure, your developers are more productive, but at the expense of halving the life cycle of the hardware (and all the increased IT support that entails).

magic8ball

I have gotten quite a bit of mileage out of that nickname.
