I'm concerned that, even among IT professionals (that is, people paid to manage other people's computers), there is often a very narrow view of what is "correct." Some people not only insist that their particular choice of IT tools is the best, they demand that others acknowledge it. That attitude seems more appropriate for the Inquisition than for a business setting, where a practical view of technology would be more productive.
Of course, I've seen this sort of fanaticism before; all of us have. The difference is that I've seen it over and over and over again during decades in IT. But — and this is important — each time it was a different set of "best" tools or "best" solutions being fought over fanatically, most of which are long gone.
(By the way, if you think none of this really applies to you, skip to the bottom to see how personal this can become.)
As an IT consultant, I never saw that as the best way to serve clients, who are far less concerned with what is "best" than with what works, and works quickly and cheaply. At one point about a decade ago, the constantly changing IT landscape (and the push to keep up) led several business publications to question whether IT was even paying for itself, let alone showing a profit for most businesses. Return on investment (ROI) was a major buzzword for several years.
I have been around much longer than most of the people whose job is essentially to maximize uptime on corporate networks. Probably for that reason, I have a very different view of some important IT topics, because I've seen far too many "killer" apps, marvelous new operating systems, and programming languages that promised to end all vulnerabilities. Or perhaps it is just that I've had a different mind-set from the beginning.
"Not perfect but Tuesday" makes engineers cringe, but to a business owner the bottom line is what needs to be perfected. Over the years, I have always looked for the simplest solution. I trained as a physicist and often worked with engineers who developed elaborate and sometimes elegant solutions to problems we were facing.
Sometimes the ideas were impractical in the extreme or simply not what I saw as the easy fix. Some would have worked but were too expensive, would take too long, or required people to change the way they did things — that is always a serious problem.
People don't like change. In individuals, that can lead to fanatical demands that everyone recognize one favorite OS or application as best. The person making the demand may simply be avoiding change by defending his or her status quo.
In companies, it is difficult (and expensive) to implement major changes in the way executives and office staff work. Management often sees the "best" fix to a problem as the one that disrupts least.
Here's a simple historical example of the pragmatic approach to security that shouldn't raise the hackles of anyone because it doesn't involve any OS or application. Back in the '60s, we had a major security scare, probably the first in commercial IT history, over the possibility that rioters could disrupt our service bureau operations. We were doing a lot of work for the government during a very unpopular war. This wasn't a far-fetched concern; riots in Harvard Square and the shootings at Kent State were still fresh in our memories.
Proposals were made to put bars on the windows, install alarm systems, barricade the front door, even hire full-time security guards. But those were long-term fixes or weren't in the budget.
I suggested we take the big sign out of the window, the sign that identified us as a computer center. We had no walk-in clients, and our customers knew our location; the sign was just an expensive frill — it served no practical purpose except as a target.
Not an elegant solution perhaps, but it had the virtue of being cheap. Five minutes after the meeting ended, we were once again just another anonymous business on yet another side street. (Obscurity is the first, but never the ONLY line of defense.)
I usually approach software threats with a similar mind-set. There may be workarounds available, but sometimes it is simplest to just stop using a threatened application until it gets patched and the patch gets tested.
You would think that after decades of new operating systems, languages, and applications that sprang up, became popular, and then faded into obscurity, IT people would have learned not to become too attached to any one concept. Is there anyone out there still using VisiCalc, BASIC, SCO UNIX, CP/M, an Apple II, or MS-DOS in a business setting? But too many people in IT have never used any of those and think what they started using will be around forever. Either schools aren't teaching enough history for them to develop perspective, or they aren't taking the lessons of history seriously enough.
So today, we still have people arguing that their solution is the "correct" one — not just for their situation but for everyone — or that their OS is the "best." That used to be the province of home users and a few wild-eyed fanatics, but the mind-set is penetrating too many IT departments.
I suspect that most of the fanatics have only used one or two operating systems and that their experience in the corporate IT world is probably limited to a classroom along with work experience in only one or perhaps two companies. It doesn't matter how big the company or how popular the operating system — that just isn't wide enough experience to give a true professional any real perspective, although they obviously think it is enough.
When people tell me they have the "best" OS, I recall when any operating system was a luxury and later when DOS was by far the most widely used platform. While I wouldn't go back to those days, remembering how popular MS-DOS was reminds me that both Linux and Windows have only been with us a relatively short time, and, if history is any guide at all, might be gone in another decade.
Is your programming/scripting language the most secure? The easiest to use? The most productive? In short, is it the final word in programming? How about your favorite browser? Just how long has it been around?
Do you really not understand that Lotus 1-2-3 and VisiCalc users used to argue the same way? Ever used either one?
Today's Windows vs. Linux argument may someday seem as meaningful as the heated debates between WordStar fans and MultiMate or XyWrite users, and by then neither Windows nor Linux may be in mainstream use any longer; history is the best predictor of the future.
I remember when the whole world was supposed to change over to the Ada programming language because it was designed from the bottom up to create the most reliable and secure applications possible. Its use was even mandated by the federal government, which is about as big a backer as you can have.
Anyone out there used Ada lately? Probably not. The compiler was awful, and you were lucky to write five lines of good code a day.
But, lest my point be entirely lost in the "noise" over OS or browser wars, let me remind you that, as IT professionals, our job is to provide what is best for the user — not what seems at the moment to be best in some theoretical fantasy world where we have unlimited budgets and users learn every new application on their own time in five minutes. Keep that technology for your home system where you can gloat over how superior your system is.
In the real world, change is expensive but also inevitable, and a tipping point is always reached where massive changes make financial sense. On a personal basis, looking back at the brief history of IT, you need to ask yourself: Is OS (or app or browser) fanaticism the best way to serve your users?
IT is still in its infancy. Compare it to the automobile industry where people had to build their own cars at first, then companies made a few models that most people bought. Eventually fancy upgrades such as automatic transmissions were introduced to let more people use them more easily. Those became so popular that it is difficult to find a car without one today.
People complain about the expense and the low mileage in part due to weight (ever tried to pick up an automatic transmission?), but they don't consider going back to more efficient standard transmissions (yet!). But faced with a looming crisis, there is serious talk about turning back to a once-popular technology — the electric car, a technology beaten out by internal combustion a century ago.
In another decade, will we see wisdom in turning back to simple, tiny operating systems that just let a few applications run quickly and securely? When everything is a Web app, will it matter what OS your client system is running? Will you be using a new browser tweaked to run Web apps?
No muscle car enthusiast of the '60s who fought over the superiority of Chevy vs. Ford could have foreseen the return of the electric car. Not many of them are designing cars these days either.
Things happen faster in the computer industry. It took 100 years for the internal combustion engine to start looking a bit impractical. It may only take five or 10 more years to make Windows and even Linux start to look ancient. What will happen to those who are today too fanatical in their views to even consider using other software when it makes sense?
Me, I'm more practical. I didn't fight progress when assembly language was developed. COBOL was a great step forward. UNIX and C looked pretty good compared to the alternative. BASIC had its uses. The PC seemed like a good idea at the time. I even worked with Ada a while.
How long would I have lasted in IT if I had insisted that machine language and punch cards were the only "correct" way to program computers? Lots of IT workers were unable to move from mainframes to the PC. Today, the Internet stores our every comment forever, and with more and more employers doing background checks on the Web, just how much of a survival trait is fanatical defense of a particular way of doing things?
Does being narrow-minded and right-fighting serve either your employer (or clients, in the case of consultants) or your career? I have certain things I believe in and won't compromise on, but some brand of software just isn't important enough to fight over — I know it won't be around forever. Keeping an open mind and an eye on the practical solution that is best for the "bottom line" has also served my clients well over the years.
So, you need to ask yourself: Does IT fanaticism get in the way of serving your employer? Does it narrow your own potential in this most pragmatic of technological fields?
As a boss, would you rather hire/promote a fanatic, or someone who approaches problems pragmatically? And, remembering that his or her brand of fanaticism may not match up perfectly with yours, which boss would you rather work for?