Companies create new products so fast that sometimes things are obsolete before they hit the shelves. This Classic Tip highlights one such item, and we mention a few more.
Today's Classic Tip comes from TechRepublic's Gizmo News TechMail dated 7/5/2000 and highlights a fabulous new technology that completely failed to go anywhere:
Today's Gizmo News 7/5/2000
SONY UNVEILS DOUBLE-CAPACITY CD-ROM
TOKYO (Agence France-Presse)—Japan's Sony Corp. announced Wednesday it
was launching a CD-ROM with double the storage capacity of current
disks to meet demand for heavy-duty digital recording.

The new disks, with capacity to store 1.3 gigabits of data, would be
licensed in September in conjunction with the Netherlands' Royal
Philips Electronics NV.
The "double density CD-ROM" was made possible by simple modifications
to current disks, which can only store 650 megabits of information.
Features on the new disk have been further miniaturized, but customers
wouldn't have to adapt equipment because the basic CD-ROM
specifications weren't changed.
Double-density CDs were supposed to solve the portable storage crunch we found ourselves in. At a mere 650 to 700MB, CD-Rs and CD-RWs couldn't store all the data we needed. The double-density CD was going to fix this problem using the same basic technology we'd been using for years in regular CDs.
The problem was that DVDs had already solved it. Holding more than three times as much data as the double-density CD promised, the DVD standard was the real answer. DVD recorders at the time were prohibitively expensive, but prices were falling fast, and double-density CDs soon got squashed.
The double-density CD isn't the only technology that didn't last long enough to get its 15 minutes of fame. Here are a few others:
The 2.88 MB floppy
While we're in the storage arena, I thought I'd mention the 2.88MB floppy disk. This was supposed to have been the successor to the ubiquitous 1.44MB floppy disk. And if you thought about it for a second, it was a logical extension. The 3.5-inch drives started off at 720KB, then doubled to 1.44MB. Why not save more stuff on 2.88MB floppies?
Because they cost too much. That's why.
The added cost of the new drives, along with the cost of the media, priced them out of the market. By 1991, when the format became widely available, it didn't offer enough of an advantage over 1.44MB disks to justify the cost.
The Intel 80186 CPU
By now you know that the original IBM PC came with an 8088 processor, which was the little brother to the 8086. And you know the next big thing was the 80286 on the IBM AT. That was followed by the 80386, 80486, and so on until Pentium et al. But what about the 80186? Did Intel just skip a number for some reason?
No. Intel did indeed produce a CPU called the 80186. It debuted in 1982. It was a full 16-bit CPU that ran at a blazing 6MHz, with versions available as fast as 12MHz. Because of the improvements Intel built into the chip, it could process data significantly faster than an 8086 running at the same clock speed.
There were actually a few computers introduced with the 80186 CPU, most notably the Tandy 2000. It was considered to be MS-DOS compatible, but not PC compatible. That was a fine distinction, which essentially meant that software that went through the operating system would run fine, but anything that made direct hardware calls would fail. Unfortunately, to speed things up, most programmers made direct hardware calls, which left the 2000 largely incompatible.
The 80186 failed primarily because of the 80286, which became available almost immediately after the 80186's introduction. The 286 was initially cost-prohibitive, but the IBM PC market was still coming together in the early '80s. By the time the market had settled down, the 286 was cheap enough to be viable and fast enough to relegate the 80186 to the embedded processor market, where it still lives on in some devices.
The Light Pen
Figuring out the best input device other than a keyboard has been a kind of Holy Grail of computing. One device that was supposed to liberate us from the keyboard was the light pen. This simple device attached to the computer and was used to highlight things on the screen. It detected where the electron gun on a CRT monitor was firing and translated that into a position the software could use.
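As a rough illustration of that timing trick, here's a short Python sketch of how the elapsed time since the vertical sync pulse could be turned into a screen position. The numbers here are made-up, NTSC-ish values for illustration, not the specs of any real light pen interface:

```python
# Hypothetical sketch: a CRT draws the screen one scan line at a time,
# so when the light pen's photosensor fires, the time elapsed since the
# last vertical sync pulse tells you where the beam (and the pen) was.

def pen_position(elapsed_us, line_time_us=63.5, h_pixels=640, v_lines=480):
    """Convert time since vertical sync (in microseconds) to (x, y).

    line_time_us: duration of one horizontal scan line (illustrative
    NTSC-ish value); h_pixels/v_lines: assumed screen resolution.
    """
    line = int(elapsed_us // line_time_us)       # completed lines -> y
    offset = elapsed_us % line_time_us           # distance along the line
    x = int(offset / line_time_us * h_pixels)    # fraction of line -> x
    y = min(line, v_lines - 1)
    return x, y
```

The real hardware did this with counters latched by the pen's photosensor pulse rather than software arithmetic, but the idea is the same: position on a CRT is just timing.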
Because monitors are now almost entirely LCD based rather than CRT based, a light pen wouldn't work in today's computing environment. But it was obsolete long before the LCD came on the scene.
The light pen failed mostly because there wasn't an interface that could take advantage of it at the time. When it was introduced, the GUI hadn't taken hold yet, and text interfaces gave you little to point at. On a practical level, a light pen also meant you had to hunker close to the monitor to make selections from the screen.
A much easier alternative came in the form of the computer mouse. About the time PC makers like IBM and Tandy started considering including a light pen port on their machines, the Macintosh came out with a full GUI and a mouse. It was lights out for the light pen.
The EISA bus
Of the ones on the list, the EISA bus probably had 14 minutes and 30 seconds of fame. It was an attempt by Compaq, HP, and others to create a standardized bus interface that would extend the life of the 16-bit ISA bus without paying exorbitant licensing fees to IBM for Microchannel.
The Microchannel bus design was IBM's attempt to kill a couple of birds with one stone. First, configuration had become a nightmare with PCs. Hardware interrupts and memory locations were in limited supply, and you had to set them manually on each card using DIP switches or jumpers. Microchannel would configure the card automatically when you installed it: a form of Plug and Play, though not as we know it now.
The Microchannel bus was also going to be 32-bit, which meant it would be faster and work better with the new 32-bit 80386s that were on the horizon.
Finally, the Microchannel bus was proprietary, and IBM thought it would be a good way to wring some extra dollars out of all the impertinent clone makers who were taking money that belonged to IBM. The lowly clone makers had a better idea. They came up with their own product, which did all the same things that the Microchannel bus did, was compatible with the old ISA bus, and didn't come with fees.
The EISA bus didn't last too long, however. It was deployed in some servers, but that's about it. The problem was that something even better was on the horizon: the PCI bus. It was faster, did much the same thing, and included configuration routines that the operating system could handle through modern Plug and Play, rather than through configuration disks that had to be maintained.
As with everything, the problem is usually timing. Most technology fails because it's introduced right about the time something better is coming out. What other technology can you think of that never got its moment in the spotlight, or didn't last long once it did? Let me know in the Comments section.