
Apple does not determine the fate of technology standards

TechRepublic member dcolbert examines Apple's discontinuation of a few technology standards, including the floppy drive, the built-in modem, and FireWire. He argues that the precedent Apple sets doesn't have a major influence on the IT industry.

In the March 2009 issue of Wired Magazine, Steven Levy wrote an article entitled "An Ode To Vintage Ports," in which he discussed Apple's decision to remove FireWire ports from the Macintosh. He offered some interesting insight into how I/O technology continues to change, evolve, and become more user-friendly, convenient, and inexpensive.

But that isn't what caught my interest in this article. Instead, it strikes me as a great example of what I'm going to call the "Apple Bias" in the media in general. Apple seems to enjoy a certain untouchable status among journalists.

In the article, Levy claimed, "It's often [Steve] Jobs who pulls the plug on a fading port or standard – FireWire is just his latest victim."

You simply cannot elevate the art of Jobs idolatry much higher than this. Levy must envision Jobs sitting high upon a throne in a mansion somewhere in the San Jose area, deciding on a whim which technologies live and die. Did you get this memo? FireWire is dead, and it's because Steve Jobs decreed it so.

Not surprisingly, Apple also considers itself in charge of the fate of technology standards. Levy quoted an Apple spokesperson, "We're often the first company to adopt innovative technologies, and also often the first to discard them." To be fair, this is true. Apple likes to gamble on the bleeding edge, adopting new, unproven technologies and then throwing them out when it appears that they aren’t going to be winners.

If you exist in Apple's world (which is less than 20% of the total market share), then you might see things this way. But the truth is that the PC market has had far more influence on how technologies have or haven't been adopted, accepted, or discarded. In fact, one thing that you can probably get Linux and Windows fans (the other 80% of the market) to agree on is that the most significant part of the market isn't the Mac.

So, what standards have Apple supposedly nixed? Levy pointed to the disappearance of the floppy drive and built-in modem, and then he said that FireWire was "doomed" when Apple dropped the technology from video iPods. Let’s take a closer look at these assertions:

The floppy drive

The problem with Apple abandoning the floppy was that the rest of the PC world was still using it. This decision didn't cause any real discomfort for PC users, but it was an inconvenience for Mac users. I can't count how many times I would see groups share information on floppies, and the Mac users would be out of luck. “Can you e-mail that to me later? I can't read a floppy – my iMac doesn't even have one..."

PC users didn't care, and the floppy survived for a long time thereafter. In fact, USB floppies are still a reality for some PC users. Ultimately, Apple didn't kill the floppy – it was a dying technology that Apple walked away from prematurely.

The built-in modem

I don't think Apple can take credit for this one either. Comcast, AT&T, Time Warner, and even Verizon killed the built-in modem, not Apple.

FireWire

Apple bet on FireWire when the industry adopted USB. Apple didn't decide to "end" FireWire – it lost to USB. While USB 1.0 and 2.0 didn’t have as much throughput as FireWire, that didn't matter for the things FireWire was mostly used for. With the arrival of eSATA, FireWire became even less relevant.

However, billions of PC users didn't bat an eye when Apple's new PCs did away with FireWire. They kept using (and will continue to use) their FireWire-only handicams – myself included – to film their kids and family vacations on their PCs, which either still have FireWire or offer aftermarket FireWire cards.

When Apple comes out with a new device, Levy has to buy it, along with new peripherals. He ended his article by talking about the clutter of obsolete equipment sitting in his closet, now joined by his beloved IEEE 1394 FireWire devices and cables. Levy said, "One day, USB paraphernalia will join them. It'll probably be Apple that pulls the plug too."

Sure, Apple may very well remove USB from its devices first, but that won't mean that USB is dead, because neither Apple nor Steve Jobs has that kind of influence in the industry. It'll simply mean the reign of the USB standard is coming to an end, like all other PC standards before it.

About

Donovan Colbert has over 16 years of experience in the IT industry. He's worked in help desk, enterprise software support, systems administration and engineering, and IT management, and he is a regular contributor for TechRepublic. Currently, his profession...

32 comments
treerod1

After 35 years in the computer industry, the rhetoric of Apple trying to control the industry is not new. It's been going on since 1984, and the media, being no longer journalism but rather a form of entertainment, wants you to believe this is so. In this case, all I've read and heard from Jobs & company is that they were phasing out FireWire - and not supporting Flash, for that matter - because they see something new & better on the horizon. I don't remember reading or hearing Jobs & company say "Hey everybody, follow our lead because we know what's right and because you have to." That didn't happen, but the media would have you believe that it did. Historical fact: Apple does what Apple does, and quite frankly, they generally don't care about what others think. Heck, many times in the past 25 years, they didn't even care what their own fans wanted; BeOS vs. the Mac Classic OS vs. OS X. The Mac community was split between BeOS and the Mac classic OS, while not many wanted OS X. And, if you know Mac history, you know how that ended. To fully understand why Apple does this, just read their Mission Statement, which hasn't changed over the years. When Apple is wrong about something, they pick up the pieces and move on; remember the Newton, the switch from ADB to USB, or the switch from Motorola to Intel. Apple is not going to change ... I wouldn't be surprised if in 2012 Apple drops the Intel processor and replaces it with their own processor. This is all about the product and having fun being innovative, and has very little to do with gaining market share. In regards to market share, remember this fact: Mercedes-Benz has 3% of the market share for cars, yet it's a very successful and lucrative company. And I'm sure Mercedes-Benz really doesn't care about all of you who drive Fords & Chevys.

Jaqui

Since when? Go buy a laptop [other than Apple's MacBook] and it has one. As for floppy drives and disks being dead, I still have and use them. [Heck, I have both internal and external Iomega Zip 100 drives and disks also.]

dcolbert

I had to actually *look* on the side of my T61 Centrino Core Duo Lenovo notebook, but sure enough, right there, next to the Ethernet port (also a port whose days are numbered), is an RJ-11 modem port. (Right beside a standard D-sub 15-pin VGA out port, I should add.) I guess the leading business notebook manufacturer in the world didn't get the memo that Apple killed the internal modem 10 years ago... :)

Jaqui

Only Apple doesn't put them in. Dell, Gateway, HP/Compaq, Acer, Lenovo, Sony, Panasonic [Toughbooks], and Toshiba all still put internal modems in notebooks.

john3347

Someone has to introduce new technology to the public. The public depends on the manufacturer to do this. Once a technology becomes mainstream, it is the responsibility of the users to determine when the technology should be abandoned. If the using public no longer uses a technology, it is time for the manufacturer to abandon it. The using public should not lie down like a sick puppy and let the manufacturer dictate to them what technology is right for their needs.

dcolbert

Do consumers dictate the life-cycle of a product, or do manufacturers? Or is it some combination of the two? Manufacturers have an incentive to create artificial obsolescence. I *strongly* believe that both HDTV and Blu-ray have more to do with limiting the control consumers have over their media content than they have to do with providing a "better" experience. Consumers, increasingly, seem willing to buy into marketing hype from manufacturers and *accept* that this "innovation" is "good" for them, even when it isn't so obvious that it is. Additionally, with HDTV, we see government becoming actively involved in *forcing* consumers to discard old technology and accept new technology - again, technology with benefits that potentially serve corporations and government more than the consumers being forced to adopt that new tech. Was HDTV really about pushing high-resolution broadcasting, or was it a way to introduce DRM and other restrictive technologies into consumer electronics? I'm wary of innovation like this.

wdewey@cityofsalem.net

I believe that the US government required everyone to switch to a digital standard, which is not actually HDTV. Most digital TVs are HD compliant, so that makes HD programming available to people who may not have purchased it otherwise. Digital programming was mandated by the government because there are more options for managing the frequency spectrum with digital signals. Since the number of frequencies that can be broadcast in an area is limited and wireless use is up, managing the frequency spectrum better is good for the consumer. I am trying not to get too complex with the explanation but still get the point across. Sometimes people won't give up their old technologies unless forced. Waiting for an entire entrenched industry to move for its own good could have delayed the benefits of moving to digital for decades. Bill

dcolbert

And, it may very well be the truth. The problem is that the government and the corporations they are cozy with use opportunities like these to implement all kinds of restrictive technologies that never actually worked when they tried to "reverse engineer" them into legacy technologies. Too often, I see industries and corporations try to "put the genie back in the bottle" with emerging technology. They offer you a carrot (high-definition images, wide screens, LCDs) - but they take away freedoms and implement restrictions that serve their own good, not the good of the consumer. When you're *forcing* a move forward on technology, but a side-effect is that it is going to erode consumer influence on a market, and you're claiming it is for the *benefit* of the consumer... it looks suspicious, at the very least.

techrepublic

The only problem with Apple abandoning the floppy was that 80% of the PC world was still stuck with it. This INERTIA caused real discomfort for PC users for years after. To flash your BIOS, you are going to have to find one of those damn floppy things - a working one, yet - and format it, and wait ages while a pathetically small amount of data is written, verified, and read. Ditto the nightmare of parallel ports, PS/2 mice, RS-232, etc. It was obvious that USB was going to triumph back when Apple moved; kudos to them.

dcolbert

I'm not sure how this was a problem for anyone except Apple users. Here is the thing: the PC world went on, using floppies, for flashing BIOSes, for creating boot disks to install OS platforms from CD, for copying small files via sneaker-net, doing all the things they did, for years after Apple EOLed the floppy. And those PCs still cost *less* than the Apple devices - so what was the net gain of *not* having the floppy drive in your Mac? Well, it saved APPLE millions on floppy drives and increased their profits. But (hypothetically speaking) it made it a royal PITA for Sue to get her Microsoft PowerPoint presentation off of her Mac and into the machines at the university lab, and because of that, she got a D on her final and ended up working at McDonald's while all her friends with PCs got high grades on their Windows PowerPoint presentations and went on to comfortable jobs in middle management. Apple's aggressive adoption and discarding of technologies generally only hurts Apple's customers. Apple didn't kill the floppy - as a matter of fact, the very fact that it lived on so long after Apple stopped using floppies indicates that Apple was premature in removing the device from their equipment. Same for parallel, serial, and PS/2 ports, and dozens of other "legacy" interfaces that have had slow, lingering "deaths" as PC equipment.

ScarF

In Steve's dreams, maybe. The figure isn't close enough, not even for the US market: 7.4% in 2009. Robert Cihra, an analyst with Caris & Company, predicted at the end of 2009 an increase of 26% for the Mac, giving Apple a total of 4% market share in 2010. Worldwide, even ASUS or Toshiba are doing better than Apple, with 5.5% each. And this optimistic prediction was published in AppleInsider. IMHO, going from 3% to 4% in one year may of course be perceived as a huge increase. The rest of the PC market - with more than 90% - would have to beat 100% to show the same increase. Now, this is the kind of BS propaganda used by Apple to blow our minds. Fact is that the personal computer market is projected to double this year. Now, this is an interesting figure: for each Mac sold, there will be almost 20 non-Apple PCs sold. Eat this, Mr. Jobs!

dcolbert

I was going off older data that had said that Apple was approaching 18% of the total market share (Linux at about 2%, and Windows with the rest). But I've seen all kinds of numbers, depending on what you're using to quantify your data and what results you're interested in producing. I was being generous. I've argued before that I think these facts and figures are virtually useless (for some of the reasons I allude to in this paragraph, above). I think that basically what you *see and experience* in the field is the best indicator of where those numbers lie - and I very rarely go somewhere and see a bunch of Macs, and almost NEVER go anywhere and see a bunch of Linux machines, but almost EVERYWHERE I go, I encounter Windows systems. So, in a nutshell, I do not disagree with you on this observation. Who really knows for sure.

WLaddR

Apple introduced the personal computer to the 5-1/4" floppy disk when Woz wrote his first DOS over a weekend after seeing one at a trade show. Apple introduced us to the 3-1/2" disk with the first Mac. They may have been out there, but nobody was using them. Apple brought SCSI to the personal computer while the PC world ran on parallel ports. Years later it replaced SCSI with FireWire, which it helped develop, when a cheaper/faster bus was needed.

Apple was the first computer manufacturer to actually use USB when it put it in the first iMac to replace its aging ADB interface. Yes, USB had been on PC motherboards for a couple of years before this, but no PC builder was even pulling the port out to the case. A couple of USB trackballs were all that existed until Apple put its keyboard, mouse, and printers on USB in the iMac. Within months there were dozens and then hundreds of USB devices of all sorts available for everybody.

A few years ago Apple dropped the floppy from its computers. Hard drives were getting bigger and cheaper and USB flash drives were on their way up. You could no longer back up your HD to floppies - it took hundreds - and software was all being delivered on CD, and later on DVD. After a little grumbling, most Mac users realized they could get along just fine without that tiny floppy storage medium. Apple was the first to dump floppies, but the industry quickly followed. Sony has announced recently that it is ceasing its production of floppy disks--so long, good riddance.

Apple dropped FireWire from the MacBook. All its other computers have been upgraded to FireWire 800. Not exactly deserting the interface; if you're going to complain about system improvements, then you should have really raised a stink when USB went 2.0. Apple's never claimed to "invent" every feature or interface or innovation it's put on its computers over the years, but it sure is hard to find anything new or innovative that they didn't install first and make popular enough for the industry to catch up to. Other than burnable CDs, which Apple will be the first to admit they didn't see coming, there are damn few "features" we all take for granted that they didn't first bring to our attention.

lastchip

This article is well argued and is probably a lot nearer to the truth than Apple's rather high opinion of themselves. At the end of the day, I won't use Apple's overrated and over-priced products. I use what gets the job done at a cost effective price. Apple, no more than Microsoft, will dictate to me what I use and how I use it. The only thing that will dictate what I use, is when third party hardware vendors no longer produce that item, and then I don't have any choice.

Jtempys

What's funny is.... none of these technologies are new... or innovative. Was quad core really an innovation? Was Intel 4 really an innovation? The truth of the matter is that these technologies have been known for quite a long time, and that companies chose not to release these technologies from the jump so they had room to "innovate" and keep consumers coming back. The truth is electronics as a whole are obsolete; the simple physics do not allow for further expansion of speed or further reduction of size, hence Quad Core (ominous angel-descent-from-heaven music), etc. No, none of these companies innovated these things; they simply reap the profit of selling them. This conversation is moot, and rather short-sighted. If Apple was truly innovative, the iPad would be a photon computer with holographic display, as that is the current innovation.

Vulpinemac

This is quite obvious by the fact that the iPad is using essentially ten-year-old technology--hardware-wise. However, unlike the tablet devices that ran before it, the iPad innovates by making it something people want to use, which no predecessor could really do. Even the Newton, the first of the PDAs, lacked the simplicity that is changing mobile computing and may change desktop computing as well. Microsoft claimed innovation when they created their 'Tablet' version of Windows, saying "Everybody will be using tablet computers by 2005!" Regrettably, it fell as flat as a leaky football in a championship game. They tried again the next year, with the same result; nobody was using tablets except for a few special-purpose applications in very narrow venues. It really wasn't Microsoft's fault--much; nobody wrote tablet applications for a mouse-and-keyboard operating system. Maybe if they had--maybe if they'd forced the 'touch' environment--Windows Tablet would have taken off. However, by eliminating the mouse and keyboard option, Apple innovated by 'forcing' the touch environment in the iPhone first, and then in the iPad now. As we look today, everybody is scrambling to copy Apple's style. Microsoft's attempt to re-start its 'Windows on a tablet' concept and pre-empt the iPad fell so flat yet again that even HP, Microsoft's partner in the venture, dropped it in favor of obtaining a touch-based OS that could truly show off their tablet's capabilities. And when HP, one of the biggest players in the Windows desktop environment, abandons Windows for a computing device, you know they're seeing the writing on the wall. So no, in this case the innovation isn't the hardware, but rather how the hardware and software have been integrated into a whole that completely changes how things are done. That is what innovation is all about.

dcolbert

Understanding markets matters. If you released a photon computer with a holographic display today, but it had a lousy user experience, an iPad running a single-core 1 GHz A4 CPU would outsell it. Stats on paper are great, but Apple's genius is in how they deliver a user experience. Too many tech companies get caught up in the specs and stats, and ignore the experience.

Brenton Keegan

Actual scientific development of computing is light years ahead of the consumer market. Businesses make more money by releasing products more gradually. You release a photon computer and you lose out on years of profiting from electromagnetic technology.

Vulpinemac

I agree with the sentiment that Apple does not declare a given standard 'alive or dead.' However, it seems that Apple is the first company to recognize when that standard is dying of obsolescence, often years before others see the writing on the wall.

[b]1) The Floppy Drive[/b] Apple dropped the floppy from the original Bondi Blue iMac, and nearly every analyst declared it a major mistake, insisting that the floppy was necessary to everyday computing. However, by the end of the 'gumdrop' iMac run, very few manufacturers were including a floppy as standard equipment--it was an order option in almost every case. Why? Because just as the 5-1/4" floppy became too small to hold application files, the 3-1/2" floppy became too small, emphasizing the need for the multi-megabyte capacity offered best by the CD-R/RW. No longer constrained to multiple disks to load/transfer large file collections or applications, a single CD could hold the equivalent of several hundred floppy disks. At least for a while, this also became a security advantage, reducing the risk of malware getting installed by a corrupted floppy.

[b]2) The Built-in Modem[/b] Again, Apple didn't say 'yea' or 'nay'; rather, they were the first to recognize that the internet wasn't going to rely on antique copper wire much longer, though there would remain a need for a while. Considering how rapidly modem technology was changing, it became both easier and more cost-effective to make the modem an external device full time. This allowed the company to marginally reduce the cost of the computers and ensured that only the people who needed a modem would receive one. One of the things I like is the fact that Apple's external modems have tended to be the physically smallest, making it easy to carry one in the event broadband is not available in a mobile environment.

[b]3) FireWire[/b] Honestly, I'm disappointed that the FireWire standard seems to be fading, but I don't blame Apple; instead I blame the camcorder manufacturers who have embraced USB over FireWire when it is virtually impossible for a USB connection to reliably transfer a video file from the camcorder to the PC (and I don't care what brand of PC you're talking about). Apple recognized that the manufacturers were dropping the standard and simply chose to accept the inevitable. All you have to do is look at every consumer video recorder manufactured within the last 5 years and count on one hand the number that have a FireWire connection. Granted, pro-grade cameras still have FireWire, but you can't find a single consumer-grade device currently on the market carrying that standard. Again, it's not Apple that's abandoning it, but rather the device makers who could best benefit by keeping the standard but chose USB instead. Now that hour-long video files can be stored on a 2GB memory card, it's easier to copy the raw files onto the PC and let the editor convert to the format of choice.

So no, Apple doesn't declare a standard dead, but it is usually the first to acknowledge the change and adapt to it; keeping Apple at the forefront of technology standards.

wdewey@cityofsalem.net

I personally think that flash drives killed the floppy, not CD-RWs. I know very few people who transferred files between computers on a CD, whereas I commonly see people transfer files with flash drives. Floppies started becoming optional as flash drives picked up popularity. Bill

Vulpinemac

I believe the floppy was already being phased out when the first flash drives hit the market. More often, you were looking at Iomega Zip drives and other hot-swappable hard drive concepts. They died themselves due to reliability issues (I couldn't use a Zip disk more than about 5 times on average, though I'm sure others did better.) Meanwhile, CD-Rs were pretty much booming and became the 'disposable' media of choice until the Flash drives got bigger and more reliable. I still have a 128MB flash drive--sometimes it works fine, other times it's DOA. Now you can get multi-Gigabyte flash drives that are even easier to use... and sometimes even less reliable than my old 128M. I may be wrong, but if so, not by much.

dcolbert

I'm OK with Apple taking credit where they actually did influence the choices of other companies. While the Newton was the first commercial "PDA", it was a commercial flop. Palm was born of the Newton, and they made PDAs commercially successful, leading to the iPaqs, and eventually, the iPhone and Android smart-phones we love today. There is a direct evolutionary line to the Newton, for sure - but that doesn't erase the fact that the Newton was a turkey, a stinker, and a commercial flop by anyone's standards. But if Apple wants to take *that* one and make lemonade out of lemons, I'm OK with it... But Apple killing the floppy didn't influence... er... squat. Neither did Apple adopting USB. This is just so much wishful thinking and self-serving "credit-taking" where credit isn't due by Cupertino. That is the problem I have with Apple taking credit (and the Cult of Apple spreading those ideas) for things that it really had no influence on whatsoever. We're really arguing something different here. I'm saying Apple and USB makes not a lick of difference to the success or failure of USB - and Apple doesn't get *any* credit. They were forced to adopt USB, plain and simple. I'm saying Apple dropping the floppy makes not a lick of difference to the decline of the floppy. They did it, and it didn't really affect the decision making of anyone else in the industry, for practical purposes.

Vulpinemac

... but from different directions. Apple didn't necessarily [i]say[/i] 'this technology is the next best thing' or 'that technology is dead,' but they tended to be the first to adopt or drop that technology, influencing the choices of other companies. Now we look at tablet technology, and again Apple wasn't first, just as they haven't been first with any previous technology, but they are quite visibly influencing how other brands approach tablet computing. Quite honestly, you could say that Star Trek, not Microsoft, was the original influence and Apple is the first to make a reasonable facsimile. Working backwards, the PDA wasn't exactly original when Palm and Handspring created their devices, they patterned after the Apple Newton, which itself was patterned after a concept from about five years previously--but Apple's, in this case, was the first commercial device that I know of, and was quite successful as far as it went, though admittedly it became an extremely niche product in the medical field. Until now, no other PDA has really come as close to the original Newton's capability, and the new tablets look to make all of their predecessors look like a Stanley Steamer by comparison.

dcolbert

Your timeline may be accurate, but may not paint the entire picture. While small thumb-drives were available, they were expensive, had limited capacity, and were not as convenient for end users as floppy disks. Early thumb-drives were an expensive, "all the eggs in one basket" proposition when compared to a single floppy or a burned CD. Burned CDs were a hassle (and they were relatively expensive) compared to the easily reusable, more portable format of a 3.5" floppy. I think your mention of the Iomega Zip actually highlights something - it was a time of storage transition. It wasn't APPLE that decided that the floppy was nearing the end of its productive lifecycle, the industry was - which is why so many alternatives were cropping up during that time period. Apple killed the floppy before any of the alternatives had established themselves - and who knows what the impact was - it was negligible because Apple's market-share was negligible. Now if the entire PC industry had killed the floppy at that same time, we would have had something that could be measured. Apple has the freedom to do these things for a number of reasons. They had hardly any market-share (and really, still don't), their market is "trapped" or "locked-in", and their customers are incredibly loyal and tolerant of changes that customers of other PC platforms wouldn't put up with... Apple was premature in killing the floppy on their line - and it didn't affect anyone but Mac users, and didn't have any impact on the rest of the industry staying with or killing off the floppy themselves. If it makes Apple customers and Apple feel better to kid themselves about things like this, that is fine. But the reality is... Apple's decision to kill the floppy on iMacs was insignificant in the life cycle of the floppy drive.

dcolbert

And Apple "killed" the floppy too far before the USB drive was really a viable alternative. We forget looking back, but my first USB drive was a 128MB model, which was beefy compared to the 32MB models that were popular at the time. Now I've got a 32GB thumb-drive. USB drives were certainly the stake in the heart of the floppy as a sneaker-net device, though.

dcolbert

Listen, this is like GM saying, "Fossil fuel is going to go away; because of this, we're simply going to end all of our fossil-fuel-burning internal combustion engine vehicles, because it is a dead-end technology" - today. A fan of this idea would call it forward-thinking progress. I think the GM board of directors would call it "premature". Now, in this scenario, if GM were like Apple, when eventually the automotive industry as a whole moved away from fossil-fuel-burning internal combustion engines, GM would say, "We're the ones responsible for this industry move, because we did it first". Heh. There we go... Apple is the forum poster that wants to post "First", just for the bragging rights of doing so. "I was here FIRST, I did this before anyone else". There is no "first" that Apple claims for which Apple has actually been first. The GUI, the mouse, color graphics, and every other thing they've brought "first" to market was done *first* by someone else - often commercially as well (lots of people will argue that Apple has been the first COMMERCIAL use of these technologies). Apple's headlong, often premature rush to be first has frequently not been to the benefit of the consumer - being on the bleeding edge of Apple technology in particular can be one of the worst and most expensive bleeding edges to find yourself on.

willmuny

Of course! FireWire disappeared to force people to buy the Pro versions to do video, even though a simple MacBook would be fine.

kgretton

FireWire isn't dead. It is just no longer relevant to most users. The user of a FireWire-based camcorder is unlikely to have the latest Mac. If they do, then their camcorder is probably of a similar vintage to their Mac. Apple still supports FireWire on all but the MacBook - it's the faster FW 800 version, but that is backward compatible. Apple is not a follower. It does not follow the PC industry. It makes its own trail. When Apple adopted USB on the iMac back in the '90s, it was not even supported on Windows. I remember having to wait until Windows NT 4.0 to get even the most flaky support for USB, whereas it just 'worked' on the iMac. Mini DisplayPort is a similar story today. Apple has innovated because it has to, for the connector form factor, etc.

nonseq

they just call it iLink or something

dcolbert

And it doesn't explain why the USB standard, an interface made by a consortium that included Microsoft and Intel, but not *Apple*, would be adopted "first" on iMacs, when Apple was clearly behind FireWire at this point. This is Mac revisionist history. USB is a PC standard made by PC manufacturers: http://en.wikipedia.org/wiki/Universal_Serial_Bus USB would never have succeeded without the momentum that the PC market gave to peripherals that support the USB standard. If you doubt it, just look at the success that FireWire met. And that the mini DisplayPort is meeting today. There is a much loved interface in the industry. Or ADB... there was a great interface. How about NuBus? It seems to me that Apple's non-standard interfaces have only ever had modest success when the larger PC community as a whole has adopted them. (Or when Apple has adopted an interface that came from the PC world and then claimed it innovated. I'm surprised Apple isn't trying to claim that it was the first PC to leverage Intel architecture technology.)

Apple innovates - they blaze their own trail, I'll absolutely grant you that. And often, on paper, the technology they back is *superior* to the "commodity technology" alternatives available in the standard PC world. But I think more often than not, this innovation has led to dead-end technology. It is telling that Mac had to give up almost all of their proprietary "Classic" architecture and adopt PC technology almost across the board, in the 90s, to save themselves from what looked like certain bankruptcy. Intel architecture PCs, AGP and PCI expansion buses, USB... these are all things that the broad PC world was first to market with. Apple had to give up on all of the alternatives. Innovation and "great-on-paper" are surefire paths to commercial failure in the technology arena. This is a lesson Apple hasn't learned. The time-line Apple is on currently so closely mimics the arc of their first time down this path, I wouldn't actually be shocked to see Steve Jobs depart Apple sometime soon to follow some *other* goal of his... maybe making an overpriced super-computer or creating a 3D animation startup...

The point remains, Apple may blaze their own trail, but they do not have nearly the influence on the industry as a whole as they like to think they do. Anyone who buys into this PR from Apple hasn't been watching the company for the last 30 years. Almost every time Apple blazes their own trail, there is a dead-end of poorly supported technology that has an arbitrary end-of-life declared that serves Apple consumers no good at all. Only Apple profits from their "trail-blazing" innovation.

Addendum - Vulpine has also made the claim that the iMac is what popularized the USB standard, which was created by Microsoft and Intel, among others, but which had nothing to do with Apple. This article - http://technologizer.com/2008/10/17/firewire-isnt-alone-a-brief-history-of-features-apple-has-killed/ - which is actually one of those "Look at how Apple innovates and leads" kind of articles, still supports my claim that Apple was *behind* FireWire and *against* USB. They adopted USB because they had to - not because they wanted to.

Vulpinemac

[i]"... it doesn't explain why the USB standard, an interface made by a consortium that included Microsoft and Intel, but not *Apple*, would be adopted "first" on iMacs, when Apple was clearly behind FireWire at this point."[/i] I don't think anyone will deny this specific statement; however, the argument is that Apple was the first company to make it their 'standard' connection over all others. USB was 'available' for the general PC market, but until Apple made it their 'only' standard connection, there were very few USB devices available. As such, the drive to make USB devices for Apple kick-started the USB migration for everybody else. You'll need to read my response to the article itself for my comments about Firewire, but to summarize, Apple didn't kill it, the peripheral device makers did. And if you look now, nearly every device Apple makes, barring the iPad/iPhone/iPod, carries a Firewire 800 port to accommodate what few devices remain in use. And thank you for the acknowledgement. Yes, I agree that Apple was behind Firewire--they were part of the group that created that standard. That's why Firewire was the only other I/O type of connector on the original iMac... well, and the sVGA video connector.

dcolbert

The thing is (and I'm glad to see you've arrived here, Vulpine) that I suppose it is easy for Apple to make this claim, that USB was "languishing" until the iMac standardized on USB as an I/O interface - but... I think that this overlooks some important mitigating factors. All PC I/O methods take a while to catch on, while legacy methods die out. That was what happened with PC USB. It just so happened that Apple jumped ship to USB on the iMac shortly before USB started to really catch on. It doesn't matter what standard is emerging (AGP, PCI-X, USB... PCI...), they always spend some time in a new PC as "the thing that there isn't anything made for". Some of them never really take off (what is the little PCMCIA slot I've got on my HP DV8030 and my S-10? It is like a half-sized PCMCIA... I've got a GB NIC card and a video-in card... but as a technology, it is clear that this one arrived kind of DOA)...

Finally, claiming that the iMac led to a flood of USB peripherals ignores the fact that the majority of Mac USB peripherals of this era would work with any PC, but many PC USB peripherals (of which there were way MORE) would not work with a Mac. The iMac standardizing on USB just as USB peripherals started to replace other peripherals on the market was coincidence. The iMac represented such a relatively tiny portion of the USB market at this point (Apple was at well under 10% market share at this point, possibly under 5%...) that it just doesn't make sense that manufacturers would go, "Oh! Apple is supporting USB... only NOW should we start making USB devices". It isn't how it happened. Apple realized, "Everyone is going to make USB devices and not FireWire because there are a bajillion PCs and not many Macs, so we better put USB on the iMac or all of our home consumers are going to have PCs with no peripherals". :)

Now... FireWire was way ahead of USB... and was all Apple... can't deny that at all.