Smartphones

iOS vs. Android race is too close to call

It's too early to declare Apple the victor in the mobile device OS war, according to Donovan Colbert. Read his thoughts and predictions on iOS and Android.

I recently read two blog posts with themes I want to explore with TechRepublic readers. The first was by a TechCrunch blogger who proposed that the iPhone on Verizon has stopped Android's forward momentum, and that the iPhone 5 may reverse Android's climb. The second was by ZDNet blogger James Kendrick, who suggested that Android tablets just don't have what it takes to compete with the iPad because of fragmentation. (Note: ZDNet is a sister site of TechRepublic.)

Thoughts on the TechCrunch blogger's post

I think most people who've joined the ranks of Android owners are pretty well informed; they understand there are hassles and headaches associated with Android that Apple device users don't experience as often. Android suffers from more apps force-closing or not working correctly. Android does not handle battery management as well as iOS. And because of the wide variety of hardware platforms and device formats, Android has more app-compatibility issues across devices than you'll encounter on iOS.

But Apple dragged its feet, and Verizon established Android as a compelling alternative to iOS. We'll never know exactly how or why that all went down, but as consumers, it is where we find ourselves today. (If I were a conspiracy theorist, I could argue that the exclusivity lasted just long enough for a credible alternative to rise and head off federal antitrust scrutiny.)

There are two main platforms for digital devices: iOS and Android. The TechCrunch blogger assumes that the iPhone on Verizon stopped Android's growth and that the arrival of the iPhone 5 will reverse it. That would require a mass exodus from Android to the iPhone.

I'm an Android user, and I've got an iPad and an iPod Touch; I'm familiar and comfortable with both platforms. I have no plans to dump the Android platform for Apple, and unless Apple makes some changes that I think are *very* unlikely, that won't change. When an iOS device has USB host and SD card support, maybe we'll talk. I think there are a lot of Android users out there who feel the same. While we cannot deny that the Verizon iPhone seems to have momentarily stopped the forward progress of the Android OS (and it is possible that the Verizon iPhone 5 may temporarily cause Android to retreat), it would be premature to declare Apple the victor in the mobile device OS war. The same factors that allowed Android to jump ahead of all other challengers will continue to drive Android's growth going forward. Apple and Android fanboys can argue endlessly about this, but the truth is, it is anyone's guess how it will all turn out when the dust settles.

As it stands today, iOS doesn't do enough to ever be more than an accessory device. The voluntary limitations placed on the iOS platform by Apple, and accepted by iOS users, mean that iOS devices will always be bound and beholden to more powerful platforms like OS X and Win32/64 for "real" work. The fact that the same iOS platform that runs on an iPhone or iPod Touch remains basically unchanged on the iPad and iPad 2 illustrates this point.

An Android device, on the other hand, begins to bridge the divide between a mobile OS like iOS and a full-fledged OS like Linux, OS X, or Windows, and is arguably more robust and full-featured than ChromeOS. Android 3.0 Honeycomb (despite frequently getting a negative response from the press) seems to be a significant effort to widen this distinction for the Android platform. Honeycomb is doing a lot more than traditional Android or iOS. Perhaps the bigger problem is that Honeycomb isn't holding anybody's hand in delivering its capabilities. I've heard lots of complaints in the blogosphere about Android tablets not supporting certain video formats. My experience with the Coby Kyros 7015 led me to RockPlayer Lite, a full-featured alternative media player with wider media format support than the bundled Android player. This is the trade-off: in being comparable to a traditional desktop OS, Honeycomb expects the user to be a little more skilled in operating the device and to take the initiative when necessary.

Ultimately, I don't think it matters tremendously whether iOS or Android is in the lead, because I believe the lead will be marginal. There really isn't any killer app to lock users into either platform. There is no Microsoft Office as a ubiquitous application that just works better on one platform than the other. Pages and GarageBand aren't going to cause a mass exodus. There is going to be a stable and growing number of Android users worldwide. Likewise, Apple iOS users probably haven't topped out yet. I think both platforms are starting to achieve maturity. Growth will not be as rapid or as dramatic for either, though one may shoot ahead while the other stays dormant. It really is fanboyism at this point to predict, "the iPhone 5 will reverse Android growth on Verizon." It might, though if it does, it will likely be temporary. Head to head, depending on your priorities, iOS or Android "wins" on the merits the competitor loses on, customer by customer. I love the built-in, free, turn-by-turn GPS guidance on Android; it is a value-add that Apple hasn't met or beaten yet. The iPhone 4 has maybe the best mobile camera on the market; no other handset has really delivered a complete smartphone platform with such nice camera optics. Pick your poison. What can you give up, and what can't you live without?

Thoughts on the ZDNet blogger's post

The ZDNet blog about Android tablets suggests that perception and a uniform, non-fractured market are critical to success. I think this misses the point. In the early days of the IBM PC era, the hardware was fractured, and there were issues the market was still working out. You had to know the difference between a clone and a compatible, and what that meant for how MS-DOS and apps would run on your system. There was a lot of confusion and hesitation. Yet the flexibility of the PC platform, the competition among multiple vendors, and the freedom of consumers to leverage their equipment without vendor limitations led the PC to become, and remain, dominant even to this day. I'm not claiming that a fractured platform isn't hurting Android; I'm saying there is ample evidence that a uniform, non-fractured market is not critical to success. I've said several times that with Android, iOS, and emerging personal digital devices, we're in roughly the same place as the 8-bit era of personal computing. It is brashly premature to try to predict how this is all going to sort out, and these devices are in their infancy as far as design and features are concerned.

Conclusion

I think it is wishful thinking to hope that the iPhone 5 will end Android's forward momentum, or that Android tablets will disrupt the success of iOS devices. Neither platform has the leverage to unseat the other. It will take a critical misstep by one to give the other a strategic advantage. Barring that, this will remain a race that is too close to call for the foreseeable future.

About

Donovan Colbert has over 16 years of experience in the IT industry. He has worked in help desk, enterprise software support, systems administration and engineering, and IT management, and he is a regular contributor to TechRepublic. Currently, his profession...

40 comments
greggwon

The remarkable difference between third-party hardware on PCs and Apple's limited choices accounts for the difference in stability. Apple provides limited choices in hardware add-ons. That limits flexibility, but it provides a better experience for users, because driver software can be better when Apple writes it, or at least Apple can quickly work around or fix bugs in what it has written. In the PC market, when margins are slim, there may not be funding to fix something that is wrong with a driver. We all know that drivers ship broken for many things because the internet connectivity we have today lets people get updates installed. Or a fixed driver is put into the Windows distribution, and thus the vendor doesn't have to maintain a web site and bandwidth to distribute updates.

Apple being THE manufacturer and THE sales point for all of its product lines is a giant part of what makes it possible for consumers to have a much better experience. The price points support Apple Stores staffed with dozens of people. Go to Best Buy and look for a "sales" person who can tell you every detail about a product, or who knows exactly who in the store is the expert! It's a night-and-day difference, and this is what I hear from everyone I know who has switched to Apple products for their computing needs. Single-source, single-vendor, proprietary products can be highly successful. Do you have a Maytag washer, a Samsung dryer, and a KitchenAid dishwasher? Most people have a single-brand set of appliances because they can easily see the value in that.

Vulpinemac

Personally, I haven't had the desire to spend thousands of dollars on applications; I do own Photoshop CS3 (just Photoshop, not the full CS suite) and Final Cut Express 4--they meet my needs and cost significantly less than the full Pro versions. I haven't seen the need to spend hundreds more just to get the, what, CS7 now? To tell the truth, I don't care what platform I'm using; I'm not willing to spend massive cash on apps when there are less expensive--admittedly less feature-rich--alternatives that do the same job. I don't even use Microsoft Office because of the cost. One thing I do know is that many, if not most, of the documentaries I view on channels like History and Science are edited in Final Cut--almost always going through initial editing right in the field. Certain features of Final Cut are rather complex--not easy to use when you're working on battery power and need to work quickly. I think the new version simplifies and streamlines much of the more tedious, time-consuming detail work so the editor can concentrate on the production of his video. I will admit I'm not happy that I won't be able to use it on my current Mac, but then my newest Mac is now four years old and nowhere near ready for replacement.

Alan

Many years ago, at the dawn of the personal computer, ALL programs were written for the Mac platform first. Yes, at the high end there was Unix, with companies like Silicon Graphics, which has since gone bankrupt. Apple was leading the world as a computer company. Macworld was so big that there were few venues in the country that could hold all the companies that wanted to show their software on a Mac. If you had decided that the PC was the way you wanted to go, it was months or years before a company would port its software cross-platform to the PC.

Fast forward to today: Apple dropped out of Macworld, and due to lack of interest it can fit into most hotel convention centers now. Viruses were a problem exclusive to Apple in the early days, and now not even the hackers really care about the platform. Apple has squeezed all the real innovators off its platform in favor of the PC. Now, again, Apple had the de facto lead in the world of smartphones, and they chose to rule it with their typical iron fist. They tried to dominate and dictate rules and terms to programmers, users, and even advertisers. This has led many developers to move to Android. The advertisers are following the eyes, and new development is following that money. As a developer, should I make an app to reach 15% or 40% of the market first? Hmmm, not a lot to think about here. Press releases this week announced 500,000 Android activations a day. With the sheer number of Android devices in the market, why are developers going to develop for iOS first, or in some cases at all? Yes, history repeats itself: Apple will snatch a footnote in history from a position of world domination twice.

greggwon

Sure, the iPhone, at the beginning, was the defining "thing" that suddenly woke the cellphone marketplace up to the idea of "a computing device having phone services". But the bigger issue now is the marketplaces for the devices, the software capabilities of the platforms, etc. Devices will always be changing, but devices will not be able to compete without software systems that allow them to do so. Camera megapixels are something consumers do know about and care about, and they will drive phone sales. But those megapixel upgrades have to work with the software on the devices and make it possible for a lot of things to work better. The conversation about success really should be about OS updates. One of the key features in iOS 5 will be the new "messenger" app. This app will work like SMS but will not require SMS "services". I'd guess there will be some SMS gatewaying going on, but I am not sure how it will be implemented. Given that this is an Apple-only feature, it does suggest that there will be at least one more iOS-specific differentiating feature in iOS 5. Personally, I use AirPlay so much now that I don't know how I could use an Android or other device that did not let me do that. I have speakers and an outdoor wireless AP in the back yard. I flip it on when I go outside to work in the yard or just lounge on the porch. I can pull up the iPod app, select something to listen to, press the lock button, put my phone back in my pocket, and go about my business.

Vulpinemac

While I do agree that it's much too early to call iOS the winner, I also believe it's too early to give that award to Android. Worse, we now have HP's WebOS TouchPad and smartphone package coming along with even better interoperability between devices, and the Nokia/Microsoft partnership trying to bring WP7 into the game. As yet the field is wide open and will remain so for at least the next couple of years. That said, I think Donovan's perceived limitations of iOS are more its strengths than its weaknesses. Android is quite obviously meant to be a complete stand-alone device to replace a laptop for those of us who use them as productivity tools on the road. I don't fault the concept, but I honestly feel that it's an effort to put too much load onto a package that has too few resources to handle it. It's like taking a Honda four-cylinder engine and putting a Cadillac Coupe de Ville body around it: it runs, but it struggles to keep up with traffic. Instead, I see iOS as purposely designed to supplement the desktop/laptop, not replace it--like adding that extra hundred horsepower to the Caddy's big V8 when it's really needed. We've already seen that Apple is linking many of its own software packages across the iOS/OS X platforms, and with iOS 5 and iCloud, Apple claims the ability to seamlessly synchronize files between devices without the need for memory expansion cards or other manual file manipulation. In other words, I see Android and iOS as two completely different methods for accomplishing the same tasks. Which is better? I'll leave that to personal opinion.

mcquade181

Donovan, you obviously have not looked at the Nokia N8 if you think the iPhone 4 has "the best mobile camera on the market". The iPhone's camera is average compared with the N8: 5MP vs. 12MP, and a quite poor LED flash vs. a xenon flash. All the photographic reviews of the N8 that I have read seem to agree that it has by far the best mobile camera on the market. The iPhone has better camera software than the N8, and that is the essence of the iPhone as a whole - great, easy-to-use software in a package with average hardware.

Slayer_

For personal use, of course - not a server. I want to see its specs so I can compare it to a typical PC.

dcolbert

I can't quite put my finger on it, but it speaks to the evolution of what the Apple brand and Macintosh mean to end users. I guess I knew this intuitively after years of being involved to a certain degree with the Mac line - but this really highlights that there is a certain schism between what Macintosh has meant and been *traditionally* and what it is BECOMING. I'd argue that Apple almost went broke catering to the TRADITIONAL Mac user base to a fault, to the point of alienating a more mainstream audience. Apple has moved away from the concept that the Macintosh is a niche machine for professional graphics, publishing, and video applications. The truth of the matter is that the more they've moved away from this, the better the company has done - to the point where they enjoy the most significant share of the PC desktop market in their history. I don't think this alone explains those results; moving from Motorola and PPC to Intel was big, and better support for commodity PC components and interfaces (even Apple-specialized ones) - like USB and industry-standard GPUs - helped tremendously. Is there a Mac base that correlates roughly to the "hardcore gamer vs. casual gamer" argument? I mean, if we draw an analogy there - the Wii still reigns supreme, even though most gamers universally acknowledge at this point that the WiiMote novelty basically wears off once you move past bowling - but Nintendo doesn't care. Let the "hardcore gamers" fight it out for #2 and #3 on the Xbox 360 and PS3 platforms. They're a marginal part of the overall gaming market anyhow, come to find out. Doesn't the same thing apply here? Isn't this almost the Jobs approach toward iOS limitations? "Let them eat Android if they want... flash... porn apps... whatever else". Anyhow, interesting direction you two have taken the conversation in.

Alan

Yes, I understand that the cost of apps is sometimes crazy. It does take a lot to write, upgrade, and support them. There are a lot of lower-cost applications out there that do a great job too. When Apple had to compete on price as well as performance, they lost interest. I am still using a system with CS2 as well as a system with CS5. You cannot tell the difference by looking at the finished product. I am riding the Google horse for now, and there are plenty of issues there too. I just think they are doing a better job of looking after the user, and that means me. So as long as they do, that is what I will buy for my phones: Android-powered hardware.

Vulpinemac

I lived through that era, and the majority of what you claim simply is not true. Apple, not the Mac, had about a 50% market share before the IBM PC came out, and with Microsoft's tie-in with IBM for DOS and later Windows, along with Microsoft's licensing the OS to anybody who wanted to build to that platform, the PC had an automatic foot in the door to the enterprise. No, the Mac hardly had any viable market outside the graphics industry for many long years, and what market it did attain got eroded away by a very haphazard method of upgrading and marketing. Quite honestly, it got to the point that nobody was willing to buy a new Mac unless they absolutely needed it, because an upgraded model was guaranteed to be announced every three months pretty much all the way through the '90s; the machine you bought today was obsolete tomorrow. Yes, gaming software was made on the Mac first--the graphics capability at the time was superior and the OS was simply easier to use--but PC enthusiasts saw the discrepancy, and soon ATI and others were building more game-centric video cards that eventually surpassed the Mac's capabilities. By the time Win '98 came out, almost all gaming had migrated to Windows.

Your argument about Apple dropping out of Macworld due to lack of interest is also false--Macworld is still a large venue, though smaller now because Apple itself doesn't attend. The reasoning for Apple's withdrawal is that they didn't want to announce products on a third party's schedule--Apple maintains its own schedule of announcements. Even your argument about viruses is false--while viruses did exist for the Mac OS, DOS and later Windows invariably saw more viruses more frequently, pretty much from the outset, because of their dominance in industry, where such disruptions created far more visible effects. The Mac now holds its largest market share ever at over 10% (some analysts claim up to 16%), and that share is still growing more than twice as fast as any competitor's and the PC market in general.

The de facto lead in smartphones used to belong to RIM and Nokia, not Apple. Apple came out with the iPhone and showed RIM that consumers wanted phones that were easy to use, not complicated, procedure-heavy devices that tried to shove an entire desktop computer into your palm. The iPhone changed the smartphone paradigm, but it has never held a majority share of that market. Android has surpassed the iPhone's growth in that market, but the logical explanation is the fact that Android phones, like the PCs before them, came out cheaper and in most cases with lower-quality hardware. Microsoft's problems are now Android's problems, and we'll see that continue for the foreseeable future. I'm not saying that Android itself is bad, but the first-hand comments I receive from Android users I personally know say that they love the price, but not the product. Up to now, Android is still a platform in the making, not a mature, self-sustaining one. Personally, I consider the quality of the average Android phone today roughly equivalent to the original iPhone, even though it has features roughly equivalent to the current iPhone 4. The problem is, as long as Androids are marketed at two-for-one or BOGO discounts, you're looking at manufacturers pushing the cheapest hardware and razor-thin margins. Apple may not be selling as many, but their quality is such that the people who do buy tend to like and respect their phones more.

Not all that many developers have 'moved' to Android. Sure, many of them develop for both iOS and Android, but they also report that they make more money off their iOS versions than they do from Android, due to piracy and the simple cheapskate outlook of most Android users. Android users aren't willing to pay as much for a given app unless it's a real blockbuster like Angry Birds. What good is selling to 40% of the market if you only make 50% of the cash that you receive from iOS's 25%? Footnote in history? No, Apple will continue as it has done--setting the paradigms that others follow. Others may do something first, but Apple has ever been the one to make it truly popular. Apple has shown, and will keep showing, how something is done right.

dcolbert

I've installed SipDroid and associated it with a free SIP service to be able to make VoIP calls over Wi-Fi, on a non-3G/4G, Wi-Fi-only tablet, using my Google Voice number. I can do the same through an alternative solution on my Windows netbook. I've investigated this functionality extensively on the iPad; if it can be done, it must require a jailbreak. As for what you describe about AirPlay - it sounds like you're locked into a modern version of AppleTalk networking. The new "messenger" app sounds like Skype or Trillian or any of a dozen IM programs... and if it is true SMS-to-phone service, those are already widely available (TextFree exists for both iOS and Android, and Google Voice offers free SMS as well) - so an Apple-branded alternative is welcome, but a Johnny-come-lately.

But at any rate, apps MATTER, but there isn't an Apple app or an Android app that is going to make or break one platform to the exclusion of the other - except on a case-by-case basis, by the personal preference of each individual user. Neither platform has a killer app that is going to make one an NES to the other's Sega Master System. Going with *that* analogy, what you've got is an Xbox 360 to a PS3... one may be slightly in the lead, but there is basic parity; each has enough of the market to survive and continue to compete, and the next round may reverse those fortunes. An Android device will let you do what you describe: DLNA and other streaming solutions exist that will achieve AirPlay-like results. Worst case, it might take a little more work, be a little less reliable, and need a lot of polish, but it will deliver a LOT more freedom - even now, for your efforts. Which may or may not be the factor you decide one way or the other based upon. I don't think it *makes* a difference. Apps, device, none of it... the market has a clearly well-defined divide that is going to maintain a lot of parity between the two for the foreseeable future. For a while now, the "it is all about the apps" argument has seemed off-target to me. It isn't all about the apps - the apps are mostly the same. I can play Angry Birds or install Dropbox or Evernote on just about ANYTHING now - and they're all mostly the same. There are LITTLE differences, but those little differences don't seem like enough to give A a big enough boost over B to leave the other in the dust. No?

dcolbert

I think the title says it all... I think the analogy continues, with Android being this era's PC of mobile computing devices (and, true to form, iOS being this era's Mac). I can't say for *certain* the long-term prospects will all shake out exactly as they did the LAST time around - but it seems possible. But you're right... maybe Apple is really CP/M and Android is DR-DOS (I know, I've changed the example/subject of the analogy suddenly... ignore that)... and it'll be something totally revolutionary that ends up being the dominant platform for these devices. Way too early to tell. Really, my whole argument in the thread was this - "Donovan's perceived limitations of iOS are more its strengths than its weaknesses" - which is what I meant when I said, "pick your poison". It is order and the Empire (Darth Vader certainly would have used an iPad) or chaos and the rebels' revolution (What would interface with your X-Wing? An Android, for sure. R2-D2 might be able to jailbreak an iPad in a pinch).

Vulpinemac

The Nokia N8's camera is good, but not just because it has twice as many pixels; the iPhone's 5MP camera beat out many of the competition's 8MP models due to a superior lens. The N8 carries a Carl Zeiss lens, which is its real advantage. But that's beside the point. The N8 is effectively the last of a dying breed--I don't believe Nokia intends to continue with its in-house smartphone OS any longer. However, Nokia has already discussed (though not officially announced, that I know of) another model that appears to be an N8 running WP7, along with at least a couple of other WP7 products. It's the OS--software--that's going to make the difference, not the camera. Besides, the article is not only about phones but also tablets. I'll grant the iPad 2's cameras are junk, comparatively speaking, but the rear-facing camera doesn't need to be better to perform the job of augmented reality. I agree with a lot of the Android fans that it simply looks ridiculous holding up a big tablet to take photographs.

dcolbert

I can guarantee you there is an Apple tax on that device. There are people who will trot out the specs on a Hyundai or a Honda or some other relatively affordable car, compare them to the specs of something like a BMW, and say, "See, you're paying for that propeller badge". They're right, to a point - but frequently they haven't driven the propeller-badged car that costs a premium over a typical car. There is a somewhat intangible quality there, beyond the prestige of the badge, that makes a BMW different from a Honda. We can break this down further. The actual PARTS of a BMW... many of them are just off-the-shelf automotive parts, distributed by a global network of automotive part makers. The argument could be, "Your BMW is made up largely of the same commodity parts as a Ford Focus or a Toyota Tundra". This is basically true. But the art is in how you put those parts together, how you balance and fine-tune the experience, the interface, and the package you build around it. Is the design elegant, efficient, oriented toward the driver? Apple users would argue that Apple gets the art of assembly and design right, bringing commodity components to a higher level in much the same way as a respected European sport sedan maker. I'm not sure I agree - but I can see the analogy.

Alan

You make some good points. On my website last year, Mac OS represented about 8 to 9% of my visitors; last month it was 13%, and today 18%. Some days I have had spikes, but today was notable, with no explanation yet. My feeling is that as the pros leave the fold, the masses will follow. So far I have been dead wrong on this, and the market has told us that. But I don't think they have ever been challenged the way Google is challenging them, and the field is changing. I put in my buy order for Apple stock @ $20.00 - anyone think I will get it? Have a great day, all.

dcolbert

The Apple II line had a brief position of dominance, mostly because Apple gave away tons of II-line machines to education and schools. They were outrageously expensive, and Atari and then Commodore successively just decimated the Apple II line. This was the 8-bit era. Apple certainly did become a footnote, except to long-term Apple devotees, during this period. Atari and Commodore were where the rivalry existed right through the 8-bit era and into the 16-bit era, when the IIgs arrived stillborn in a world of Atari ST and Commodore Amiga systems that ran circles around Apple's efforts. At that point, the Mac was a black-and-white box that looked like an overgrown Vectrex game console. There were VERY few games. MacPlaymate and Glider were the big hit games for the Mac. I'm sure Zork played really well on a Mac 512K (I'm not sure the original Mac 128K had enough memory for an involved text-only adventure). During this era, games were being coded for three platforms: the PC and the Amiga/Atari ST (well, and the 8-bits... the Commodore 8-bit line was still enormously popular; the Apple II was stagnant, including the IIgs, and Atari was nearly bankrupt and its 8-bit line had very little traction). But the one distinct disadvantage roundly perceived about Macs was that they had hardly ANY game support. This would remain largely the case well into the PowerPC era of the Mac, and really, no serious gamer buys a Mac, even today.

I agree with most of the rest, Vulpine... except: if I can deliver a phone that does 75% of what an iPhone does, but I can give you TWO for FREE every two years with a 2-year contract... the iPhone model is inevitably doomed on SCALE. They'll be a niche, again. Who cares how well built, how reliable, how long-lasting a *commodity* is? We've already seen this with countless devices, most especially WIRED phone handsets. After deregulation, you could buy an incredible-quality pushbutton handset from the Big Bells for HUNDREDS of dollars (which, adjusted for inflation, would be like paying $900 for your phone now). They were built like tanks and lasted forever. And nobody bought them. They bought far cheaper phones that delivered many of the same things at nearly the same level. Sound a little tinny? A little static? Keys a little mushy? Who cares. It was $9. If Apple is trying to compete on quality in a market that is making disposable products, they've got a long-term strategy problem - especially if Android can continue to deliver nearly the same quality, the same apps, the same features... in devices that are free or nearly free. Just my opinion on that...

greggwon

The points that you enumerate are about the OS, not the device. The "tech" things will capture certain market share - camera megapixels, physical characteristics, etc. But those are still short-term investments because of the 2-year "contract" business. It is to everyone's advantage to get a new contract and a new device every two years; if you don't, you are paying extra money to the phone company instead of paying off a new device. Regarding AirPlay vs. DLNA: AirPlay provides the best, seamless, just-works experience. Look at the Apple AirPort Express, for example. You just hook up your speakers. The Apple TV provides access to the speakers connected to your TV, or if you use the optical audio interface out the back, you can go directly to your stereo system and not involve the TV or other HDMI-connected outlet. It's those kinds of devices that are the real difference, in my view. It's all about having each of the pieces working together with a minimal investment. Both devices are $99.00 each, and with the Apple TV you, of course, get a good Netflix interface as well as video AirPlay. My friends can literally walk into the house with a movie on their device and just play it from their pocket over the wireless network. The "just works" part is a big deal for many people. Sure, others want to "play with the details" or "mess with the technology", and that's okay by me. It's what keeps things changing in the economy as new products and ideas are brought to market. But why spend big money to solve a problem that's already solved? Especially when the solution is just plain affordable.

Vulpinemac

"I think..." I admit that this is opinion, even if it is based on deduction from a number of recent historical events where were discussed elsewhere.

dcolbert

Vulpine, I wasn't going to respond to this thread - but I'll chime in. I agree with you here - and if we read back into the original article, I actually used carefully qualified language for this part of the discussion: "The iPhone 4 has maybe the best mobile camera on the market; no other handset has really delivered a complete smartphone platform with such nice camera optics." It isn't just having an incredible camera, but having a complete and compelling smartphone package behind it - including general apps *and* apps that leverage and enhance the photo-taking capabilities. The iPhone is almost inarguably superior to everything else out there - in this regard. Kind of a tricky way for me to word this, but I did it on purpose, and we already all know I pull wordplay like this in the way I state things. Good thing I'm just a hack blogger rather than a political speechwriter. ;) As you point out, this was also just intended to illustrate why someone might pick one of these two OS platforms over the other. If you're looking for camera and movie optics and you're considering iOS or Android, iOS has the advantage, in my opinion.

As you point out, the larger article is about the tablet platforms. With that said, there have been times when I had the iPad out and something presented itself, and it would have been nice to just go to the camera on the iPad and take a movie or a picture. Maybe my phone wasn't nearby, or maybe I didn't have someplace to set down the tablet to take out the phonecam. A camera augments the device, but it isn't necessary. There have been few enough times when this was a problem that it isn't a *necessary* option on a tablet - but it is still nice to have. Additionally, as you point out, for other apps - augmented reality, barcode and tag scanning, and some other default smartphone apps - a camera is kind of a prerequisite. But I really don't want the conversation to sidetrack into a passionate dissection of this one minor part of the overall thesis of this blog, either. :)

Vulpinemac

... not subassemblies like hard drives and video cards. As I said, I used to work for one of their suppliers, and I clearly remember having more than one shipment to Apple returned due to too many out-of-tolerance components. I was the one who had to calibrate the testing and production tools. We had to build a Faraday cage around the testing room and isolate ALL forms of electrical interference. The only reason Apple didn't break our contract was that nobody else was doing any better, and most were doing far worse at the time. I won't argue that subassemblies like drives, video cards, and other line-replaceable parts can't be as well controlled, but I think even you realize that Apple has a habit of punishing the manufacturer of such components by going to a competitor until that competitor fails to meet its standards. What I'm worried about now is that with the iPad and iPhone such big sellers, Apple is having to develop multiple suppliers for parts like screens, batteries, etc. This could lead to a problem similar to the one that developed with Microsoft's OS, where hardware quality begins to affect the OS's reliability.

dcolbert

My technical background starts in retail PC sales. The Mac 128 and 512 were still viable systems selling used (and not inexpensively) at the start of my career, and I've been repairing all manner of consumer PC hardware for that entire time. The idea that the components in a Mac have ever been *significantly* different from commodity PC parts is largely a myth. At one point, a long time ago, Macs had SCSI drives, which were more expensive (but not fundamentally BETTER than non-SCSI drives as far as the mechanical components were concerned) and used Trinitron tubes in their CRTs (which were arguably among the best CRT components in the industry at that time). But during the "Performa" era of pizza-box Macs, this approach was totally thrown out, and commodity PC parts became the norm for Macs. Even then, from the beginning, there was NOTHING in a Mac that you couldn't replicate with an over-the-counter *quality* commodity PC component. From the beginning you could build yourself a budget beige-box DIY PC, or you could build something FAR more expensive from vendors and manufacturers who had much tighter tolerances.

When an ingredient list says "corn syrup or sucrose", it commonly translates to, "We can use whatever we want without significantly altering the user experience". When you open a Mac and it has a Toshiba or Mitsumi optical drive, or a Fujitsu or Quantum hard drive, then you're not talking about tight tolerances... you're talking about the realities of supply chains, distribution, and costs. Worse yet, unless you get something like a bad batch of capacitors made from a stolen Japanese electrolyte formula, even most generic commodity electronic components are going to outlast the useful life of the product they're in.

Consider Intex metal-frame pools: an 18' diameter x 52" deep above-ground pool costs about $600-700 retail. A Doughboy or other hard-sided above-ground pool this size retails for several thousand, minimum. The swimming experience is about the same - and with proper care, the cheap Intex pool will give you the exact same swimming experience. And if it breaks, you can buy maybe six Intex pools for the cost of a single traditional above-ground. PCs are similar. Finely engineered vehicles with tight tolerances are one thing. On a PC, the benefit is difficult to justify when I can replace two or three cheap DVD-ROM drives for the price of one Apple-branded "SuperDrive" - and am unlikely ever to need to replace it even once during the useful life of the product. So I'm not sure what it is better called - a myth, or a scam. Maybe a little of both. There just isn't that significant a difference. Seriously - there is a reason why Lenovo and Sony are generally at the same price point as Apple products: ANYBODY can buy and build a machine using slightly more expensive components. It just isn't a good argument for Apple to suggest that Macs are somehow inherently *better* built than PCs - in my opinion.

Vulpinemac

... then this thread can be closed. Back in the mid-'90s, when I was building PCs for clients and still using a Mac Performa AIO machine myself, I looked into becoming a Mac clone builder. The response I received from Apple way back then was that I had to meet certain hardware requirements that drove my acquisition and assembly costs close to Apple's own retail pricing. Having worked for one of their suppliers previously, I fully understood their reasoning. Yes, there is a hardware/software integration factor, but it's not quite in the manner that some people believe; OS X could probably run on commodity PC hardware, but unlike Microsoft with Windows, Apple simply won't support the lower-grade components that other manufacturers select as cost-saving/profit-enhancing steps. Where most manufacturers rely on discrete components like resistors, capacitors, etc. with a ±5% tolerance from the nominal value, Apple insists on 1% tolerance or tighter. Apple was known to return an entire shipment if a random sampling had more than a few out-of-tolerance components. While I realize that Apple now relies on Foxconn for assembly and ASUS (I believe) for motherboard assembly, I'm quite certain that Apple still monitors the quality of the discrete components just as tightly as it used to. Quite obviously, the fewer variations in a system's hardware specs, the more reliable the system as a whole becomes, and the less risk of hardware/software incompatibilities. Had Microsoft insisted on high-quality hardware for Windows from the outset, it wouldn't have developed such a horrendous reputation for driver issues causing so many BSODs.
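To put rough numbers on that tolerance point, here is a toy Monte Carlo sketch in Python. The part count, values, and trial count are invented purely for illustration (they are not Apple's actual specs); it simply shows how much further a board built from ±5% parts can drift from its nominal design value than one built from ±1% parts.

    import random

    def board_drift_pct(nominal, count, tol):
        # Total value of `count` series parts, each drawn uniformly within
        # +/-tol of nominal; returns percent deviation from the ideal total.
        total = sum(nominal * random.uniform(1 - tol, 1 + tol) for _ in range(count))
        ideal = nominal * count
        return abs(total - ideal) / ideal * 100

    def worst_drift_pct(tol, nominal=100.0, count=10, trials=50_000):
        # Worst deviation seen across many simulated boards.
        return max(board_drift_pct(nominal, count, tol) for _ in range(trials))

    for tol in (0.05, 0.01):  # commodity +/-5% parts vs. tighter +/-1% parts
        print(f"+/-{tol:.0%} parts: worst board drift {worst_drift_pct(tol):.2f}% off nominal")

Typical runs put the worst ±5% board several percent off nominal, while the ±1% boards stay under about one percent - the kind of unit-to-unit variation that the incoming-inspection policy described above would be meant to police.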

dcolbert

For a typical user who is really solidly comfortable and invested in one platform or the other, dual boot or a VM probably is "good enough". The irony here is that in that case, unless you want to play around with Hackintosh solutions, if you WANT a Mac environment, you've got to have Mac hardware, because the OS is unreasonably (IMHO) locked down to run ONLY on Mac equipment. That is... I can run a Mac and run Linux and Windows as dual boots or VMs... but if I were running regular PC hardware, I could run either Windows or Linux as a native OS and run VMs or dual boots of the other... heck, I could even run Android in a VM... but I couldn't run the Mac OS, either dual-booting *or* as a VM. If we're going to get into the claim that Apple's hardware/OS integration delivers a better or unique experience, let's suggest that Apple could offer a "developer/tech professional" distribution (like Microsoft MSDN/TechNet) providing device-agnostic versions of OS X that could run in a VM or dual boot on regular PC equipment. Apple just doesn't WANT to do something like this. They want the hardware sales too. It is kind of greedy, and kind of slimy, in my opinion. It illustrates what I don't like about Apple's *business* methods.

Vulpinemac

For many years I ran both a Windows box and a Mac side by side. Even now, there are times when I think I may return to this modus operandi, but my need is so rare that it's not worth the added expense. I have one Windows-only app that I use on any kind of regular basis, and I have a way to make it run on OS X without pulling up a full VM. It meets my needs and didn't cost me a thing to make it work.

dcolbert

Vulpine, I agree with you completely here - running Windows on a Mac via Boot Camp (or as the sole OS, which some people actually do) is running on bare metal. Dual booting is a similar hassle to the other options for multi-purposing a single machine, and I find I hit limitations with any of these schemes. I've got my Mac Mini set up to dual boot into XP, but I actually have an XP machine on a KVM sharing the keyboard, mouse, and monitor with the Mac Mini, and when I need XP while I'm using the Mac Mini, invariably I just boot the XP machine. The ability to switch back and forth between both machines, with both available on the network at the same time and the resources of both shared, is a tremendous benefit. With dual booting of any kind, I have to restart the machine, go through a lengthy boot process (and sit through it to make sure I tell the PC *which* OS I want it to boot into), and then, if I find a reason to go back to the OTHER OS, repeat the process.

Now, my Mac Mini is a generation back, but I've upgraded it to 2 GB of memory and a 500 GB hard drive, so it isn't so much about resources. It is about hassle. With a dual-boot machine, if I'm ripping an ISO to MP4 format for iPod playback or burning a DVD, I really have that machine pegged for processor. I can't do OTHER productivity tasks when the machine is dedicated to that role (something I prefer to do on a Windows platform; I think the tools for these kinds of tasks are more mature and robust on Windows than on OS X). Having two physical machines, I can have one doing a "heavy lifting" task while still using the other to pass the time, write a blog, or update my social networking sites, without the concern that consuming memory and processor cycles may lead to a coaster.

Ultimately, dual booting, VMs, all of those things are great if you want to get your feet wet, if you want to play around in a scratchbox, if you've got some sort of casual need. But if you've really got a need for *both* platforms, there is simply no substitute for having two machines running the platforms you need *solely* on bare metal. I've got Joli OS and Win 7 on my S10, as I mentioned. It is frequently a headache. I'll reboot to go into Windows, get caught up on something else, look over, and it has rebooted by default back into Joli OS. Anyone who has a dual-boot environment knows exactly what I'm talking about. I wonder how many man-hours of productivity are wasted globally each year because of dual-boot machines being rebooted and going into the "wrong" OS while the user was distracted. I bet the number is *huge*. When it was time to check out Chrome OS (Chromium OS), rather than triple booting or adding a VM, I grabbed my Eee PC 701 and installed it there.

The flip side of this is that in my upstairs study I have a Win 7 desktop, a Win 7 17" laptop, an XP desktop, and a Mac Mini. Those are frequently joined by portable systems, including a Lenovo S10, an Asus TF101 Eee Pad, an Eee PC 701, and an iPad. Downstairs I have an Arctic Silver G4 PowerPC Mac, an older Dell running Ubuntu, another Windows machine, and a Windows Home Server. For most consumers, I get that casual use is all that matters - but here on TechRepublic, I expect more of us are technology professionals who are probably similar to me in their needs. If I were talking to a consumer who wanted a Mac but had some Windows needs, I'd recommend a dual-boot or VM solution in most cases. But that isn't the audience I write for here. My friends who see the tweets about my blogs and articles inevitably respond, "Why don't you ever talk ENGLISH!"

Vulpinemac

Simply put, without a VM running, you can't run both Windows and OS X on a Mac simultaneously, and Windows performance on a Mac through Boot Camp is equivalent to running on bare metal with the same specs. I run Windows 7 on my Mac right now and do a fair amount of gaming in Windows, but I choose not to do any web browsing or web work there; I rely on the fact that as yet there are few malware attacks against OS X and even fewer successful ones. Yes, I do use AV applications in both OSes, but due to the dearth of attempts against OS X, I see far less interference and slowdown on the Mac side of the system. Interestingly, my wife almost bought me a Windows laptop for my birthday to serve as a dedicated gaming machine.

dcolbert

There have been emulation schemes around forever, going back to the Z80 cards that would allow an Apple II to run CP/M software. The Mac has been playing with various ways of offering PC/Windows compatibility longer than just about anyone else. The truth of the matter is, a single dedicated machine is *always* the best solution if you have the resources for it. I've got a Mac; it loads Windows via Boot Camp. I've got PCs; they've got virtual machines on them that run Linux. I've got an Ubuntu box; it runs VMs (that run other Linux distros). If I'm serious about a platform, it gets a machine of its own where possible. The only dual-boot machine I have is my Lenovo S10, which has Win 7/Joli OS on it. Dual boot, Wubi, VMs - they all have limitations and headaches that having a dedicated machine avoids. ThinkGeek really needs to make this subject line a bumper sticker or T-shirt.

Slayer_

In the world of computers, it is possible to get yourself a car/boat/airplane/spaceship/bicycle/motorcycle/snowmobile/jetski. You just need to spend the right amount of money on the right parts. My previous computer did this perfectly, but it's time for an upgrade.

Alan

You can buy a great car or a great boat, but there are no great car-boats. Take a look at the programs you want to use and get the best machine for them. You may miss a few apps, but that is better than having them all run so-so. Just a thought.

Vulpinemac

... then building your own gaming PC is probably your best choice, followed by a high-end Windows box; that way you can use the video card you want. I'm gaming just fine on my 4-year-old iMac with Windows 7, but I am running into the older hardware's limitations, what with the new Final Cut Pro using OpenCL, which my video card doesn't handle. I do know the Mac Pro has some capability to install upgraded video cards, but it's somewhat limited as to which cards you can use. That wouldn't bother me, but I know it'd bother a hard-core gamer. (I play Champions Online, LotRO, DDR, Dragon Quest Origins, and a few others on a regular basis.) I'll grant that there are some Windows-only apps I prefer to use, but the ability to run Windows and OS X on the same machine means I don't have to run two separate machines side by side as I used to.

Slayer_

Although the Mac OS would be less than useless to me, I know you can install Windows and Linux on Mac hardware. I like my hardware to last a long time, but I also have the demands of a hardcore gamer. I don't know whether the poor graphics performance benchmarks are caused by Apple's hardware or by the Mac 3D software, but I want to know what Apple makes for that sort of consumer. For example, I'd want a computer with at least an nV580 in it.

Vulpinemac

... I have 5 Macs in my house, none newer than 4 years old, all still operating as well as they did when new with no major repairs (outside of hard drive replacements on two of them), and I'd say I've gotten my money's worth out of them. My 10-year-old white G3 iBook runs Tiger and, through "Classic," OS apps up to and including Photoshop 2.5 (released in '93, I believe?); my G4 Mac Mini serves as a DVR in my office; my 6-year-old first-gen Intel MacBook runs Snow Leopard; and my first-gen aluminum iMacs are ready to move up to Lion. As yet, I have not had to pay for any repairs on any of them. All of them are capable of multimedia processing, and the iMacs do so regularly, up to and including video creation in Final Cut. Donovan's comparison to the BMW is quite valid. Yes, obviously it is possible to have a breakdown; usually the motherboard, designed by Apple, is not at fault, and usually the breakdown is not caused by overheating of the components, even though the fan runs slowly enough to be inaudible in all but the quietest of rooms. Industrial design is more than looks--it's the functionality, the accessibility, and the air flow through the computer that make a Mac what it is. For some that's hardly worth considering, but for others it means less maintenance and less overall cost over the life of the machine.

Slayer_

And of course, where are those machines meant for multimedia processing?

Vulpinemac

I don't see Apple killing off its high-end products, but rather the more specialized ones, like its blade-style servers and the dedicated server versions of OS X. The Mac Pro still sells and has, I believe, just about the greatest built-in RAM capacity of any non-server desktop (while Windows Server may handle some 256 GB of RAM, how many hardware units can provide it?), and the Mac Mini as a server is relatively inexpensive, offering a compact and stackable alternative to huge server racks for smaller businesses. On the other hand, if you look at and listen to recent reports, Apple's hardware is making inroads into enterprise desktop computing as well as effectively competing with the BlackBerry environment for enterprise mobility with both its iPhone and iPads. As the desktop computing environment changes (and it is changing), we'll probably see Macs come into their own in the enterprise as mobility and application servers, while more traditional PC servers become live data storage devices. So "no" to professional-grade devices is a misnomer, since Apple still targets that environment; they're just approaching it from a different angle, and it appears to be working.

greggwon

The cuts in the pro line are the real indication of Apple's focus on a much larger customer base. Look around at where Apple is selling products now. The Pro line customers have not been able to sustain a large enough market for Apple to grow and profit. This is exactly the same problem that Sun Microsystems had: they thought they could sit around and sell high-end servers to big companies. That's a 100,000-customer market, while Microsoft and others were selling to a 1,000,000,000-customer market - a four-orders-of-magnitude difference in customer base, and a correspondingly enormous difference in money flow. Apple will likely release new Pro product refreshes in the next year, I think, or it will retire those products. It's just not a productive market to be in, and the total customer base is so small that the price points would be huge and probably unattractive to the customers anyway.

Alan

Yes, I was there too. I spoke at several Macworlds. I worked very closely with Apple, Adobe, and Radius on the first digital video systems, and I taught digital video editing for years at the Apple Market Center in NY. I have my second Android device and love it. I do not find a quality issue, and the apps are outstanding. Many of my iFriends are switching due to the apps, not the price. As a content owner and producer, it is difficult to make money on either platform.

As for Apple leaving its base, it is not fiction; it is fact. They have abandoned their pro graphics and video editing base in favor of the masses. They have a lot of people who still believe that if the pros use it, then we should. As more people give up the Kool-Aid and start to really compare what they have and what they want, they give up on the Apple sauce. Adobe still has the strongest professional graphics and editing software in the world, bar none. Apple tried to undermine Adobe's leadership over Flash. Yes, there are some issues there, but Apple thought it could force the entire industry to change, and it has no political capital with the rest of the industry. They tried to push the whole industry with FireWire. Remember that whole disaster? After FireWire was the established standard, Apple demanded that its patents were worth more than all the others. Apple's greed killed the standard in favor of a lesser technology, USB. USB did not have to answer to Apple, so it became the new standard. Or so the story goes. Recently I have also been using Sony Vegas Pro for editing and have had great results. There is no question that Apple has great design. There is no question that they make a great product for many people. There is a long track record of marching to their own drum, but I am not sure they can maintain leadership in anything for any length of time. They are like a kid with ADD.

CreativeCOW.net posted this article the other day by Helmut Kobler: http://library.creativecow.net/kobler_helmut/FCP-vs-Premiere-Pro/1 For those who do not want to jump over to his post, I will give you his introduction here:

"I've been a Final Cut user since 2000. I've written three "Final Cut Pro for Dummies" books (plus one about Final Cut Express). I've written fairly glowing reviews of multiple versions of Final Cut for multiple Mac magazines. But since 2010, I've been contemplating my escape from Planet Final Cut... Yes, this was before the Apple Final Cut Pro X debacle/disaster/catastrophe/suicide attempt. This was before Apple, in the wake of FCPX, arrogantly pulled Final Cut Studio 3 off the market, preventing long-time customers from buying additional seats. It was before Apple abruptly killed off Final Cut Server (without any public comment), wiping out massive amounts of money and time that trusting customers had thrown into it. Yes, well before all of Apple's recent shenanigans, I started to sense that Final Cut, along with all of Apple's professional apps and gear, was slowly being strangled to death. Here are a few of the harbingers of doom that caught my eye over recent years: Apple took nearly 2.5 years to upgrade Final Cut Studio from version 2 to 3 (and v.3 was only a moderate upgrade at that). Until then, updates had come at a much more aggressive pace. Apple cancelled the popular Shake, promising to replace it with a new tool that never came. Apple got lazy with its Logic Pro app as well, letting development creep along with an upgrade about every two years. Apple stopped updating the Pro page on its web site long ago. There hasn't been a new item posted in almost two years: http://www.apple.com/pro/ Apple took more than a year to fix a glaring Final Cut 7 bug that made its Close Gap command unreliable. To break a core Timeline feature like Close Gap and not fix it for 14 months was offensive and inexcusable. Apple cancelled its Xserve RAID, then its Xserve hardware. Apple started taking longer and longer to release Mac Pro workstations, and absolutely phoned in the latest upgrade last July. 511 days in the making, the newest Mac Pro was one of the most uninspired hardware upgrades I've ever seen from Apple. Apple pulled out of industry trade events like NAB. Multiple rumors (and confirmation of rumors) of significant layoffs in the Pro Apps division. Multiple rumors that Apple was trying to sell off its Pro Apps division. Take just a few of these and maybe they don't add up to anything. But take all of them together, and it's a real sign of Apple's low-to-non-existent priority for professional media. Yes, the writing has been on the wall for quite a while, and by 2010, I reluctantly began to read it. Late last year, I started to look at the two clear alternatives to Final Cut...."

This is just my experience that over the years has made me distrust Apple, but many others are moving that way too. Without their base, when Apple is no longer cool and just expensive, what will they sell? Bottled water?

Vulpinemac

It ensures related hardware offers the best possible experience. Look at any major audiophile component system now and it probably includes a proprietary fiber-optic or other transfer connection between same-branded components: tuners and amplifiers, receivers and playback devices, whatever. Even Samsung TVs and Samsung Blu-ray players have a proprietary software connection to make their combined use as simple (supposedly) as using the TV alone. However, as you clearly pointed out, those companies don't try to force others to adopt the in-house standard.

FireWire was an attempt to compete with USB, and for almost a decade it did very well for video transfer from a video camera to a PC; far more reliable than USB in my experience. However, with video now free of tape reliance and raw file sizes in the low gigabytes (compared to editing file sizes in the tens or even hundreds of gigabytes), USB's relative instability is less of a factor. For example: copying 1.5 hours of video from a tape-based camcorder took 1.5 hours to download to a PC. Almost invariably, the USB transfer failed about halfway through, while FireWire never failed for me. Raw transfer now takes less than 5 minutes, and if the SD card is connected through a reader, it tends to take less than one minute to transfer the same 1.5 hours of video.

Based on those changes in recording technology, FireWire's enhanced speed is no longer needed for the purpose the industry chose to adopt it for. True, FireWire also serves as a significantly faster and more reliable connection than USB for external hard drives, but with eSATA proving even faster, and with wireless power and transfer technologies emerging, I see a slow death coming for USB as a communications protocol. I'd predict that by the end of this decade, the rat's nest of cabling on and behind your desk will be obsolete and mostly unused. Based on certain recent patent applications by Apple, the Mac or its descendant may be at the forefront of that change.

My point is that whether an attempt at a standard succeeds or fails, the ethical and logical method, to me, is to let adoption be fully voluntary unless a standard is absolutely necessary for interoperability between competing platforms. The International Organization for Standardization (ISO) has the responsibility to recognize and set those standards when too many proprietary systems interfere with efficiency. To the best of my knowledge, Apple has never gone to the ISO and said, "We want our connection to be made an ISO standard." A certain other brand, however, has gone to them and said, "We want our software format to become a world standard." Apple's current HDMI connector was approved and licensed by the HDMI patent holder for Apple's use; apparently, third-party brands trying to make similar adapter cables didn't license theirs. Apple doesn't appear to be trying to force an international hardware standard, only enforcing an in-house standard for its own purposes.
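A rough back-of-the-envelope calculation shows why the bus stopped being the bottleneck (the bitrates and throughputs below are typical published figures assumed for illustration, not measurements from the scenario above). Tape capture is bound by real-time playback, while file transfer is bound only by bus speed:

\[
\text{DV tape: } 1.5\,\text{h} \times 13\,\text{GB/h} \approx 19.5\,\text{GB, captured at } 1\times \text{ playback} = 90\,\text{min}
\]
\[
\text{AVCHD file at} \approx 17\,\text{Mbit/s: } 1.5\,\text{h} \approx 11\,\text{GB};\quad
\frac{11\,\text{GB}}{35\,\text{MB/s (USB 2.0)}} \approx 5\,\text{min};\quad
\frac{11\,\text{GB}}{90\,\text{MB/s (card reader)}} \approx 2\,\text{min}
\]

Under those assumptions, the numbers line up with the experience described above: once the source is a file rather than a tape, even USB 2.0 is fast enough that FireWire's reliability advantage stops mattering.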

dcolbert

The first, obvious observation is that Microsoft's lock-in has been software, whereas Sony and Apple play a hardware game. I think your observation is easily twisted, too, regardless of where your perspective lies: Apple brings pricey, extraneous proprietary interfaces to their systems at every turn, always with the justification that their solution "enhances" the experience in some way. They never say you can't go with another platform that doesn't have these "advantages"; they simply say that if you want to play in their playground, you'll have to deal with their superior solutions that are, unfortunately, not industry standards. Of course, if everyone were smarter, they would give in, and everyone would be able to enjoy the fruits of the superior, proprietary, Apple way.

They've failed more often than not over the last 30 years when they've tried to do this: from ADB to AppleTalk to countless generations of funky little adapters required to make an industry-standard monitor, TV, or other display plug into a Mac. So has Sony. And if you want to draw an analogy from this behavior to Microsoft's attempts to do the same with software subsystems (which I don't think is unreasonable), then Microsoft has failed more often than not when THEY'VE tried this kind of thing, too.

In any case, the argument that works for, or against, works regardless of whether you're talking about Apple, Sony, or Microsoft. I'm just saying Microsoft gets a lot of heat for this, when Sony and Apple have arguably been more flagrant about it. Perhaps the difference is just that they haven't been very GOOD at it. Let's hope Apple keeps up that trend with the locked-in approach to tablet computing that we see in iOS devices. ;)

Vulpinemac

"Sony... Apple... Have constantly tried to lock users in to proprietary formats and standards - they're far worse about this than Microsoft has ever been." While I will agree that Apple and Sony try to lock users in, I don't agree that they're worse about it than Microsoft. Just look at all the allegations about Microsoft trying to force their OOXML code into becoming a global standard, not just a user choice. Just look at Microsoft's historical issue with trying to force its Internet Explorer to become the global standard web browser and getting slapped down for how it tried to enforce the standard. There's a big difference between proprietary formats within a brand and trying to force all brands into one. It's one thing to say, "If you want to play in my playground you'll play by my rules," and another to say, "I don't care who's playground it is, these are the rules you'll follow." Apple's system works and seems to work more reliably and seamlessly than any competitor to date. Yes, the competition is out there but you don't see Apple trying to force their rules on anybody else. If they want to build a similar system, Apple isn't the one saying, "You can't do that."

dcolbert

Have constantly tried to lock users in to proprietary formats and standards - they're far worse about this than Microsoft has ever been. Microsoft has just been wildly more successful with this model, primarily because Microsoft doesn't try to control formats as tightly as Apple and Sony do.

AirPlay requires buy-in to the Apple ecosystem: you've got to have an iOS device, you've got to have an Apple TV, and you've got to have iTunes. It is easy to deliver "seamless interoperability" when you control everything from front to back. You're putting a lot of faith in a company that has proven time and time again that it will abuse you as a consumer the minute you give it the capability to do so. Meanwhile, the open standards close the gap day by day; they continue to evolve and begin to deliver seamless interoperability that approaches what Apple is delivering, at a huge cost savings and with incredible flexibility in which vendors, manufacturers, and devices you build your system on. Apple's walled garden is an advantage in delivering what you're talking about, and inferior at delivering what I'm talking about.

In almost every case, with almost every technology, consumer markets have historically rejected the Apple model and embraced the opposite, and Apple has traditionally resisted and remained a niche player. The truth is, at this point they're huge, but that doesn't mean they won't end up with only a niche of the market we're talking about. This market may be less like the traditional PC market and more like an appliance market: televisions, telephones, refrigerators, water heaters. That is, there may be a WHOLE LOT of market for Apple to give up by insisting on sticking with their fairly closed, non-interoperable model.

This isn't about playing with the details or messing with the technology. This is bigger than that. It is about a level of flexibility that introduces an economy of scale and creates a competitive market that benefits the consumer. DLNA has the capability of producing this; Apple's AirPlay is far less likely to achieve that goal. I think that is where we will eventually see the consumers decide.
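To make the DLNA point concrete: the discovery step in UPnP/DLNA is an open multicast protocol that any vendor can implement, which is exactly the interoperability argument above. Here is a minimal sketch in Python (the search target, timeout, and output format are illustrative choices, not anything a particular product requires):

    import socket

    SSDP_ADDR = ("239.255.255.250", 1900)  # standard SSDP multicast group and port
    M_SEARCH = "\r\n".join([
        "M-SEARCH * HTTP/1.1",
        "HOST: 239.255.255.250:1900",
        'MAN: "ssdp:discover"',
        "MX: 2",  # devices may wait up to 2 seconds before replying
        "ST: urn:schemas-upnp-org:device:MediaRenderer:1",  # look for media renderers
        "",
        "",
    ])

    def discover(timeout=3.0):
        """Broadcast one SSDP search and print the LOCATION of each responder."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(timeout)
        sock.sendto(M_SEARCH.encode("ascii"), SSDP_ADDR)
        try:
            while True:
                data, addr = sock.recvfrom(65507)
                # Replies are HTTP-style; LOCATION points at the device's
                # description XML, which any client on the network may fetch.
                for line in data.decode("ascii", errors="replace").splitlines():
                    if line.lower().startswith("location:"):
                        print(addr[0], "->", line.split(":", 1)[1].strip())
        except socket.timeout:
            pass  # no more replies within the window
        finally:
            sock.close()

    if __name__ == "__main__":
        discover()

Contrast that with AirPlay, where the equivalent handshake is specified and controlled by a single vendor; that difference is the economy-of-scale argument in miniature.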
