
Do cloud computing and parallel processing lack use cases?

In response to a post on ZDNet by Larry Dignan, Justin James asserts that there are no killer apps for cloud computing or parallel processing, and he explains why neither has become a bigger trend.


Several weeks ago, Larry Dignan wrote a post on ZDNet entitled, "Yes folks, the cloud and parallel processing need killer apps." Normally, I tend not to pay terribly close attention to industry pundits when it comes to the programming industry, simply because I find that the "latest and greatest" these folks are really interested in writing about is not the reality for your typical developer. These pundits are a great resource if you want to know what might be mainstream five years down the road, but most developers are too busy to worry about anything other than the here and now. In this case, though, Larry really seems to have his finger on the pulse of the industry, and I wanted to expand on his thoughts. (In the interest of full disclosure, I should note that Larry Dignan is Editor in Chief of ZDNet and Editorial Director of TechRepublic.)

First and foremost, "the cloud" and parallel computing are both only about three years into being viable for mainstream use. In the Spring of 2005, the first Intel dual-core CPUs (the Smithfield models of the Pentium D) hit the market. In late 2006, the Core 2 Duo series drove the price/performance curve in a fantastic direction, delivering unbeatable dual-core performance at prices that had been considered cheap for single-core performance only a few months earlier. Cloud computing became broadly available around the same time: in the Spring of 2006, Amazon's S3 service heralded the availability of cloud resources from mainstream vendors.

AJAX as a technique had killer apps like Google Maps and Outlook Web Access to show developers that learning it would be useful. Java and .NET both had provable, obvious value that developers saw, driving their adoption. At this point, hardware vendors cannot seem to crank single-core clock speeds significantly higher without severe heat problems, which is why dual-core (and now quad-core) architectures are continuing Moore's Law on the motherboard. WAN speeds and reliability are now at the point where vendors feel that there is opportunity in cloud computing.

But as Larry asks, where are the killer apps? Heck, where are any apps, killer or not?

Games still are not making heavy use of parallel computing (although the AI in them could definitely use it, in my opinion). Graphics, video, and audio editing programs such as Photoshop and Premiere make heavy use of multithreading. Modern compilers are getting better at using parallel processing to speed build times. Of course, operating systems and network services like Web servers, database servers, and e-mail servers have always had to do a lot of multithreading. See a trend here?

While these are applications that nearly every user touches at some level, they are applications that only a small segment of elite developers work on. And those developers were writing code to take advantage of multi-core machines long before such machines became mainstream, because many of these applications were already running on high-end hardware years ago. Multimedia editors have been using SMP workstations for well over a decade. SMP servers (even x86 ones) have been in server rooms for more than a decade (in fact, I just shut down a dual-processor Pentium II server from circa 1996 or so). So the folks writing these high-end applications had motivation to write their code to be multithreaded. More importantly, that means the techniques for writing code like this are established and documented, although they are not widely known.

Cloud computing, on the other hand, is a relatively new idea, and frankly, it is going nowhere fast. Wikipedia's list of "Notable uses" of Amazon's S3 is... well... not very notable. Amazon's S3 service gets more press for being down than it does for signing big clients. The other cloud computing vendors that I looked at seemed to have equally unimpressive track records and customer bases.

So what is going on here, and what kind of killer apps could give these two trends some traction? It is a cinch to tell you why parallel computing hasn't been a bigger trend: There is simply no need for it in the vast majority of applications. Regular readers of the Programming and Development blog know how much I like to write about parallel computing, but I recognize the reality, too. Very few applications bring a CPU more than a few percentage points off of idle. If they do use a lot of CPU time, it is going to be in a database request, at which point it is out of the application developer's hands. Modern CPUs are simply too fast to justify using parallel processing in many cases.

If you do happen to see an application use multithreading, it is usually in an asynchronous pattern, like downloading an item with the option of cancelling it, and it is done for usability reasons rather than performance reasons. Another place you see multithreading in typical applications is in third-party components, like a graph-rendering component. In a nutshell, the typical business developer makes use of resources that may kill the CPU but does not directly write any code that could or should be converted to parallel processing.
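As a rough illustration of that usability-driven pattern (a minimal sketch of my own, not taken from any particular product; the class name CancellableDownload and the simulated work are hypothetical), here is what a cancellable background download might look like in Java: the work runs on a worker thread so the calling thread stays responsive, and cancellation simply interrupts it.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the usability-driven async pattern: the "download" runs on a worker
// thread so the calling thread (standing in for the UI) stays responsive, and
// the user can cancel it partway through.
public class CancellableDownload {
    public static void main(String[] args) throws Exception {
        ExecutorService worker = Executors.newSingleThreadExecutor();

        // Submit the long-running "download" (simulated here with sleeps).
        Future<?> download = worker.submit(() -> {
            try {
                for (int chunk = 1; chunk <= 10; chunk++) {
                    Thread.sleep(500);                   // pretend to fetch a chunk
                    System.out.println("Downloaded chunk " + chunk);
                }
                System.out.println("Download complete.");
            } catch (InterruptedException e) {
                System.out.println("Download cancelled by user.");
            }
        });

        // The "UI" thread is free to do other work; after a moment, the user hits Cancel.
        Thread.sleep(1800);
        download.cancel(true);                           // interrupts the worker thread

        worker.shutdown();
    }
}

Notice that nothing here is about squeezing performance out of a second core; the second thread exists purely so the user is not stuck staring at a frozen window.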

Additionally, mainstream OSs juggle requests for CPU time very sanely. I was recently doing some experimentation with parallel computation of Fibonacci sequences. I had my Core 2 Duo processor pegged at 100% on both cores. The MP3 I was playing did not exhibit any problems, and the computer was still perfectly usable. On top of that, the perception is that writing multithreaded applications is very difficult, even though it is getting easier and easier.
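For the curious, here is a rough reconstruction of that kind of experiment (my own sketch, not the original code; the class name ParallelFib and the choice of fib(45) are arbitrary): one naive, CPU-bound Fibonacci task per core is enough to peg every core, yet the OS scheduler keeps everything else running smoothly.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// One CPU-bound Fibonacci task per core: while this runs, every core sits near
// 100%, yet the rest of the desktop stays perfectly usable.
public class ParallelFib {
    // Deliberately naive recursive Fibonacci -- its only job here is to burn CPU.
    static long fib(int n) {
        return n < 2 ? n : fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // Queue one task per core so that every core has work to do.
        List<Future<Long>> results = new ArrayList<>();
        for (int i = 0; i < cores; i++) {
            results.add(pool.submit(() -> fib(45)));
        }
        for (Future<Long> result : results) {
            System.out.println("fib(45) = " + result.get());
        }
        pool.shutdown();
    }
}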

Cloud computing's lack of success is even easier to explain: trust. Do you trust that your Internet connection is 100% perfect? Much less than you trust your internal switches. Do you trust that your company's data is safe in the hands of people you have never met? Much less than you trust your DBA and your system administrator down the hall; after all, you know where they live, and your HR department ran background checks on them. Do you trust that the third-party vendor is really doing the nightly backups that you are paying for? Much less than you trust your in-house nightly backups; you see them take the tapes offsite once a week, after all. Until the cloud computing vendors build up a long-term reputation for being reliable and trustworthy, cloud computing is dead in the water.

I am not saying that cloud computing or parallel processing do not have a place. I think that cloud computing is a good idea for consumer-oriented applications that either act as a redundant copy of data you already have locally (such as using Flickr to publish photos that live on your hard drive) or provide services and store data that you do not need constant access to (like Skype). The cloud vendors do have better uptime and backup procedures than your typical consumer, and the typical consumer is less likely to have information that would be catastrophic to lose. Likewise, where you will see a lot of parallel computing is in minor functionality; think of applications that get a lot more graphical and a lot more real-time (e.g., Microsoft Photosynth). But it is very unlikely that there will be any applications that are both business-oriented and 80% or 90% cloud computed or parallel processed, or that show off either of these ideas extensively.

So, Larry, in response to your question, "Where are the killer apps?" I think the answer is: There are no killer apps for either cloud computing or parallel processing. At best, there may be killer widgets. While we may remember the handful of applications that are fully AJAXed, the reality is that most sites employing AJAX use only a widget or two where it makes sense. Likewise, while we may remember some super-neat ray tracer or an application that magically retains your data wherever you are, most applications will incorporate these techniques as a side dish, not the main entrée.

Eventually both cloud computing and parallel processing will enter the average developer's bag of tricks, but don't hold your breath.

Disclosure of Justin's industry affiliations: Justin James has a working arrangement with Microsoft to write an article for MSDN Magazine. He also has a contract with Spiceworks to write product buying guides.


Justin James is the Lead Architect for Conigent.
