Linux

Will Linux own the cloud?

Many people have strong opinions about the efficacy of cloud computing and what it all means. The battle intensifies along the lines of proprietary vs. open source technology. Will cloud computing finally declare a winner?

Red Hat CEO Jim Whitehurst thinks so: "Clouds are fundamentally new infrastructure. Why lock yourself into something proprietary?" he says in an Internetnews.com article. Whitehurst believes that open source technology is the perfect vehicle for innovating in the cloud computing era. And in its article, "Cloud computing with Linux thin clients," IBM spells out the reasons it believes there will be "penguins in the clouds":

The best operating environment for a thin client designed around cloud computing has the following characteristics:

  • Highly customizable
  • An inexpensive or even free operating system
  • All necessary applications inexpensive or free
  • Networking built into the operating system core
  • Small enough to fit into tiny devices
  • Flexible and powerful enough to run full laptops
  • Miserly enough to conserve battery life to a maximum degree

Linux meets all of these criteria. It is taking over in the mobile space, the enterprise space, and the embedded space, including dedicated consumer devices such as book readers and set-top boxes. And with virtualization, Linux can also run applications built for the Windows®, Mac OS X, and other operating systems.

Are these just pipe dreams or is there a big shift coming? The folks at TechBlorge.com suggest that there are some high-anxiety clouds hanging over Redmond in "Microsoft worried by Linux cloud." Somehow, Microsoft is uncomfortable with statements from the Cloud Computing Interoperability Forum (CCIF) like: "The CCIF will not condone any use of a particular technology for the purposes of market dominance and or advancement of any one particular vendor, industry or agenda." Oopsie.

Of course, even the CCIF pulled out of the so-called Open Cloud Manifesto (along with big players Amazon and Google), which somewhat dampened the March 30 launch of this attempt to apply some standards to cloud computing for the purposes of interoperability of technologies. As it turns out, openness sounds like a great idea, yet many companies are still struggling with the niggling question of how to make money off it. As well, there are some pretty strong feelings in the IT community about the very idea of organizations turning over so much power to "the cloud."

Who do you think has the most to gain in the cloud computing era? Will the philosophical goals of open source finally win out and spell the demise of behemoths like Microsoft? Or will the whole thing collapse upon itself, much like the Open Cloud Manifesto?

About

Selena has been at TechRepublic since 2002. She is currently a Senior Editor with a background in technical writing, editing, and research. She edits Data Center, Linux and Open Source, Apple in the Enterprise, The Enterprise Cloud, Web Designer, and...

15 comments
Jaqui

As the others have pointed out, the infrastructure to support cloud computing for major Internet/extranet-wide use is lacking. Local networks are a different story. Apache front-ending a Tomcat Java application server is well established, and PXE booting diskless clients from any GNU/Linux system, with a full GUI supplied to the clients, is dead simple [even easier to implement than rich-client network authentication and file serving].

A year or so ago I saw a Calgary, Alberta, Canada company advertising a network system that runs up to 10 workstations off one workstation box, done with GNU/Linux. Instead of having to buy 1,000 workstations you only need to buy 100, and instead of buying 1,000 software licenses, you have Free Software. Massive cost savings. When you add in that palmtop systems can have GNU/Linux embedded as the OS to connect to the network, you gain a massive amount of capability for cloud computing with a GNU/Linux foundation.
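The PXE boot setup described above can be sketched in a few lines. One common arrangement (an assumption on my part: dnsmasq as the combined DHCP/TFTP server, with syslinux's pxelinux.0 as the boot loader; the addresses and paths are hypothetical) looks like this:

```ini
# /etc/dnsmasq.conf -- minimal PXE boot sketch (example addresses/paths)
# Hand out addresses on the LAN and point clients at a pxelinux boot image.
dhcp-range=192.168.0.50,192.168.0.150,12h
dhcp-boot=pxelinux.0
# Serve the boot files from dnsmasq's built-in TFTP server.
enable-tftp
tftp-root=/srv/tftp
```

Clients set to network-boot then pull pxelinux.0 over TFTP, and the pxelinux configuration under the tftp-root decides which kernel and root filesystem each diskless workstation loads.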

Neon Samurai

I remember reading a couple of years back about a school that put in a Linux server and then ran the old school computers as dumb clients against it. It worked great for administration and student use, and it saved replacing perfectly good hardware.

Jaqui

GNU/Linux has the capability to support the distributed environment. The BSDs do also. I'm not so sure about Mac OS, but every other Unix does, leaving Windows as the only OS definitely lacking in support for the cloud.

Deadly Ernest

If Microsoft moves to a cloud-type operation, then it will definitely lose out to Linux there, as Linux can offer the same capability to the user a lot cheaper. However, a cloud-based operation will only work if: a. it's restricted to inside the corporate LAN, to ensure proper speed and security; or b. it's restricted to casual users who have no concerns about heavy usage, need for speed, or security - people like students wanting a cheap way to use office apps to do assignments, etc. It's no good for the corporate world or for people who do NOT live in a big city with high-speed broadband access.

Slayer_

Don't forget how easy IE is to integrate into programs. It is very likely that cloud-based solutions won't directly use a browser, for security reasons, and will instead shell the browser inside another program. This of course can go either way, Windows or Linux; however, there are already hundreds of Windows solutions available that can do this, so in that sense, Windows is already ahead. Some quick examples off the top of my head: Windows Update, a no-brainer here. Xfire, which I believe uses Mozilla for its in-game browser - works slick, but only on Windows. Prolender and Probe: most won't know these programs because the company that made them was bought out and they're specialty software, but both are advanced web-based lending programs that run in an IE shell. OK, so it's 10:00 PM and I'm tired; maybe I will come up with more later. I sadly cannot think of a single one for Linux OSes. If someone's got some, please name them :).

Neon Samurai

I think it's more a function of the browser being able to present only a minimal shell. If browsers besides IE don't do it already, it should be an easy function to add. I'm a little fuzzy on what the examples are, or else I'd make some guesses to offer. Offhand, I can't name specific programs that use an HTML rendering engine behind them, but they must be out there.

Slayer_

I have already said it before, but I shall say it again: the cloud is too slow. It is too slow now, it will be too slow later, and it will always be too slow. Over a LAN it's possible, but over the WAN, absolutely not, considering 99% of the world does not have Internet speeds in excess of 100 kbps down and 30 kbps up. People expect everything to be in the cloud. Please show me the point in the near future where we can play a PS3 game over the Internet. To do this you would need a speed capable of loading a new image every single millisecond, it would need to run on a server that could maintain and output possibly millions of instances of the game, and it would need to repeatedly transfer files - possibly more like 20 GB of game files per session. Does this sound impossible yet? I say the cloud will never be the solution people imagine it to be. It just can't work.
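The bandwidth objection above is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below (illustrative link speeds only, ignoring latency and protocol overhead) shows how long a 20 GB session would take to move:

```python
# Back-of-the-envelope transfer times for the 20 GB example above.
# Figures are illustrative; real links add latency and protocol overhead.

def transfer_time_seconds(size_bytes, speed_bps):
    """Ideal time to move size_bytes over a link running at speed_bps."""
    return size_bytes * 8 / speed_bps

GB = 10**9  # decimal gigabytes, for simplicity

for label, bps in [("100 kbps dial-up-class link", 100_000),
                   ("8 Mbps ADSL", 8_000_000),
                   ("1 Gbps LAN", 1_000_000_000)]:
    hours = transfer_time_seconds(20 * GB, bps) / 3600
    print(f"20 GB over {label}: {hours:,.1f} hours")
```

At 100 kbps the transfer takes roughly 444 hours (over two weeks), while a gigabit LAN moves the same data in under three minutes - which is exactly the LAN-vs-WAN gap the comment describes.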

Saurondor

I don't think speed is an issue. Certainly a WAN is slower than a LAN, but today's WAN is way faster than yesterday's LAN. Back when we were on 300 bps, using four-letter acronyms to express a phrase, we never thought a video conference possible. Today DSL lines provide greater speeds than LANs did not so long ago. Obviously they're not as fast as today's Gigabit LANs, but they allow us to do more over a WAN than was possible a few years back. I don't envision cloud computing as "moving the datacenter to the WAN," but rather as leveraging the WAN to provide new and innovative services you can't provide with only your datacenter.

Deadly Ernest

The majority of the people in the world do NOT have high-speed access, as they're too far from exchanges, etc. Australia is one of the most advanced countries regarding the use of computers and the Internet, yet about 75% of the country is still unable to access high-speed Internet and is still using dial-up modems.

Recently a rural supplies company almost went broke because someone in the corporate headquarters of the company that bought them out had a whole new web site designed for them by a graphic artist using JavaScript and all sorts of fancy images. It looked lovely and loaded in about forty-five seconds on a 1 GB ADSL connection. Internet orders in the next quarter dropped to 5% of the sales in the previous quarter, and many regular users weren't placing orders. Someone had the brains to conduct a survey of the clients, and the universal answer was: "What the hell good to me is trying to order from a web site that takes forever to load? I've only got dial-up out here, and if it isn't up in less than a minute, I'm gone. I don't have hours to wait for the web site to load."

That failure to take into account the lack of broadband access cost them millions of dollars in lost sales - sales that they never got back, as they were for consumables that went to competitors. After the web site was redesigned to be more usable on dial-up, some clients never came back, and the fiasco cost them hundreds of thousands of dollars more in products that reached their expiry dates before they could be sold; they would have been sold at a profit except for the web site failure. Clearly, speed is a factor in many parts of second-world countries, and many in the third world.

Neon Samurai

I'm still in my old text-signature habit due to years in the BBS scene.

Slayer_

Forum signatures - there is a big one for me. I love to design signatures, but although many of the forums I visit are gaming forums for high-end gaming that require high-speed access, the odd forum I go to is designed for dial-up users. And even now, 10 or 15 years later, they are still on dial-up, and I have to strive to make my signatures no more than 20 KB in size. I have some excessively large ones though :). Monster Truck Madness 2, a game made in 1997 for 56k dial-up: 42 KB, and it's fancy. http://trevorsarchives.selfip.net/funpics/sigs/trucksigxz1xe2_small.gif High-res version, 126 KB: http://trevorsarchives.selfip.net/funpics/sigs/trucksigxz1xe2.png Image compression is very important in websites, but devs seem to have forgotten this. It's a frightening commonality: everything is getting bigger and bigger, but with no perceived increase in quality; it just requires larger computers.

Neon Samurai

I remember my first few ISPs. One provided 10 meg per user for a personal website. Even when I was doing business websites, we couldn't fill 10 meg. I can see how pages are at 3 meg each now, though. It's absurd, but Flash, unprocessed giant-resolution images, audio/video... it adds up quick.

Neon Samurai

When most were limited to dial-up speeds, websites loaded quickly, or at least at a reasonable rate unless the web developer overdid it. We now have high-speed, and websites load just as slowly as they did, because the content layered into the site consumes the extra bandwidth. What a 20 meg video file did a few years ago, we can't do with less than a 40 meg video file. Even the weight of software on hardware resources expands to consume more powerful hardware: DOS on a 486 took XX seconds to load; current OS platforms have magnitudes more powerful hardware but still take XX seconds to boot. Unless there is some serious innovation in how network-accessed applications are written, we'll just see more of the same: magnitudes more powerful hardware and networks consumed by magnitudes heavier software and content.

Neon Samurai

Used within the company network, there are some benefits to centralized web applications. The moment you extend beyond the owned network, there are just too many things wrong; security alone cancels it out. Maybe the approach will mature further and address my laundry list of concerns, but I'm not holding my breath.