Open Source

Open computing is dead (long live open computing!)

What went wrong on the journey to the open future, and does open have a place in the future of computing?

Once upon a time there was something called the "Open Source Movement," whose proponents envisioned a utopian future. The story went that consumers and businesses would choose everything from their desktop operating system to their network routing software from a plethora of freely available options. They might even tweak one of these tools to make a minor improvement or adapt it to their particular environment, and that tweak would then be shared, whether out of the kindness of their hearts or under one of the high-minded distribution licenses that accompany open source code. Our computing future would be ruled by gifted and benevolent software developers rather than greedy corporations, and our various open source applications would happily interact with each other over open networks, using open standards.

A funny thing happened on the way to the open computing future, however: vendors and consumers alike voted with their wallets for closed systems, in many cases seemingly adopting a stance of "the more closed the better." Despite repeated attacks, Microsoft remains the dominant computing platform in the enterprise space, reigning over desktops and large swaths of the data center. On the increasingly important mobile front, the industry leader Apple is a poster child for closed systems, controlling hardware, operating system, and application distribution with an iron fist. Android, which was supposed to save us from the bane of closed systems, counts a closed device as its most popular tablet: the Amazon Kindle Fire, a heavily modified and buttoned-down Android tablet designed to sell Amazon content rather than showcase open standards. So, what went wrong on the journey to the open future, and does open have a place in the future of computing?

Open computing's stack problem

Technically minded folks love talking about "stacks" when discussing computing architectures, referring to the components, from low level to high level, that together deliver an application or computing service: the network protocols an application uses, the corresponding development tools, the application itself, and its associated data. Open computing has traditionally had great success in the supporting roles of most application stacks. We'd be lost without the TCP/IP network protocol that underlies the Internet and the vast majority of corporate networks, just as the web would be a shadow of itself without open source projects like Apache and MySQL providing its web and database servers. These supporting applications, while critical to the overall service, remain anonymous background players. An average user might interact with dozens of Apache and MySQL servers in an afternoon's work, yet would likely return a blank stare if asked what they knew about either.
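To make the "background player" point concrete, here is a minimal sketch, in Python, of the layers a single web request passes through. The host name is purely illustrative, and the sketch assumes nothing about the site beyond a standard HTTP server listening on port 80; the open pieces (TCP/IP at the bottom, HTTP above it, and frequently an Apache server at the far end) do all the work without the user ever seeing them.

    # Sketch: the open layers beneath an ordinary web request.
    import socket

    HOST = "example.com"  # illustrative host; any standard web server will do
    PORT = 80

    # Open a plain TCP/IP connection -- the open protocol layer.
    with socket.create_connection((HOST, PORT)) as sock:
        # Speak HTTP, an open application protocol, over that connection.
        request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
        sock.sendall(request.encode("ascii"))

        # Read whatever the server's stack sends back.
        response = b""
        while chunk := sock.recv(4096):
            response += chunk

    # Print just the response headers; on many sites the "Server:"
    # line quietly names the open source software doing the serving.
    print(response.split(b"\r\n\r\n")[0].decode("ascii", errors="replace"))

Run this against almost any site and the headers that come back will often credit Apache or another open source server, which is about as close as most users ever get to noticing the open code underneath.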

Similarly, on the enterprise front, Linux-based code is likely present in many of the devices that adorn the server racks of most data centers, yet the underlying open source elements are buried in the depths of the application stack. Network admins and purchasing departments care little about the open source code that makes these devices possible, crediting the name on the shiny box with the underlying innovation.

At the end of the day, open computing has never been able to create a successful, integrated computing environment. The best open operating systems, like Ubuntu, are speedy and pretty, and come with a raft of applications that easily bests the offerings from Microsoft or Apple. Once you get beyond the impressive checklist of features, however, it quickly becomes apparent that Ubuntu is a collection of bundled components rather than an integrated computing experience, a problem that is difficult to resolve when different teams or organizations are responsible for each component.

Will the open computing utopia ever exist, and what does this mean for CIOs?

The annals of technology history are littered with billion-dollar companies that were unable to integrate hardware, software, and applications internally, a failure that eventually led to their demise. Integration at this level is no easy task, and one at which a movement designed to be independent and leaderless will likely never succeed. This doesn't mean we should write off open source; some of the greatest computing innovations have originated from the movement and, at its finest, open source has provided unparalleled building blocks for modern applications at commodity or even zero cost.

Highly integrated "experiential" devices like mobile phones, desktops, and tablets, however, are not opportunities for open source to shine. Technology pundits love throwing stones at "closed" environments and "walled gardens," but in most cases these environments are the most tightly integrated and functional. Rather than joining a borderline religious debate about open versus closed computing, evaluate available technologies in light of how well they solve your business problem.


Patrick Gray works for a global Fortune 500 consulting and IT services company and is the author of Breakthrough IT: Supercharging Organizational Value through Technology as well as the companion e-book The Breakthrough CIO's Companion. He has spent ...
