The cloud is an implementation of the concept of utility computing: computing resources – processing power, networking, storage and so on – being sold as a service, through a distribution network, to be consumed by users as they need them. This concept is, in large part, already a reality today. Anyone with an Internet connection (the distribution network) can purchase computing power or storage by the hour, creating and allocating servers and storage space as the need arises. Several large-scale systems already take advantage of this elasticity to increase processing capacity while at the same time reducing costs.
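To see why paying by the hour can beat owning hardware, here is a toy cost comparison. All prices are invented round numbers for illustration only, not real provider rates:

```python
# Toy comparison of pay-per-use vs. owning a server.
# Both figures below are hypothetical, chosen only to illustrate the trade-off.

OWNED_SERVER_MONTHLY = 400.0   # assumed amortized monthly cost of a dedicated server
ON_DEMAND_HOURLY = 0.50        # assumed per-hour price of an equivalent cloud instance

def on_demand_cost(hours_used: float) -> float:
    """Monthly cost when you only pay for the hours you actually use."""
    return hours_used * ON_DEMAND_HOURLY

# A batch job that needs just 120 hours a month costs a fraction
# of the fixed price of keeping a dedicated server around:
print(on_demand_cost(120))   # 60.0, versus a fixed 400.0
```

The break-even point depends entirely on utilization, which is why the model favors spiky, occasional workloads.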

There is a direct analogy with electricity: it is centrally generated in large power plants and made available through a distribution network all the way to the outlets in users’ homes. Most people don’t have to worry about where the electricity comes from, or whether their equipment will be compatible with the electricity that is available. While there are some variations – voltage levels, outlet formats – the electrical grid is, for the most part, a standardized system that everyone can easily access and use.

Centralization

Cloud computing follows the same lines: computing power is centrally generated in large data centers, and it is likewise made available through a distribution network (the Internet) to end-users everywhere. Concentrating computing power in large data centers brings several benefits: providers can exploit economies of scale to reduce costs and pass savings on to users; a centralized data center can be easier to manage; security may also become simpler, since there are fewer points of attack to defend; finally, concentrating resource provision in the hands of a few companies can lead to specialization and to improved services and offerings.

On the other hand, it also creates a single point of failure. If all computing power came from a single data center, bringing that one data center down would be enough to bring our world to a halt. Even though most large cloud providers run redundant systems across multiple data centers, the fact remains that, as we rely more and more on the cloud, we become more vulnerable to failures. And, as I’ve mentioned several times, these data centers become much more valuable targets for anyone wanting to wreak havoc in the system.
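The value of redundancy can be put in rough numbers. The sketch below assumes data-center failures are independent (real correlated outages violate this) and uses an invented 1% failure chance per site, purely for illustration:

```python
# Back-of-the-envelope estimate of total-outage probability with redundancy.
# Assumes independent failures; the 1% per-site figure is invented, not a
# real provider statistic.

def downtime_probability(p_single_failure: float, replicas: int) -> float:
    """Probability that ALL replicated data centers are down at once."""
    return p_single_failure ** replicas

p = 0.01  # hypothetical chance one data center is unavailable at a given moment
for n in (1, 2, 3):
    print(f"{n} data center(s): P(total outage) = {downtime_probability(p, n):.0e}")
```

Each added replica multiplies the outage probability by another factor of p, which is why even modest redundancy helps so much – and also why correlated failures (a software bug pushed to every site) are the real danger.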

How dangerous is centralization? Take, for instance, the hack on GitHub that was discovered and fixed recently. GitHub offers cloud-based version control repositories and related software. A huge number of software projects all over the world, from thousands of different companies, use GitHub for their day-to-day needs. While no foul play has been detected yet, the potential repercussions of this event are much larger than they would be if each company had its own version control system, on its own servers, completely isolated.

Decentralization and microgrids

While centralization brings about a number of issues, the core ideas behind it – easy access to virtual resources, pay-per-use billing, the ability to buy, or even sell, excess computing capacity – are solid. So it is interesting to look at alternatives that keep the advantages but do away with most, or at least some, of the problems I pointed out.

The obvious solution to centralization is decentralization. For cloud computing, this means relying not on large data centers to provide computing power to end-users, but on the excess computing power available on PCs everywhere. In a sense, it is the same idea already put into practice by the SETI@home project: users have more computing capacity than they need, and they are willing to share it with others in exchange for credits, or even money. This is also the idea behind a very interesting new project called Cloud@Home, being developed by the same people. Through Cloud@Home, anyone with a PC can become a provider of “cloud” computing power through a virtualization platform. The project details can be found in a recent IEEE article, among other sources.
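The matchmaking such a platform needs – volunteers advertising spare capacity, consumers drawing on it, contributors earning credits – can be sketched in a few lines. The class and method names below are my own invention for illustration, not Cloud@Home's actual design or API:

```python
# Toy matchmaker for a volunteer "microgrid": PCs advertise spare capacity,
# consumers request CPU-hours, and contributors earn credits for what they serve.
# All names here are illustrative; the real Cloud@Home architecture differs.
from dataclasses import dataclass

@dataclass
class Volunteer:
    name: str
    spare_cpu_hours: float
    credits: float = 0.0

class Microgrid:
    def __init__(self) -> None:
        self.volunteers: list[Volunteer] = []

    def join(self, volunteer: Volunteer) -> None:
        self.volunteers.append(volunteer)

    def allocate(self, cpu_hours: float, credit_rate: float = 1.0) -> dict:
        """Greedily draw capacity from volunteers, paying credits per hour served."""
        allocation = {}
        remaining = cpu_hours
        for v in self.volunteers:
            if remaining <= 0:
                break
            take = min(v.spare_cpu_hours, remaining)
            if take > 0:
                v.spare_cpu_hours -= take
                v.credits += take * credit_rate
                allocation[v.name] = take
                remaining -= take
        if remaining > 0:
            raise RuntimeError("not enough spare capacity in the grid")
        return allocation

grid = Microgrid()
grid.join(Volunteer("alice", spare_cpu_hours=5.0))
grid.join(Volunteer("bob", spare_cpu_hours=10.0))
print(grid.allocate(8.0))   # {'alice': 5.0, 'bob': 3.0}
```

A real platform would add scheduling, checkpointing and node churn handling on top, but the economic core – metered capacity exchanged for credits – is this simple.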

The question is: can it work? There are several issues the project needs to address properly, starting with security and trust. In a completely decentralized system, it would be necessary to ensure the security of the data on each and every node, and on the overall platform, to avoid breaches. Trust would also be much harder to build: instead of hiring large companies with established reputations, users would have to rely on other users. Ultimately, the idea of computing “microgrids” could revolutionize the cloud landscape. Though large companies will always be necessary to provide large-scale power and highly reliable services, this idea has the potential to make the cloud much more accessible and widespread.
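One standard defense against untrustworthy nodes, used by SETI@home-style volunteer projects, is redundant computation: the same task is sent to several independent nodes, and a result is accepted only when enough of them agree. A minimal sketch of that quorum check (the function name and quorum size are illustrative choices, not any project's real protocol):

```python
# Quorum-based validation of results returned by untrusted volunteer nodes.
# A task is replicated to several nodes; the answer is accepted only if at
# least `quorum` nodes returned the identical result.
from collections import Counter

def validate_results(results: list, quorum: int):
    """Return the agreed-upon result, or None if no value reaches quorum."""
    if not results:
        return None
    value, count = Counter(results).most_common(1)[0]
    return value if count >= quorum else None

# One dishonest node out of three cannot push through a wrong answer:
print(validate_results([42, 42, 7], quorum=2))   # 42
print(validate_results([42, 7, 13], quorum=2))   # None – no agreement, recompute
```

Replication trades capacity for trust: every task costs several times its raw compute, which is affordable precisely because volunteered cycles are cheap.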