Thoran Rodrigues explains how cloud spot markets work to potentially provide cost optimizations for those purchasing computing power.
One of the great premises behind the cloud computing movement is the idea of optimizing the utilization of computing resources. In theory, since many users draw on large shared pools of computing resources over time, those pools can be allocated as efficiently as possible, avoiding any sort of underutilization. In practice, however, this rarely happens. Many cloud services, especially at the infrastructure layer, are purchased on a "per month" basis, and many users still think of a cloud server simply as a "server in the cloud": a server that happens to be located outside their office, but is still up and running 24x7.
If cloud computing truly brings about the commoditization of computing resources, these resources should behave like any other commodity, and be regulated by the free market, that is, by the laws of supply and demand. If demand rises but supply remains constant, the cost of the resource should go up; if demand drops, then prices should fall accordingly. This is not what happens today. In most cases, when you hire a cloud server, that server will be available to you whenever you need it, for the price defined in your contract, regardless of how many other servers are up and running.
This is where spot markets come in. In a spot market, instead of spinning up a server and paying a fixed per-hour amount for all the hours that the server stays up, you make a bid for a certain number of hours of a certain type of server. If someone is selling server time for a price that is less than what you are willing to pay, you get the server time you wanted and the other party gets paid. If not, you may have to adjust your bid or go without the server.
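The bidding mechanics described above can be sketched in a few lines of code. This is a simplified illustration, not any provider's actual matching algorithm; the prices and the `match_bid` function are hypothetical, and it fills a buyer's bid from the cheapest offers first, taking only offers priced at or below the bid.

```python
def match_bid(bid_price, hours_wanted, asks):
    """Fill a bid from the cheapest asks first, taking only asks priced
    at or below the bid. Returns (hours_filled, total_cost)."""
    hours_filled = 0
    total_cost = 0.0
    for ask_price, ask_hours in sorted(asks):
        if ask_price > bid_price or hours_filled >= hours_wanted:
            break
        take = min(ask_hours, hours_wanted - hours_filled)
        hours_filled += take
        total_cost += take * ask_price
    return hours_filled, round(total_cost, 2)

# Example: willing to pay up to $0.10/hour for 10 hours of server time.
asks = [(0.08, 4), (0.12, 6), (0.06, 3)]  # (price per hour, hours offered)
print(match_bid(0.10, 10, asks))  # → (7, 0.5)
```

Note that the bid is only partially filled here: the remaining offer is priced above the buyer's limit, which is exactly the "adjust your bid or go without the server" outcome described above.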
Spot markets bring the power of free markets to the cloud. They allow those wanting to purchase computing power to define and control exactly how much they are willing to pay, and to avoid overpaying. They also allow cost optimizations such as running intensive computing tasks during low-cost periods. On the supply side of the equation, they allow computing resource providers to reduce excess capacity, since the lower prices make it more probable that buyers will be interested.
The most famous cloud spot market today is Amazon's Spot Instances, where users set the maximum amount they are willing to pay per server hour, and Amazon gives them an instance for as long as the going spot price stays below that limit. A quick look at the pricing table at the link above shows that spot prices are much lower than the equivalent on-demand instance prices. Prices are also constantly changing, updated roughly every five minutes based on available supply and current demand. Since Amazon sells excess EC2 capacity as Spot Instances, both supply and demand can vary significantly over time, and effective cost optimization requires some monitoring of prices over time.
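That price monitoring might look something like the sketch below. This is a hedged illustration, not Amazon's API: the `SpotPriceMonitor` class, its thresholds, and the sample prices are all hypothetical. It keeps a rolling window of recent price samples and only signals "run" when the current price is both under the user's bid and no higher than the recent average, a simple way to target low-cost periods.

```python
from collections import deque

class SpotPriceMonitor:
    def __init__(self, max_bid, window=12):
        # e.g. 12 samples at five-minute intervals = one hour of history
        self.max_bid = max_bid
        self.history = deque(maxlen=window)

    def observe(self, price):
        """Record a new price sample (e.g. polled every five minutes)."""
        self.history.append(price)

    def should_run(self):
        """Run only when the latest price is under our bid and no higher
        than the rolling average -- a simple low-cost-period heuristic."""
        if not self.history:
            return False
        current = self.history[-1]
        average = sum(self.history) / len(self.history)
        return current <= self.max_bid and current <= average

monitor = SpotPriceMonitor(max_bid=0.10)
for price in [0.09, 0.11, 0.08]:
    monitor.observe(price)
print(monitor.should_run())  # → True (0.08 is under the bid and the average)
```

In a real deployment the observed prices would come from the provider's price feed rather than a hard-coded list, and the decision logic would likely be more sophisticated, but the principle is the same: watch the market and buy when it is cheap.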
If they offer all these benefits, why are spot markets not the norm in the cloud world? The first reason is that applications need to be architected so that instances can be started and stopped at any given time, instead of relying on an always-running server. In fact, there may well be several applications that simply cannot be converted to an architecture of this kind, meaning that we'll always have both spot markets and regular cloud servers. Another reason is that this model is a major departure from the way everyone is used to buying computing resources, from servers to software. Convincing people of the benefits of the cloud is already hard enough without the added obstacle of a completely different purchasing model.
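The architectural requirement mentioned above usually boils down to checkpointing: saving progress often enough that a replacement instance can pick up where a reclaimed one left off. Here is a minimal sketch of that pattern; the file name, job structure, and `process` function are all hypothetical.

```python
import json
import os

CHECKPOINT = "progress.json"  # hypothetical checkpoint file

def load_checkpoint():
    """Return the index of the next item to process (0 on a fresh start)."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_item"]
    return 0

def save_checkpoint(next_item):
    with open(CHECKPOINT, "w") as f:
        json.dump({"next_item": next_item}, f)

def process(item):
    return item * item  # stand-in for the real work

def run(items):
    # Resume from wherever the last instance stopped.
    start = load_checkpoint()
    for i in range(start, len(items)):
        process(items[i])
        save_checkpoint(i + 1)  # survives a shutdown at any point

run(list(range(5)))
print(load_checkpoint())  # → 5
```

An application structured this way can lose its spot instance mid-run and simply resume on the next cheap instance; an application that keeps all of its state in memory on one always-running server cannot.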
There are still several obstacles to the widespread implementation of spot markets. While you can purchase most Amazon instances on Amazon's Spot Market, you can't purchase instances from any other providers, even if they happen to be cheaper at a given time. Interoperability between computing architectures is an obvious problem. The quality of the different providers is another. Even though a provider may be cheaper, I may be unwilling to move my applications to it because of quality of service or security concerns. A proper marketplace would have to take all this into account.
The presence of these obstacles means that, in the short term, we will see the appearance of several small markets, focused on single providers or on small groups that use similar technologies. As the technology matures and these markets evolve, however, larger markets should appear. There are already some moves in this direction today. OpenStack, for instance, aims to be a universal cloud operating system, so that applications could be moved between multiple providers as long as they were all using the platform. Other providers, using VMware-based technology, are already allowing users to upload and use their own virtual machine images on cloud servers. This could also allow for the easy migration of applications across providers.
I believe that, in spite of the difficulties existing today, the idea behind spot markets is the future of cloud computing. While there are applications that will always need dedicated servers that are always on, there is no reason why the price of even these "always-on servers" shouldn't be dictated by the market. A free market structure would allow the appearance of large "cloud computing marketplaces", where computing resources could be bought and sold freely by any user or provider, and where prices would be determined by supply and demand.