Provided by: International Journal of Advanced Research in Computer Science and Software Engineering (IJARCSSE)
Date Added: Jul 2013
In recent years, IT infrastructures have grown rapidly, driven by the demand for computational power created by modern compute-intensive business and scientific applications. However, a large-scale computing infrastructure consumes enormous amounts of electrical power, leading to operational costs that exceed the cost of the infrastructure itself within a few years. For example, in 2006 the cost of electricity consumed by IT infrastructures in the US was estimated at $4.5 billion and was projected to double by 2011. Beyond overwhelming operational costs, high power consumption results in reduced system reliability and shorter device lifetimes due to overheating. Another problem is significant CO2 emissions, which contribute to the greenhouse effect. One way to reduce the power consumption of a data center is to apply virtualization technology.
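The abstract does not name a specific consolidation algorithm, but the usual mechanism by which virtualization saves power is packing VMs onto as few physical hosts as possible so that idle machines can be switched off or put into a low-power state. Below is a minimal, illustrative sketch of such consolidation using first-fit-decreasing bin packing; the function name, the single-dimensional CPU demand model, and the example workloads are all assumptions for illustration, not the paper's method.

```python
def consolidate(vms, host_capacity):
    """Place VMs onto the fewest hosts via first-fit decreasing.

    vms: list of per-VM CPU demands (fractions of one host's capacity).
    host_capacity: CPU capacity of each (assumed homogeneous) host.
    Returns a list of hosts, each a list of the VM demands placed on it.
    """
    hosts = []  # each entry: [remaining_capacity, [vm demands placed]]
    for vm in sorted(vms, reverse=True):      # consider largest demands first
        for host in hosts:
            if host[0] >= vm:                 # first host with enough room
                host[0] -= vm
                host[1].append(vm)
                break
        else:
            # no existing host fits: power on a new one (hypothetical model)
            hosts.append([host_capacity - vm, [vm]])
    return [placed for _, placed in hosts]


if __name__ == "__main__":
    demands = [0.6, 0.3, 0.5, 0.2, 0.4, 0.1]  # illustrative VM CPU loads
    placement = consolidate(demands, host_capacity=1.0)
    print(f"{len(placement)} active hosts instead of {len(demands)}:")
    for i, vms_on_host in enumerate(placement):
        print(f"  host {i}: {vms_on_host} (load {sum(vms_on_host):.1f})")
```

With the example loads above, six VMs fit on three fully utilized hosts, so the remaining machines could be powered down, which is the source of the energy savings the abstract alludes to.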