Power management is both a discipline and an application. Companies such as Cassatt are eager to sell automated power management products to data center managers, and these products fill an important need.

But before applying automated tools to the problem of power management, prudent data center managers should perform an energy use assessment and audit. Not only does this create a baseline for measuring future savings and Green IT benefits, it will also identify areas for immediate improvement. A data center design review, with energy usage in mind, will often reveal simple faults that waste large amounts of energy.

Most Green IT initiatives, from virtualization and consolidation to power management and data center redesign, have the primary goal of energy efficiency. By conserving energy in IT operations, we achieve all the meaningful benefits of an IT sustainability program: We decrease greenhouse gas emissions and our firm’s carbon footprint; we responsibly utilize the resources of our data center by reducing acquisition and maintenance costs; and we save on operating expenses.

Now let’s turn our attention to case studies, survey data, and industry reports to learn about Green IT initiatives and how some companies are using power management. These sources, along with a brief look at why some IT leaders are skeptical about power management, offer insight into how power management can lead to financial success and corporate responsibility.

EDS

In 2007, IT consulting company EDS demonstrated it was socially responsible by complying with the U.S. Environmental Protection Agency’s (EPA’s) ENERGY STAR specifications for reducing computer energy use. In addition to the positive message of reducing its carbon footprint and greenhouse gas emissions, EDS had another powerful incentive: The company could save about $480,000 a year. Its plan didn’t involve wholesale replacement of servers or storage devices or virtualizing and consolidating the data center. It also didn’t require the company to pipe chilled water into its facility or install solar panels on the roof. EDS simply used the existing power management capabilities of its 90,000 desktop PCs to turn off the power when idle. This “low-hanging fruit” seemed like a perfect place to gain quick dividends from an energy-efficiency project.

Yet even this basic attempt to reap energy savings ran into complications. Some applications didn’t respond well to being turned off and had problems coming back online; backup operations had to be rescheduled; and systems disappeared from management consoles, setting off alarms. EDS was forced to slow down and take a multi-phase, conservative approach to this simple project.

EDS’s experience seeking energy efficiency tells us a couple of things. First, applying the most basic Green IT tactics, such as turning off the lights, can reap significant green rewards and cost savings. And, second, even the simplest tactics have complications.

Cassatt survey

Cassatt surveyed 215 IT and facilities personnel in late 2007 and early 2008 to assess their practices and opinions. While more than 40% of CIOs surveyed have an energy-efficiency mission in their organization, only 18% follow EDS’s path and turn off PCs. Less than a quarter of the respondents said they use power management on their servers. And more than 40% said that under no circumstances would they consider turning off their servers — no matter what the energy benefits.

EPA

In an EPA study of data center power usage, the organization outlined three improvement scenarios for data center managers to follow:

  • In its improved operations scenario, applying power-saving techniques such as turning off idle computers, the EPA estimates a 20-30% reduction in energy growth trends is feasible.
  • In its best practice scenario, the EPA suggests that by adding a few more sophisticated techniques, such as replacing both computer gear and power and thermal equipment with new, energy-efficient models, the potential savings jump to 45-70%.
  • In its state of the art scenario, which includes applying every proven energy-saving component and practice (such as aggressive server consolidation, completely automated power management, and next-generation cooling and power-distribution gear), the EPA notes that organizations could achieve up to 80% savings.

Corporate Executive Board

In the Corporate Executive Board’s report Green IT Initiatives, the board suggests the bulk of savings in the data center accrue from fixing obvious flaws, which include the following:

  • floor layout, with ventilating cutouts pointed in the wrong direction or blocked by equipment and cables;
  • inefficient lighting that wastes power and heats the room; and
  • uncoordinated cooling, with cooling vents often pointed in the wrong direction.

This report, by an independent organization serving the interests of CIOs, provides a clear path to energy efficiency both in the data center and on the desktop. By renovating existing facilities to fix the flaws mentioned above, and by introducing new computing and power-thermal gear with high-efficiency ratings as they become available, the report concludes that large gains in energy efficiency can be realized. The power-off techniques adopted by EDS on the desktop are also an important component of the Corporate Executive Board’s proposals.

Pacific Gas and Electric

Another interesting report is from Pacific Gas and Electric Company (PG&E) of California entitled High Performance Data Centers. It may seem surprising to see electric utilities guiding customers to use less of their product, but anyone who has observed the blackouts in California in recent years, or heard the debates about building coal-fired or nuclear power plants in any jurisdiction, understands why PG&E is motivated to help customers go green. In fact, 24 utilities from around the United States have created a group called the IT Energy Efficiency Coalition and have developed innovative programs to help IT become more efficient. Seattle City Light is offering a rebate to customers who install power management software, and BC Hydro is offering incentives to clients who consolidate servers.

PG&E’s data center guide offers technical advice and case studies in 10 categories, some of which may seem only tangentially connected to energy efficiency, such as Air Management and Humidification Controls; yet, PG&E’s document makes obvious the connection between these strategies and the greening of IT. In its chapter on Air Management, PG&E demonstrates that the adoption of better data center design practices, such as racking servers in a hot-aisle, cold-aisle configuration, can save up to 60% on cooling costs. The use of free, outside air to cool data centers is a topic of great interest, and PG&E’s report indicates that the use of chilled air collected from the outside atmosphere, and water chilled by outside air and circulated through the data center, can reduce cooling costs by 70%. The report notes that, even in warm climates such as San Jose, nighttime chill and cooler days would enable outside “free cooling” about 35% of the time.

For IT executives and data center managers who plan to embark on an energy-efficiency program, the Corporate Executive Board report and the PG&E design guidelines are foundation documents.

The Green Grid

The drive towards sustainable IT has encouraged the creation of metrics that claim to quantify energy usage and apply objective math to the measurement of data center efficiency. The Green Grid, a consortium of IT industry experts whose stated mission is to “develop standards to measure data center efficiency,” has presented a series of proposals for IT facilities power measurement. The Green Grid proposes two key metrics for data center efficiency: Power Usage Effectiveness (PUE) and Data Center Efficiency (DCE).

PUE is defined as:

PUE = Total Facility Power / IT Equipment Power

Total Facility Power measures the energy load of all the facilities and equipment that support the computing gear in the data center. IT Equipment Power measures only the direct load associated with computer equipment, including attached network, storage, and print devices. This formula is designed to guide data center designers and managers towards high-efficiency computing resources that require lighter support equipment.

A PUE of 1 would indicate complete energy efficiency, with all power going only to computing equipment, while a PUE above 3 would indicate room for improvement. The Green Grid presents PUE and DCE, which reports the percentage of power going to computing gear, as the key metrics for Green IT initiatives that want to quantify their gains.
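The two metrics are straightforward to compute from measured loads. The sketch below follows the definitions above, treating DCE as the reciprocal of PUE expressed as a percentage; the meter readings are hypothetical values for illustration.

```python
# Sketch of The Green Grid's data center metrics as described above.
# PUE = Total Facility Power / IT Equipment Power
# DCE = percentage of total facility power that reaches IT gear
#       (the reciprocal of PUE, expressed as a percentage).

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness; 1.0 means all power reaches IT gear."""
    return total_facility_kw / it_equipment_kw

def dce(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Data Center Efficiency, as a percentage of total facility power."""
    return 100.0 * it_equipment_kw / total_facility_kw

# Hypothetical meter readings (illustrative values only):
total_kw, it_kw = 1_500.0, 600.0
print(f"PUE: {pue(total_kw, it_kw):.2f}")   # 2.50
print(f"DCE: {dce(total_kw, it_kw):.1f}%")  # 40.0%
```

In this illustration, for every watt of useful computing, the facility draws another watt and a half for cooling, power distribution, and lighting, which is exactly the kind of overhead the metric is designed to expose.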

Even the area of Green IT metrics has generated some controversy. The Uptime Institute, another influential organization focused on efficiency, offers its own set of metrics. The argument over metrics touches on questions such as where the load is measured and how it relates to productivity or computing power. The contributions of both groups help data center managers navigate the drive towards greening the data center.

The EPA and the Corporate Executive Board agree that most data centers are still a long way from needing to measure consumption accurately by the kilowatt-hour and are more likely to benefit greatly from incremental improvements.

Skepticism about power management

It’s hard to blame IT executives for being skeptical about power management. The difficulties EDS experienced with desktops are magnified in the data center. Applications and processes are running in the background, and sophisticated data integrity and fault-tolerance procedures are taking place. For most large organizations, especially transaction-oriented businesses such as banks, there are no off-hours for servers. The idea of an unattended software agent randomly flipping the switch on their critical servers sends chills down some CIOs’ spines.

In IT, there are still fixed ideas about powering servers off that impede implementation of even basic power management. Many IT managers believe that powering servers off and on reduces server or disk reliability and longevity, even though IBM and Hewlett-Packard rate their servers for 40,000 on-off cycles — many more than are likely in a five-year duty cycle. Some IT leaders worry about application availability and may not know that modern power-management software is application aware. Departments that “own” their application servers often resist active power management for many of the same reasons these users sometimes resist virtualization: They want to retain traditional control of their servers, and they don’t trust automated operations that threaten to disturb their unlimited application access.

Follow a disciplined path

Don’t let skepticism get in the way of at least exploring power management techniques and Green IT initiatives.

Automated power management products, such as Cassatt’s Active Response, can be a key enabler for controlling the power and thermal expenditures in your data center. Before you look into these products, the first step is to understand your current situation. By tackling the fundamental issues, such as data center rack placement, cooling effectiveness, and tile layout, IT managers can gain “quick hits” that create the positive enthusiasm to move to the next level of sustainable IT. The follow-up steps, such as integrating outside air cooling or using alternative sources of energy, require more consensus and investment, and should be undertaken once the “low-hanging fruit” efforts demonstrate results.

By following a disciplined path as laid out by PG&E and the Corporate Executive Board, and by measuring results using metrics offered by The Green Grid, IT leaders can make the greening of IT using power management techniques an exercise in both financial success and corporate responsibility.
