One of the biggest costs for any data center is energy usage. It’s possible to reduce your
carbon footprint and save money every day, not just on Earth Day, by choosing energy-efficient
servers.

Buying a server is a long-term investment. It’s not only
the cost of hardware and software that matters, but also the power cost of
keeping that server consistently available to users. Add in a fleet of
servers in a data center operating 24 hours a day, and power costs
quickly add up.

Download the full toolkit from ZDNet’s sister site TechRepublic: Cost Comparison: Calculating Server Power Usage.

For instance, one server can draw between 500 and 1,200 watts, according to Ehow.com.
If the average draw is 850 watts, multiplying by 24 hours gives
20,400 watt-hours per day, or 20.4 kilowatt-hours (kWh). Multiply that by 365 days a
year for 7,446 kWh per year. According to the US Energy Information Administration (PDF),
the average kWh cost for commercial use from January 2012 through
January 2013 was 9.83 cents. That means it would cost $731.94 to
power the aforementioned server for one year.
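
For a quick sanity check, the arithmetic above can be reproduced in a few lines of Python. This is just a sketch of the back-of-the-envelope calculation, using the 850-watt average draw and 9.83-cent rate cited above; substitute your own figures as needed.

```python
# Back-of-the-envelope annual power cost for a single server,
# using the figures cited in the article.

AVERAGE_WATTS = 850      # average draw of the server, in watts
HOURS_PER_DAY = 24       # server runs around the clock
DAYS_PER_YEAR = 365
RATE_PER_KWH = 0.0983    # average US commercial rate, $/kWh (EIA, Jan 2012-Jan 2013)

daily_kwh = AVERAGE_WATTS * HOURS_PER_DAY / 1000    # 20.4 kWh per day
annual_kwh = daily_kwh * DAYS_PER_YEAR              # 7,446 kWh per year
annual_cost = annual_kwh * RATE_PER_KWH             # about $731.94 per year

print(f"Daily energy:  {daily_kwh:.1f} kWh")
print(f"Annual energy: {annual_kwh:,.0f} kWh")
print(f"Annual cost:   ${annual_cost:,.2f}")
```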

Add in the fact that
energy costs vary around the country, with some larger metropolitan
areas and remote spots such as Hawaii costing upward of three times the
national average, and you can easily see why server energy usage is so
crucial to a company’s bottom line.
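
At three times that national average rate, the same 850-watt server would cost roughly $2,196 a year to power.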

You can use the TechRepublic
toolkit to calculate server power usage. It includes a spreadsheet that
can help provide an average baseline of what you can expect to pay in
energy costs for old/existing servers versus new servers. The list
covers many common servers currently on the market, including models
from IBM, HP, Dell, Oracle, Fujitsu, and Cisco.
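
As a rough illustration of the old-versus-new comparison that kind of spreadsheet supports (this is not the TechRepublic toolkit itself), the sketch below compares annual power costs for two hypothetical average draws; the 850- and 500-watt figures are made up for the example.

```python
# Illustrative comparison of annual power cost for an existing server
# versus a more efficient replacement. Wattages below are hypothetical.

RATE_PER_KWH = 0.0983  # $/kWh, the average commercial rate used above

def annual_power_cost(average_watts, rate_per_kwh=RATE_PER_KWH):
    """Annual cost in dollars for a server running 24/7 at the given average draw."""
    annual_kwh = average_watts * 24 * 365 / 1000
    return annual_kwh * rate_per_kwh

old_server_watts = 850   # hypothetical existing server
new_server_watts = 500   # hypothetical newer, more efficient model

savings = annual_power_cost(old_server_watts) - annual_power_cost(new_server_watts)
print(f"Estimated annual savings per server: ${savings:,.2f}")
```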

Avoid common pitfalls in determining server energy usage by downloading the TechRepublic toolkit, Cost Comparison: Calculating Server Power Usage.

This article originally appeared on ZDNet as part of its Data Centers special feature.