Did you know that a smartphone uses
more electricity than a refrigerator? It does, according to The Breakthrough Institute. The average refrigerator uses approximately 320 kWh per year; the average smartphone consumes 388 kWh per year when everything needed to
make a smartphone work—the wireless connection, data usage, and battery
charging—is considered.

This is how Max Luke, a policy associate at The Breakthrough Institute, calculated the energy usage of an iPhone:

“Last year the average iPhone customer used 1.58 GB of data a month, which times 12 is 19 GB per year. The most recent data put out by ATKearney for mobile industry association GSMA (p. 69) says that each GB requires 19 kWh. That means the average iPhone uses (19 kWh X 19 GB) 361 kWh of electricity per year. In addition, ATKearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone is 3.5 kWh, raising the total to 388 kWh per year. EPA’s Energy Star shows refrigerators with efficiency as low as 322 kWh annually.”
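Luke’s numbers are easy to sanity-check. The short Python sketch below repeats the arithmetic using only the figures he cites; nothing else is assumed.

```python
# Back-of-the-envelope check of the iPhone estimate quoted above,
# using only the figures cited (data per month, kWh per GB, connection, charging).
data_per_month_gb = 1.58        # average monthly iPhone data use
energy_per_gb_kwh = 19.0        # ATKearney/GSMA estimate of energy per GB
connection_kwh = 23.4           # annual energy attributed to each connection
charging_kwh = 3.5              # annual energy to charge the handset

annual_data_gb = round(data_per_month_gb * 12)           # ~19 GB per year
network_kwh = annual_data_gb * energy_per_gb_kwh          # 19 x 19 = 361 kWh
total_kwh = network_kwh + connection_kwh + charging_kwh   # ~388 kWh per year

print(f"{network_kwh:.0f} kWh for data, {total_kwh:.0f} kWh total per year")
# Output: 361 kWh for data, 388 kWh total per year
```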

When it comes to power usage, smartphones are just one piece of the puzzle that is the Information and Communications Technology (ICT) ecosystem (Figure A).

“ICT is
a more specific term than Information Technology stressing the role of unified
communications and the integration of telecommunications (telephone lines and
wireless signals), computers as well as necessary enterprise software,
middleware, storage, and audio-visual systems, which enable users to access,
store, transmit, and manipulate information.”

In this Digital Power Group report (PDF),
author Mark P. Mills states that the amount of electricity used by the ICT
ecosystem will soon surpass all other usage types (Figure A). The blue section of the graph represents the ICT ecosystem. Mills associated the increase in energy demand
with the dramatic surge in
Internet traffic (i.e., the amount of traffic traversing the Internet in one hour today equals the entire amount of Internet traffic from the year 2000).

Figure A

While smartphones are a small cog in the ICT ecosystem, data centers are a major component, and a significant line item on the ICT ecosystem’s annual electricity bill. Figure B, which is also from the Digital Power Group report, graphs annual data-center electricity use in terawatt-hours (TWh) along with projections for the next 10 years. More importantly, the graph reflects the same steep upward trend in electricity demand as the ICT ecosystem displays in the Global Electrical Demand graph above.

Figure B

Mills added that, in the next decade, the amount of electricity used by the world’s data centers will approach 1,000 TWh, more electricity than Germany and Japan, two heavily industrialized countries, use combined.

Rating data center efficiency

I know what my home electricity bill runs per month, but extrapolating from that to a data center’s monthly electric bill seems impossible. Judging by the way my friend who manages a data center frets over his electricity bill, I am not sure I want to know.

Data center bills are always going up—new servers and ancillary equipment are added all the time.
So, it is an ongoing battle: new servers use electricity and increase the heat
load, which means more air conditioning, and more air conditioning means using even
more electricity.

My friend and other data-center managers need a way to judge how their data centers are doing in terms of electricity usage. Currently, the standard tool is a metric introduced by The Green Grid in 2007 called Power Usage Effectiveness (PUE). The Green Grid defines PUE as the ratio of electricity used by the total facility to electricity used by the IT equipment (Figure C).

Figure C

The ideal PUE value would be 1.0, meaning every watt entering the facility powers the IT equipment. My friend will not tell me the PUE for his data center, but The Green Grid in its 2012 report stated that Lawrence Berkeley National Laboratory measured 22 data centers and found PUE values ranging from 1.3 to 3.0.
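The ratio itself is simple arithmetic. Here is a minimal Python sketch of the PUE calculation; the annual kWh readings are made up purely to illustrate the range The Green Grid reported.

```python
# PUE = total facility energy / IT equipment energy (The Green Grid definition).
# The annual kWh figures below are invented solely to illustrate the ratio.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

print(pue(1_300_000, 1_000_000))  # 1.3 -- best of the 22 LBNL-measured centers
print(pue(3_000_000, 1_000_000))  # 3.0 -- worst: 2 kWh of overhead per kWh of IT load
print(pue(1_000_000, 1_000_000))  # 1.0 -- the ideal: every kWh goes to the IT equipment
```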

Real-time PUE

A growing number of efficiency experts are concerned that PUE, as currently calculated, is an average derived over some set calendar period, and averaging makes it difficult to spot problems and inefficiencies. I asked my friend about this. He told me some companies track PUE in real time, but most do not; doing so requires a significant financial investment in the monitoring systems needed for real-time PUE.
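To illustrate why an averaged PUE can hide trouble, here is a small sketch with hypothetical power-meter samples: the per-sample PUE exposes a cooling spike that the period average smooths away.

```python
# Hypothetical facility and IT power samples (kW) from a monitoring system.
samples = [
    (1200.0, 1000.0),   # overnight: chillers mostly idle, PUE 1.20
    (1300.0, 1000.0),   # morning, PUE 1.30
    (1800.0, 1000.0),   # afternoon cooling spike, PUE 1.80
    (1250.0, 1000.0),   # evening, PUE 1.25
]

instantaneous_pue = [facility / it for facility, it in samples]
period_pue = sum(f for f, _ in samples) / sum(it for _, it in samples)

print([round(p, 2) for p in instantaneous_pue])  # [1.2, 1.3, 1.8, 1.25]
print(round(period_pue, 2))                      # 1.39 -- the 1.8 spike vanishes into the average
```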

Companies like Facebook and Google (see Google’s PUE summary) with new data centers and the requisite money do track real-time PUE. In fact, Facebook posts the real-time PUE for its Prineville, OR data center on the Internet (Figure D).

Figure D

I keep comparing data-center power usage to what I know about my house: its heating, its cooling, and ultimately the electricity bill. Getting down to a PUE of 1.1 in a cavernous 330,000-square-foot data center seems amazing to me.

Do you measure your data center’s power usage? If so, how do you measure it? If you use PUE, do you calculate real-time PUE? Let us know in the discussion.