The story of how one data center balances customer needs, efficiency, and staying in business.
I have a great job. I get to visit data centers and talk to the smart people who run them. The data centers that intrigue me the most are those newly commissioned or just completing a major renovation. These facilities, more than likely, incorporate new energy-saving and efficiency-boosting technology.
The newly renovated OneNeck IT Solutions data center in Eden Prairie, Minnesota, is a good example. Adding 6,000 sq ft of raised-floor space, plus hints of new technology such as flywheel UPS systems, enticed me to visit the facility. This data center is one of nine owned by OneNeck: four more in the Midwest, two in Arizona, one in Oregon, and one under construction in Colorado.
Before going in, I drove around the complex. It is a nondescript off-white precast-concrete structure, not that different from the surrounding buildings. Turning the corner near the back of the building, I found several massive cooling towers, three equally massive back-up generator modules with the fixings for a fourth, and four olive-drab power-transformer assemblies -- all giving notice that this is a Tier III data center.
Cindy Klund, Marketing Programs Manager, met me at the door, and like all good Minnesotans we discussed the potential for an early snow the coming weekend. A few minutes later, a self-assured gentleman with the look of experience introduced himself as Hank Koch, Vice-President, Facility -- which translates into the one in charge. Koch starts by discussing the particulars. The just-added 6,000 sq ft brings the raised-floor area to 18,000 sq ft. The data center has a critical-load capacity of 3.8 megawatts and a total-load capacity of 7.0 megawatts. Fire detection is handled by VESDA (Very Early Smoke Detection Apparatus); fire suppression uses gaseous agents under the floor and dry-pipe pre-action sprinklers overhead. All OneNeck data centers are SSAE 16 audited and meet PCI DSS and HIPAA requirements.
The data center is design-certified by the Uptime Institute. I ask whether the construction is also certified, and Koch replies that construction is not officially complete. Once it is, OneNeck may certify the construction with the Uptime Institute. However, he adds, OneNeck prefers the construction to be certified by an independent engineering source -- specifically, one certified as a professional commissioner by the Building Commissioning Association.
Next, I ask Koch about uptime guarantees. Without hesitating, Koch says, "OneNeck guarantees 100% uptime; anything less and the customer gets refunded." Koch senses I am not satisfied with the answer. In explanation, he adds, "The emphasis on 100% uptime is important to maintaining a good reputation and not having to refund customers, but there are other considerations as important if not more so. This facility's client list includes several government agencies and healthcare providers. Those organizations expect uninterrupted availability. OneNeck's commitment is more than enough incentive to go the extra mile and have the data center capable of providing services when they are needed the most -- emergencies."
Services provided by OneNeck
This data center offers what Koch calls Five Pillars of Service: Cloud and Hosting Solutions, Managed Services, ERP Application Management, Professional Services, and IT Hardware Resale.
The services are fairly standard throughout the industry, though ERP Application Management and IT Hardware Resale stand out. OneNeck data centers offer in-house applications such as Oracle ERP, set up and ready for customers to use.
Another advantage for clients using OneNeck's applications is that all OneNeck data centers are connected with 10Gb fiber in a self-healing mesh network. Companies with remote offices can reach the application via the 10Gb network, their network, or via internet VPN connections.
The fact that OneNeck resells hardware and software is interesting as well. If a company would rather not deal with equipment setup or lacks in-house expertise, OneNeck can step in and make it happen.
The tour begins
As we begin the tour, I ask Koch about the Department of Energy's Better Buildings Challenge. With a "been there done that" look, Koch tells me environmental concerns and business concerns -- when it comes to data centers -- are one and the same. Companies managing data centers are leveraging every possible way to use less electricity, simply because it is good business sense. Koch adds one way OneNeck increases efficiency is to undertake an extensive environmental study of each data center, giving special consideration to where the data center is located. For example, the data center in Eden Prairie, MN has different cooling needs than the OneNeck data center in Oregon. All the data is analyzed, and what I am now going to see are the results.
First stop was the Facility Command Center (FCC). Here, everything that goes on inside the data center is registered and/or controlled by the staff. The room is manned 24 hours a day, every day.
The left screen in the FCC (image to the right), which is linked to the building's DCIM, is full of interesting information, especially the real-time power usage effectiveness (PUE) display. As Koch explains what I'm looking at, the PUE reading hovers around 1.4 to 1.5.
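PUE is a simple ratio: total facility power divided by the power delivered to the IT equipment. Here is a minimal sketch in Python; the wattage figures are made-up numbers chosen only to land in the 1.4-to-1.5 range the display showed, not OneNeck's actual readings:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.
    A PUE of 1.0 would mean every watt goes to computing; real facilities
    add cooling, lighting, and distribution losses on top of the IT load."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Illustrative numbers only: 1,000 kW of IT load plus 450 kW of cooling
# and distribution overhead gives a PUE in the displayed range.
print(round(pue(1450.0, 1000.0), 2))  # 1.45
```

The closer the ratio gets to 1.0, the less power is being spent on anything other than computing, which is why a real-time display of it belongs on a command-center wall.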
Then we move to what Koch calls Carrier/Meet-Me-Rooms. Koch points out that OneNeck data centers are carrier-neutral facilities. Among the carriers with Points of Presence in the data center's carrier/meet-me-rooms are TDS Telecom, CenturyLink, TW-Telecom, Eventis, and Zayo. These rooms are where OneNeck clients' circuits are connected to the telco provider of their choice.
The raised-floor computing area
Next stop is the raised-floor area. We enter through a hallway that is, in essence, a people trap. If the entrance door is open, all interior doors remain locked to maintain integrity; once everyone is in the hallway and the entrance door shuts, the other doors unlock. Each door also requires multi-factor authentication to gain access: an RFID card, an iris scan (both eyes), and a PIN. Each door employs a tailgate sensor that works better than I expected -- I could walk alongside someone and the sensor would still sound an alarm.
My first impression of the raised-floor area: it's quiet. That's odd as the data center uses forced-air cooling. Here's what I found out.
Beneath the raised floor is considered the cold-air plenum. Perforated floor tiles are only located under the racks, directing the airflow to where it does the most good -- up through the rack-mounted equipment. The computing area has a dropped ceiling. The area above the ceiling acts as the hot-air return.
The hot air gushing out the top of the racks is contained on its way to the space above the dropped ceiling by either rack-manufacturer-supplied ducting or custom sheet-metal ducts built by OneNeck. The care OneNeck takes to create a closed-loop system is why the raised-floor area is quiet and, in turn, more energy-efficient.
Another touch: Koch mentions OneNeck employees are always checking customers' cooling hygiene: "Sometimes customers need reminding to fill all open rack slots, so no hot air escapes into the room environment." A simple thing, but little things add up.
Cooling in Minnesota?
Cooling the air is handled by N+1 30-ton Liebert Computer-Room Air-Conditioning (CRAC) units. The returning hot air is conditioned either by a water/glycol mixture pumped through the cooling-tower heat exchangers outside the building or, on an especially hot, humid summer day, by the mechanical air-conditioning in the CRAC units. Koch mentions the data center achieves free-cooling mode when the cooling towers absorb the entire heat load.
There is one concern with cooling towers in Minnesota: it is supposed to snow tomorrow (October 2nd). That means freezing temperatures, and bad news if the cooling towers were still spraying water. It is no longer a concern for the data center personnel; that responsibility belongs to the Honeywell Building Automation System (the facility's DCIM). If the temperature drops below freezing, the Honeywell system turns off the cooling towers' fresh-water pumps and drains the lines. Besides controlling the raised-floor temperature and humidity, the Honeywell system manages every aspect of the building environment, all power systems, physical security, and asset management.
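The freeze-protection behavior is easy to picture as control logic. The sketch below is only a toy version of the rule described above, not Honeywell's implementation; the trip point and the hysteresis band are my assumptions, added so a temperature hovering near freezing does not rapidly cycle the pumps:

```python
FREEZE_C = 0.0   # assumed trip point: stop pumps and drain at or below this
RESUME_C = 3.0   # assumed restart point, held above freezing to avoid cycling

def tower_pump_command(outdoor_temp_c: float, pumps_running: bool) -> dict:
    """Return pump and drain commands for the cooling-tower water loop,
    using hysteresis between FREEZE_C and RESUME_C."""
    if outdoor_temp_c <= FREEZE_C:
        return {"pumps_on": False, "drain_lines": True}
    if outdoor_temp_c >= RESUME_C:
        return {"pumps_on": True, "drain_lines": False}
    # In the dead band, hold the current state.
    return {"pumps_on": pumps_running, "drain_lines": not pumps_running}

print(tower_pump_command(-2.0, pumps_running=True))
# {'pumps_on': False, 'drain_lines': True}
```

A real building automation system would layer alarms, sensor validation, and manual overrides on top of a rule like this, but the core decision is that simple.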
There is an advantage to long, cold Minnesota winters. Outside temperatures are low enough to cool the water/glycol mixture from the CRAC units as it flows through the heat-exchanger portion of the cooling towers.
Next stop on the tour: electrical rooms
As a Tier III data center, OneNeck abides by N+1 redundancy. Local grid power comes from three separate substations and mates with 2,500 kVA transformers out back. If local power drops offline, there are three (soon to be four) 3-megawatt Caterpillar V-16 diesel generators in an N+1 configuration, each with 3,000 gallons of fuel. Koch asserts that each generator costs over $1 million. Imagine having $4 million worth of generators, plus whatever 12,000 gallons of diesel fuel costs, sitting out back -- and hoping the equipment will never be needed.
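The article gives tank size (3,000 gallons per generator) but not burn rate. Assuming a full-load consumption of roughly 0.07 gallons per kWh -- a typical ballpark for large diesel generators, my assumption rather than a OneNeck or Caterpillar figure -- a back-of-the-envelope runtime looks like this:

```python
def runtime_hours(tank_gallons: float, load_kw: float,
                  gal_per_kwh: float = 0.07) -> float:
    """Rough on-site fuel runtime for a diesel generator.
    gal_per_kwh is an assumed full-load burn rate, not a vendor spec."""
    return tank_gallons / (load_kw * gal_per_kwh)

# A 3 MW generator at full load drawing from its 3,000-gallon tank:
print(round(runtime_hours(3000, 3000), 1))  # about 14.3 hours per tank
```

Lighter loads stretch that number considerably, and fuel-delivery contracts (discussed later in the article) cover anything longer.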
Environmentally-friendly UPS systems
Power from the grid or the back-up generators is fed to N+1 900 kVA UPS systems. OneNeck's UPS systems have two tasks:
- Condition all power whether from the local grid or back-up generators
- Power the building long enough for the back-up generators to take over
The UPS systems at this data center are designed to carry the load for 60 seconds -- plenty of time. Within nine seconds, the back-up generators are started, up to speed, and supplying power.
I notice something and ask Koch, "Where are the batteries?" He replies, "There are no batteries. OneNeck uses flywheel UPS technology instead of batteries, which saves us money, and not having to deal with batteries helps the environment." How the flywheel works, in simple terms: when the UPS senses a loss of grid power, the flywheel/generator, still spinning at high RPM, converts its rotational momentum into electricity, powering the complex until the generators take over.
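The "rotational momentum" Koch describes is ordinary kinetic energy, E = 1/2 * I * omega^2. A sketch with hypothetical inertia and RPM values (chosen by me to roughly match the 60-second design figure above; flywheel vendors' real numbers vary) shows why a spinning rotor comfortably covers the nine-second generator start:

```python
import math

def flywheel_energy_j(inertia_kg_m2: float, rpm: float) -> float:
    """Kinetic energy stored in a spinning flywheel: E = 0.5 * I * omega^2."""
    omega = rpm * 2.0 * math.pi / 60.0  # convert RPM to rad/s
    return 0.5 * inertia_kg_m2 * omega ** 2

def ride_through_s(energy_j: float, load_kw: float) -> float:
    """Seconds the stored energy can carry a load, ignoring conversion losses."""
    return energy_j / (load_kw * 1000.0)

# Hypothetical unit: 150 kg*m^2 of inertia at 7,700 RPM stores roughly 49 MJ.
# At an 810 kW load (a 900 kVA UPS at an assumed 0.9 power factor), that is
# about a minute of ride-through -- far more than the 9 s the generators need.
e = flywheel_energy_j(150.0, 7700.0)
print(round(ride_through_s(e, 810.0)))  # 60
```

Because stored energy scales with the square of rotational speed, the flywheel gives up most of its usable energy while losing only a fraction of its RPM, which is what makes the short-but-deep discharge profile a good match for generator start-up gaps.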
Attention to details
While driving home, a thought crossed my mind. Of all the data centers I have visited, this one seemed different. Everything was in the right place -- there was nothing haphazard, and every detail was considered. For example, while looking at the back-up generators, Koch told me something that had not occurred to me. OneNeck mandates that diesel-fuel suppliers must incorporate alternative methods (gravity feed, for example) of filling fuel-delivery trucks before OneNeck will sign a contract. The logic is easy to understand: if the generators are running, there's a good chance the diesel-fuel supplier's power is out as well.
It is the little things that add up.