Data centers in the 21st Century
Healthcare data center
Loccioni Humancare is an Italian company that provides technological solutions for the healthcare industry. This data center sits inside a hospital, where it serves as a repository for patient records, medical images and clinical data. Its design is optimized for energy efficiency and ease of maintenance.

This Wall Street Journal video features Loccioni’s “green” philosophy for sustainable living and design.

Photo Credit: Thomas Farina (See more photos from this data center on Farina’s Flickr page.)
IBM's research labs - Cooling techniques
Cabinets are fitted with rear-door heat exchangers, which draw in the servers’ hot exhaust air and expel it as cold air. The technology, whose IP is owned by IBM, can remove up to 60 percent of the heat in a full rack, the company claims.

The result of these changes is a datacentre PUE (power usage effectiveness) of 1.27, which generated a return on IBM’s investment within 10 months.
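PUE is simply the ratio of total facility power to the power delivered to the IT equipment, so a PUE of 1.27 means roughly 27 percent of overhead on top of the IT load. The short Python sketch below illustrates the calculation; the 1,000 kW IT load is a hypothetical round number chosen for illustration, not an IBM figure.

```python
# Illustrative PUE calculation, not IBM tooling.
# PUE = total facility power / IT equipment power; 1.0 would mean zero overhead.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

it_load_kw = 1000.0               # hypothetical IT load
total_kw = 1270.0                 # facility draw implied by a 1.27 PUE
print(pue(total_kw, it_load_kw))  # -> 1.27
print(total_kw - it_load_kw)      # -> 270.0 kW spent on cooling, power
                                  #    distribution and other overhead
```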
See Cool runnings: IBM’s recipe for a happy datacentre, in pictures by Jon Yeomans
IBM's research labs in Poughkeepsie, New York
A view of the racks and cabling.

See Cool runnings: IBM’s recipe for a happy datacentre, in pictures by Jon Yeomans
Equinix 1
Equinix is a provider of global data center solutions.

View of overhead cabling infrastructure inside the Equinix SE3 data center facility in Seattle, WA.

Photo Credit: Equinix
Equinix 2
View of customer cages inside the Equinix SE3 data center facility in Seattle, WA.

Photo Credit: Equinix
Equinix 3
Exterior view of the Equinix SE3 data center facility in Seattle, WA.

Photo Credit: Equinix
Easy Street data center
EasyStreet, an IaaS provider, offsets all the resources it consumes, resulting in zero-carbon-footprint operations at its Beaverton, OR data centers and offices. EasyStreet’s newest data center represents nearly four years of research and design work aimed at building the most energy-efficient colocation facility in the region. Energy-saving technologies such as Indirect Evaporative Cooling and chimney cabinets reduce costs as well as carbon emissions while directing more power to customer cabinets.

Photo Credit: EasyStreet
Easy Street machine room
An innovative Indirect Evaporative Cooling (IEC) system from AMAX® is key to efficiency. Direct Expansion (DX) packages provide supplemental cooling when required. (Estimated need is only 180 hours per year.)

Photo Credit: EasyStreet
Easy Street rooftop IEC
- Rooftop IEC units use an N+1 configuration.
- Each IEC unit is connected to a shared supply and return trunk, allowing any spare unit to replace any failed unit.
- Humidity is controlled with ultrasonic humidifiers operated by the master control system.
- Overhead duct work distributes cooled air throughout the data center.
- The equipment cabinets are part of the cooling system. Chatsworth Products, Inc. (CPI) passive chimney cabinets are connected to the overhead return air duct work. They gather hot exhaust air and route it to the roof for processing.
- A 25,000-gallon underground tank collects rainwater from the roof and supplies an estimated seven months of annual operation. The water is filtered and used by the IEC units to cool the hot air exhausted through the return duct. Reclaimed water is supplemented by city water when necessary.
- Efficient VYCON® uninterruptible power supplies (UPSs) use flywheels for energy storage and require no acid-containing batteries.

Photo Credit: EasyStreet
Submerged data center
Submerging a data center in fluid is definitely a next-generation method of temperature control, and as ZDNet’s David Chernicoff reported in “Submerge your data center,” it is an intriguing option. Basically, specially designed server racks are submerged in a dielectric fluid very similar to mineral oil. System fans aren’t needed, hard drives are encapsulated and airtight, and the thermal grease between the CPU/GPU and heat sink is replaced by indium foil.
Cold War bunker turned cloud data center
Another innovation in data center design puts old infrastructure to new use. In this case, a Cold War bunker in Switzerland was refitted to hold Radix Technologies’ cloud data center. Dug into Alpine rock, this installation was built to withstand nuclear blasts and features steel doors and reinforced concrete walls. Pictured here is an access tunnel between technology corridors. This is part of Toby Wolpe’s gallery, “Inside the Cold War bunker that’s now a cloud datacentre,” where you can view more photos of this former military site.
NextFort Data Center (Chandler, Arizona) - Server corridor
The NextFort High-density Computing Suite (NextFort HCS™) is a completely self-contained, all-concrete-and-steel room designed to house up to 20 high-density IT racks for a total IT load of up to 225 kW.

Physical Specifications

- All concrete and steel construction
- Secondary concrete perimeter wall
- Single row of 20 45U high-density 19″ IT racks
- Isolated hot and cold aisles

Photo Credit: NextFort
NextFort - Cold aisle
Photo Credit: NextFort
NextFort - Hot aisle
Photo Credit: NextFort
NextFort suite diagram
Electrical & Cooling

- Up to 225 kW IT load per suite, or up to 600 W per square foot (see the back-of-the-envelope check after this list)
- 2N redundant electrical feeds with separate, individually metered panels
- Up to 60 tons of dedicated, redundant, ultra-efficient DX cooling
- 100% free-air cooling capability, depending on outside conditions
- PUE better than 1.25
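As a rough sanity check, here is what those headline figures imply per rack and per square foot. This is back-of-the-envelope arithmetic on the numbers quoted above, not a NextFort specification.

```python
# Back-of-the-envelope check of the NextFort suite figures quoted above;
# the derived per-rack and floor-area numbers are not vendor specifications.

suite_it_load_kw = 225.0    # "up to 225 kW IT load per suite"
racks_per_suite = 20        # "single row of 20 ... IT racks"
density_w_per_sqft = 600.0  # "up to 600 W per square foot"

per_rack_kw = suite_it_load_kw / racks_per_suite
implied_area_sqft = suite_it_load_kw * 1000 / density_w_per_sqft

print(f"{per_rack_kw:.2f} kW per rack")                  # 11.25 kW per rack
print(f"{implied_area_sqft:.0f} sq ft at full density")  # 375 sq ft
```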
Fire Suppression

- Independent clean agent gas-based suppression system in each suite
- No water-based fire sprinklers

Management & Control

- Secure web-based customer portal to manage environmental settings (temperature, humidity, outside air control), physical access and security cameras, and to monitor power usage in the suite

Photo Credit: NextFort
Verne Global data center complex in Iceland
The 45-acre Verne Global data centre complex is located near Reykjavik, Iceland, minutes from Keflavík International Airport. The facility sits on the site of the former Naval Air Station Keflavik, a key strategic NATO base for over 50 years, and was chosen for its extremely low risk of natural disaster. The site lies well to the west of all of Iceland’s volcanic activity, and arctic breezes and the Gulf Stream push volcanic effects away from Verne Global and toward Western Europe.

Photo Credit: Verne Global
Geothermal valve - Verne Global
Verne Global’s facility has been designed to fully utilise Iceland’s unique environmental power advantages: it is 100% powered by renewable energy resources and 100% cooled by Iceland’s natural environment, without the use of chillers or compressors.

All electricity comes from 100% renewable geothermal and hydroelectric energy. Iceland is the only country in Western Europe that still has large indigenous amounts of competitively priced hydroelectric and geothermal energy remaining to be harnessed.

Photo Credit: Verne Global
Verne Global data center entrance
High-security features include:

- 24×7 manned security force with roving patrols
- Single point of entry and a lobby with ballistic-resistant glass
- Nine challenge points, including multiple mantraps, to reach the data center

Photo Credit: Verne Global
Verne Global secure cabinets
Photo Credit: Verne Global
Digital Realty in Western Sydney - Modular datacentre approach
The 8,020-square-metre facility contains four 1,440 kW Digital Realty Turn-Key Flex pods, a modular product that gives customers the flexibility to scale their capacity.

Photo Credit: Digital Realty

See Digital Realty officially launches Western Sydney datacentre by Spandas Lui
Digital Realty in London - Five halls
Each data hall has staging workspace for technical operations, with lift access, power and cooling facilities throughout. Each of the 1,000-square-foot halls is kept cool by 14 CRAC (computer room air conditioning) units operating in an N+2 configuration, which provides backup capacity.

The datacentre halls are raised 4m above ground level and have an 800mm raised floor. The floor can take loads of up to 12.5 kN/m².
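To put the 12.5 kN/m² floor rating in context, the sketch below converts it to an approximate mass limit and to a per-rack figure. The 0.6 m × 1.2 m rack footprint is an assumed typical value, not a Digital Realty specification.

```python
# Rough conversion of the quoted 12.5 kN/m² raised-floor rating; the rack
# footprint used here is an assumption for illustration, not a site spec.

G = 9.81                   # standard gravity, m/s²
floor_rating_kn_m2 = 12.5  # quoted raised-floor load rating

kg_per_m2 = floor_rating_kn_m2 * 1000 / G               # ~1,274 kg per m²
rack_footprint_m2 = 0.6 * 1.2                           # assumed 600 mm x 1,200 mm rack
kg_per_rack_footprint = kg_per_m2 * rack_footprint_m2   # ~917 kg over one footprint

print(f"{kg_per_m2:.0f} kg/m²")
print(f"{kg_per_rack_footprint:.0f} kg over a single rack footprint")
```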
See A photo tour of Digital Realty’s newest London datacentre by Sam Shead (ZDNet)
Rackspace in the UK
Rackspace’s Slough-based datacentre houses the hardware for its UK cloud, along with the other servers rented by its customers. It has 1,600 racks in place, of which 120 support its cloud.

Rackspace’s server hardware is predominantly supplied by Dell. It operates a multi-vendor networking approach: Cisco is the main provider of switching technology, while Juniper Networks supplies backbone services and Brocade provides equipment for load balancing.

Photo credit: Jack Clark

See Inside Rackspace’s UK cloud datacentre for more photos.
Rackspace cooling system
The datacentre has a power usage effectiveness (PUE) rating of 1.7, though Rackspace hopes to lower this to an average of 1.59 once the third data hall is built.

Cooling is achieved via computer room air-conditioning (CRAC) units that push cold air through the underfloor plenum. The cold air then rises into the servers and is expelled as hot air at the back. Unlike many other datacentres, the racks are not enclosed.

Photo credit: Jack Clark

See Inside Rackspace’s UK cloud datacentre for more photos.
Virtus security controls
Datacentre security is vital: biometric readers and secure mantraps on both levels of the datacentre control access to the data halls and are programmed so that the first set of doors must close before the second set opens. Each door has a proximity card reader pre-programmed with a client’s access information.
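A minimal Python sketch of that interlock rule, purely to illustrate the door logic described above; it is not Virtus’s actual access-control system.

```python
# Illustrative mantrap interlock, not Virtus's control system: the inner
# door opens only for an authorised badge and only once the outer door
# has closed again, and vice versa.

class Mantrap:
    def __init__(self) -> None:
        self.outer_open = False
        self.inner_open = False

    def open_outer(self) -> bool:
        if self.inner_open:          # interlock: never both doors at once
            return False
        self.outer_open = True
        return True

    def close_outer(self) -> None:
        self.outer_open = False

    def open_inner(self, badge_authorised: bool) -> bool:
        if self.outer_open or not badge_authorised:
            return False
        self.inner_open = True
        return True

    def close_inner(self) -> None:
        self.inner_open = False
```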
See From warehouse to (data)warehouse: Virtus keeps it cool in North London by Sam Shead
Las Vegas Sands
The Las Vegas Sands Corporation runs two casino resorts in the gambling mecca, the Venetian and the Palazzo.

The two establishments run off the same infrastructure, which uses 300 servers to support more than 11,000 suites and rooms, 3,000 slot machines and 200 gambling tables, and to run nine websites. The bulk of the Venetian and Palazzo’s core systems run on six IBM iSeries servers (pictured), formerly known as AS/400s.

Photo Credit: David Meyer

See Datacentres of the world: A photo tour by ZDNet Staff