Emerging Tech

Server room makeover: Minor improvements can go a long way

Serving a dynamic campus community out of a repurposed classroom in a decades-old building has its challenges. Scott Lowe describes his team's efforts to revitalize a college's neglected data center.

Westminster College's IT infrastructure has grown throughout the years in a very organic way. In the beginning, it was only a single VAX terminal connected to the University of Missouri; now it's a fiber-connected network with thousands of nodes, dozens of servers, and terabytes of storage.

I'm very happy with where Westminster College is headed in terms of infrastructure and architecture. However, one area that has not been up to par is the physical location where we house our infrastructure. We've spent time fixing up what used to be a basement classroom in the campus administration building. I wish I'd had the forethought to take "before" pictures of the server room, but I didn't; I did take a few "after" photos. In this post, I describe what improvements our IT team made and explain why we made those changes.

Details about the "before" server room

Our server room is a repurposed basement classroom that was granted to IT well before my arrival at the college. The room measures about 15 x 15 feet, has a concrete floor, and has a drop ceiling that sits a few inches below the structure supporting the floor above.

When I started at Westminster, the room looked like the diagram in Figure A. This isn't a scale drawing, but you can see where the various components were placed.

Figure A

The old server room layout

There were three server racks (none of which were full) that housed the college's couple dozen servers. The campus fiber optic network at the time was all multimode and terminated in the network core you see on the diagram. The blue box represents the cabling for the building housing the server room. All of that cabling terminates in a wall-mount rack and feeds the systems and users that work in Westminster Hall, the building in which the data center resides.

As you can see, there were three lights in the room. The lights provided enough light to work by but left some darkish spots.

The two green boxes on opposite walls of the room were patch panels that were connected to one another across the room with the cabling running above the drop ceiling; this was the cabling used to connect the servers to the core switch. Without a raised floor, this was a decent alternative, but it didn't provide for much growth and working on it was a pain.

The A/C unit in the back of the room is a standalone unit with four air vents blowing out the top of the unit and pointing in various directions to cover as much of the room as possible.

The room has no raised floor, and it never will -- it's a basement with a concrete floor. Although we could jackhammer out enough material to create a raised floor, it's not really necessary anymore.

The room had a lot of problems, including:

  • Inadequate air flow. The Dell servers we use blow air from front to back; there was about a foot of space between the rear of the racks and the wall, and it was hot back there! Plus, cold air wasn't being delivered where it was needed most.
  • Inadequate lighting. As I mentioned, some areas of the room simply were not lit well enough.
  • Really difficult to work on systems. There was very little space between the rear of the racks and the walls, making it very difficult to work behind the systems.
  • No flexibility to expand network core. Without additional racks, we didn't have enough space to extend our fiber optic network. As a part of a new building project, we ran 288 strands of single mode fiber cable across campus and needed a place to terminate that new cabling.
  • The "blue box" that was the building's internal cabling was a mess. The word spaghetti doesn't come close to describing the morass of 150 patch cables run from that blue box to switches in the "network core" area.
  • UPSs that were almost at capacity.
  • No ability to easily disconnect and move the UPSs.

Improvements to create the "after" server room

Over the past few months, as time has permitted, my data center guy and I made a bunch of changes designed to correct some of these problems. Figure B shows the new layout.

Figure B

A newly fixed up server room
Moved server racks after eliminating one

First, we eliminated one server rack and turned the remaining two server racks (Figure C and Figure D) so the rear air flow was not impeded by a wall. This gives us ample room to work behind the servers without worrying about claustrophobia setting in.

We were able to eliminate a full rack by following these steps:

  • Virtualizing older, larger, more power-hungry systems onto fewer, smaller blade-based systems with a SAN backing the entire thing. This reduced the number of physical servers and cut power consumption to a point where our UPSs are happy again (a rough sketch of that kind of load math follows this list).
  • Not leaving 1, 2, or even 3U of space between our systems anymore. There's no point in that for servers with a front to back air flow, so why waste the rack space?
  • Moving our backup server and tape library to another secure campus location where it has a direct fiber connection back to the core.
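
For a sense of the headroom we regained, here's the kind of back-of-the-envelope load arithmetic involved. The wattage figures and UPS rating below are illustrative assumptions, not our actual inventory; a quick script like this (or a spreadsheet) is enough to sanity-check whether consolidation gets a UPS out of the danger zone.

    # Rough UPS headroom estimate -- all figures are assumed examples, not real inventory
    UPS_CAPACITY_W = 8000  # assumed usable output of the UPS pair, in watts

    # Before: a larger number of standalone rack servers (assumed ~400 W each under load)
    before_load_w = 16 * 400

    # After: a blade chassis plus SAN replacing most of them, with a few hosts remaining
    after_load_w = 2400 + 900 + 3 * 400  # assumed chassis + SAN + remaining servers

    for label, load_w in (("before", before_load_w), ("after", after_load_w)):
        print(f"{label}: {load_w} W -> {load_w / UPS_CAPACITY_W:.0%} of UPS capacity")

With these made-up numbers, the load drops from 80% of capacity to roughly 56% -- the sort of margin that lets a UPS (and the person responsible for it) relax.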

As we walk by the sliding glass door (yeah, a sliding glass door -- it's not my favorite entryway, and it's something we'll change in the future), we can now glance in and spot any warning lights on the equipment.

Figure C

Figure D

Improved cold air distribution

In Figure B, you'll see green lines and boxes connected to the A/C unit (you can see the unit in Figure E); these are duct extensions that we installed to drop cold air where it's needed rather than letting it blow randomly around the room. This move has already made an impact. Our Dell M1000e blade chassis fans used to run at a very high rate to keep the chassis cool; with the ducts in place, the fans run noticeably slower, meaning the chassis is having an easier time staying cool.

Figure E
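
A related rule of thumb, for anyone sizing a room like this: essentially every watt the equipment draws becomes heat the A/C has to remove, at roughly 3.41 BTU/hr per watt. The load and unit rating below are assumptions for illustration, not our actual numbers, but the conversion itself is standard.

    # Cooling sanity check -- load and A/C rating are assumed examples
    WATTS_TO_BTU_PER_HR = 3.412      # 1 W of electrical load ~= 3.412 BTU/hr of heat

    it_load_w = 4500                 # assumed total draw of servers and network gear
    heat_btu_per_hr = it_load_w * WATTS_TO_BTU_PER_HR

    ac_rating_btu_per_hr = 24000     # assumed rating of a standalone room unit (a "2-ton" A/C)

    print(f"heat load:    {heat_btu_per_hr:,.0f} BTU/hr")
    print(f"A/C headroom: {ac_rating_btu_per_hr - heat_btu_per_hr:,.0f} BTU/hr")

Ducting the cold air to the intakes doesn't change this arithmetic, but it does mean far more of the rated capacity actually reaches the servers instead of mixing with the room's warm air.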

Added two new network/cabling racks

To accommodate a large new single mode fiber installation, we added a new rack (Figure F) to the left of the existing network rack, moved the core switch and all of the network electronics to the new rack, and left the older rack in place to house fiber optic terminations for the campus network. We've added quite a bit of network gear, and we no longer need to worry about having enough space to house it.

Behind the existing network rack, we've added a new rack (Figure G) intended to house all of the cabling for the building itself and get it away from the core network switch. The 150 patch cables that run into the core network rack seriously get in the way and are a pain in the neck to manage. We added the new rack when we renovated the IT office space across the hall and needed to re-run all of the network cabling into that space. This summer, we'll extend the existing patch panels to new patch panels in this common building rack and move the switches that serve the building to this rack so we can work on the core switch more easily. Although we don't have to do a lot with the core network rack, we want to make it easy to work on when the time comes.

Figure G

Added lighting

We've added a couple of lights to darkened areas in order to improve our ability to work. It's amazing what the simple addition of a couple of lights can do.

Added a cable tray

If you look carefully at Figure B, you'll see a sort of hatched area on the diagram; this denotes a new suspended cable tray (Figure H) that we added at about 7' high in the room, or about 6" below the ceiling. We've also removed the patch panels that previously served the servers and instead laid the server patch cables into this easy-to-reach cable tray. This step has made it much easier to make changes and add new cables as necessary; it has also made tracing cables a lot easier.

Figure H

Cut the cord on the UPSs

To make it easier for us to do routine maintenance on the UPSs, and to move them if it becomes necessary, we had an electrician install massive power plugs on the ends of the UPS cords. Prior to that, the UPSs were hardwired directly into the electrical panel, which made maintenance more difficult.

Summary

We didn't make radical changes to the server room, but by correcting what were some relatively serious problems (particularly with air flow), we did create a very functional space. Westminster College is looking at the possibility of a new academic building, which may afford us the opportunity to build a true data center down the line.


9 comments
WillGarcia

The cable tray surely made the cables and wiring look better. It also makes new installations easier.

mike.panagos

I bet you look forward to getting those bundles out of the way of your core switch. It reminds me of a rack we had that was such a huge mess that I cringed every time I went into the closet [pics: http://www.grimadmin.com/article.php/idf-wiring-rack-cleanup ]. After a proper cleanup the difference was profound. Currently, we're having some cooling issues with a large UPS unit, so that's my next project (of many)!

milpo2717

Got a little cabling nightmare there... With a rainbow of patch cabling... Highly suggest you look at www.neatpatch.com to fix that. We use that exclusively in our hospital network racks. Big difference, sir. JMM

darrell.hixon

What happens when your single AC unit fails - your server room is going to cook! From a security aspect, your cabinets don't have any side panels or lockable doors - bit of a security risk! Mind you, if you did have doors on your cabinets, your comms room servers would probably overheat, as there would then be no air flow through the cabinets. Fire suppression - rather than sprinklers, what you should have is an FM-200 style fire suppressant system - you don't want your server room getting wet!

JCitizen

One of my favorite things is getting to re-engineer all the MDFs and IDFs at an organization. I used to be an industrial robot repairman, so nuts and bolts are fun for me. Having some carpentry skills goes a long way too! Funny thing - after designing the equipment layout for ease of access, it never fails: you have fewer failures! You get to spit in Murphy's face every time!

edwardwstanley

Just a tip, the duct should be wrapped in some insulating material to avoid condensation.

tsnow

Hi Scott, I don't know if the rack panels were left off for the photos but you will likely find that putting the side and front panels back on your racks will also allow your systems to run cooler. The cool air and warm air will not mix as much and that should allow more of the cold, A/C air to move through the servers from front to back. If you can get exhaust vents to pull hot air away from the back of the rack, this would also improve A/C performance.

Pazman

It's hard to tell for sure from the picture, but it appears that the HVAC intake is located near the floor. To improve HVAC efficiency you want the hottest air possible coming into the system. This could be accomplished with a little duct work to extend the intake up to the ceiling area where hot air naturally rises. This would help to reduce energy costs and improve the overall cooling of the room. Taking that a step further would be some sort of hot-aisle containment on the backs of the racks, but that could get expensive. A simple extension of the intake to the ceiling area rather than the floor wouldn't cost much. Additionally, you could extend the intake to the plenum space above the drop ceiling, and then replace the ceiling tiles directly behind the racks with plastic grids. This would allow the hot exhaust from the racks to naturally rise into the plenum space.

dokai

Great article, Scott! I'm sure more than one of us will be inspired to use our free time (HA!) to re-engineer some problems that have been needing attention for a while. I've never once forced myself into taking on a similar project without asking myself afterward why I didn't do it sooner. Also, "organic" growth is the bane of IT - without a plan, you're doomed.
