Your server room may be perfect; every Ethernet cable in place, all KVM and power cables neatly tucked into the sides of the racks. Your comm rack may be a marvel of modern cable management. Unfortunately, not all server rooms are so lucky.
Some server rooms are nothing but living examples of bad planning and worse execution. Dare I say, some are even glimpses of the legendary 11th level of Hades itself (it's where the Apple Newton and Microsoft BOB live). We now embark on an adventure to one such server room. If you dare...
Submitted by Joshua Hoskins
Server rails are a great thing. They allow you easy access to your server for any needed hardware maintenance. What could be better than a nice rail kit, you ask? Why, stacking your servers directly on top of each other!
Imagine the surprise and fun you'll have when you slide out a server to add a stick of RAM, only to have two other servers topple out on top of you! Boy, will there be egg on your face when a simple RAM upgrade turns into a two-server replacement.
Bill Detwiler has nothing to disclose. He doesn't hold investments in the technology companies he covers.
Bill Detwiler is Managing Editor of TechRepublic and Tech Pro Research and the host of Cracking Open, CNET and TechRepublic's popular online show. Prior to joining TechRepublic in 2000, Bill was an IT manager, database administrator, and desktop support specialist in the social research and energy industries. He has bachelor's and master's degrees from the University of Louisville, where he has also lectured on computer crime and crime prevention.
I am surprised, Bill, that on Slide 5 (the one with the CRT on the floor behind the rack) you didn't mention the 15Amp power cable across the access-way.
Of all the things in this computer room that require mention, there are two safety concerns and the rest are best-practice/longer-term operational maintenance concerns.
Safety 1: Power cables in walk-ways (slides 4 & 5)
Safety 2: Incomplete and non-terminated power cables in roof (slide 8)
This room lacks sufficient rack-mounting & cable-management hardware. (There can be legitimate reasons for lack of rack-rail kits - for instance second-hand/re-purposed hardware; custom/incorrectly configured racks; or space concerns)
A bigger concern to me is the evidence of shoddy operational processes - mostly demonstrated by the consistently poor cable management and the lack of cleaning - not all the dust is from the ceiling works.
Is this room really cooled by the building air? (the vent in slide 8) That has a greater long-term impact on server reliability than anything else seen here.
This is an example of a poorly maintained computer room, but far from the worst I have seen.
First picture reminds me of a 'pro' server install a few years ago that I was asked to help work on.
Guy got a deal on telco relay racks, but they weren't aligned right for front & back attachments to rails.
Didn't want to be bothered to adjust the racks and re-drill the anchor bolts.
And got the wrong rails (for square-hole mounts) for the screw-only relay racks.
Hundreds of C-Clamps. Put one horizontally at each corner of the rack where a server would sit.
Vise-Grip wrenched the clamps in place so they wouldn't move when the server was placed on them.
Which naturally left air gaps between each server.
Can you guess how many C-Clamps are needed to hold an 8U server?
or a 4U SCSI disk chassis?
My sanity was preserved as I ran for the hills.
I wish I still had a picture of this - the first job I had (1968), the computer room was on the ground floor of a 4-story hotel. There were beautiful plexiglass sheets hung horizontally above each of the cabinets to deflect the inevitable water that would drip in when a hotel guest got the shower curtain wrong or overfilled the bathtub.
The simplest way to maintain a proper datacenter is to not allow admins to rack equipment. You have dedicated datacenter people who take care of all the racking and cabling. This only works if you have a large enough datacenter that you can have dedicated headcount for it, but it does keep the datacenter neat. If you cannot do this, then keeping the datacenter neat is also a commitment from IT leadership and admins.

Where I currently work, the IT manager asked me to clean up the datacenter soon after I started. When I told him that this would require a budget, he was taken aback. He thought it would just involve more time from the admins. I took him into the datacenter and showed him. You want fiber run properly? Then you have to provide overhead fiber runs, not just runs under the floor. Want your cabling neat in the racks? Then use proper racks with cable management. Want your copper runs neater? Supply patch cables in 1-foot increments from 3 feet to 15 feet, and provide proper storage bins that can be labeled. Invest in rack filler panels. Want your power runs better? Don't use 10+ year old power strips that cannot even be attached properly to the racks. He was kind of stunned, but eventually we got the datacenter cleaned up over a couple of years.
I've been through many companies and IT for maybe 25 years. I've never seen a completely "proper" server room except in photos from large companies with lots of money, and I'm guessing those were somewhat cleaned up for the photos. On the first photo about stacking servers - if they are rack mounted, then you are supposed to stack them in adjacent positions (not just resting the weight of one on top of another, though) for cooling purposes. APC did a big study showing that airflow is better when servers are placed in adjacent "U" positions in the rack. Photo #2 with loops of fiber - well, fiber used to be expensive, and a proper patch cord is not easy to make, unlike a crimped CAT# Ethernet cable. I'm guessing it was what they had on hand, used to avoid purchasing new. My point being, most companies try to get by with what they've already purchased and just "deal" with it until an emergency arises. Other than that, the photos are funny and I love reading Bill's articles - thanks!
NFPA requires sprinkler piping in all rooms, regardless of what we may or may not want. Even if you have the most expensive, awesome FM-200 fire suppression system, you still have to have sprinkler pipe in the room. It's possible to use a dry-pipe pre-action style system in the room, but if you break a head, you're going to set off an alarm and ruin your day - just not your equipment. The only thing you can really do is recess the sprinkler heads, if allowed by code, or put cages around the heads to prevent accidental discharge.
We have a Halon system for the computer room. The wet heads are there, but they no longer connect to anything. I'm sure I can't be the only one with this configuration.
@CharlieSpencer I have built 5 computer rooms or small data centres in the last few years. In all cases I have been required (by West Australian building codes) to include sprinklers - even though 3 of the 5 rooms had gas suppression systems as well.
Using sprinklers in a computer room is not an automatic fail - although it can be if not planned well.
You do need to prepare:
Integration of phase-3 activation from your VESDA (you *DO* have a VESDA, don't you?)
Dry-Pipe isolation valve
EPO (Emergency Power Off) for UPS and power distribution
Fire doors and fire-rated walls
Integration for the FIP (Fire Indicator Panel) & labelling for Fire & Emergency services.
Any good civil & Fire engineer will have the relevant information for your jurisdiction.