Go With the Air Flow: Remember the Basics to Keep Servers Running Cool
You’ve issued RFQs, compared RFP responses, met with vendors and hammered out a deal. You’re finally ready to move out of your old, 2+ PUE enterprise data center into a newer, more efficient colocation facility. All you need to do is rack and stack the servers, patch them in and let the efficiency magically happen, right? Not exactly.
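For context, PUE (Power Usage Effectiveness) is simply total facility power divided by the power that actually reaches the IT equipment, so the stakes of that move are easy to put in numbers. Here’s a back-of-the-envelope sketch; the 100 kW IT load and the 1.4 PUE assumed for the new facility are hypothetical, purely for illustration:

```python
# Back-of-the-envelope PUE comparison. The 100 kW IT load and the
# 1.4 PUE for the new facility are hypothetical; the 2.0 mirrors the
# "2+ PUE" legacy data center described above.

HOURS_PER_YEAR = 8760

def annual_facility_kwh(it_load_kw: float, pue: float) -> float:
    """Total facility energy, since PUE = total facility power / IT power."""
    return it_load_kw * pue * HOURS_PER_YEAR

it_load_kw = 100.0  # hypothetical IT load

old_kwh = annual_facility_kwh(it_load_kw, pue=2.0)
new_kwh = annual_facility_kwh(it_load_kw, pue=1.4)

print(f"Old facility: {old_kwh:,.0f} kWh/yr")
print(f"New facility: {new_kwh:,.0f} kWh/yr")
print(f"Difference:   {old_kwh - new_kwh:,.0f} kWh/yr")
```

On those assumptions, the same IT load burns over half a million fewer kilowatt-hours a year in the new facility – but only if the cold air actually gets where it needs to go.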
Before you power up the hardware, you need to make sure that the cool air can flow as it should, which will maximize server uptime, extend asset life and meet the conditions spelled out in the SLA. That means paying attention to some basic details, starting at the bottom of the rack, while keeping in mind that almost all IT equipment pulls in cool supply air from the front.
- Determine if you need floor tile grommets – Many colocation facilities have raised floors that serve as the supply air plenum. If power or communications lines run under the floor as well, you need floor tile grommets. Grommets protect cabling from the sharp edges of the floor tiles and – this is the important part for cooling – seal the cutouts so cold air can’t escape where it does no good and can instead be routed where it’s effective.
- Properly mount the equipment – Yes, we said it. It seems obvious, but we’ve seen equipment installed backward because it made patching easier. Unfortunately, that orientation positions the equipment to suck in hot air from the hot aisle and blow its hot exhaust into the cold aisle – a recipe for failure.
- Don’t skip blanking panels – Gaps in the rack let hot exhaust air recirculate into the cold aisle. It’s tempting to leave the panels out in anticipation of servers you plan to add “someday soon,” or when other tasks seem more pressing, but they are an important piece of the puzzle – and they’re easy to remove later when you’re ready to add servers.
- Cable management matters – Organize point-to-point connections, for a couple of reasons. First, airflow: cables bundled with Velcro® ties and swept to the side don’t impede circulation, while an unmanaged tangle forms a curtain that restricts exhaust in the hot aisle, forcing server fans to run harder and making it easier for hot air to recirculate into the cold aisle – especially where blanking panels are missing (the sketch after this list puts rough numbers on how much air a rack needs). Second, tidy cabling makes moves, adds and changes faster.
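Why does restricted exhaust make fans work harder? For a fixed heat load, airflow and the front-to-back temperature rise trade off directly. Here’s a rough sketch using the common sensible-heat rule of thumb for air at standard density (CFM ≈ 3.16 × watts ÷ ΔT°F); the 5 kW rack load is hypothetical:

```python
# Rough airflow math for a rack. Rule of thumb at standard air density:
# required CFM ≈ 3.16 * heat_load_watts / delta_t_f.
# The 5 kW rack load and the temperature rises below are hypothetical.

def required_cfm(heat_watts: float, delta_t_f: float) -> float:
    """Airflow needed to carry heat_watts at a given front-to-back rise."""
    return 3.16 * heat_watts / delta_t_f

rack_watts = 5000.0  # hypothetical 5 kW rack

for delta_t_f in (20, 25, 35):
    print(f"ΔT {delta_t_f}°F -> {required_cfm(rack_watts, delta_t_f):,.0f} CFM")
```

If a cable curtain or a missing blanking panel keeps the servers from moving that much air, ΔT climbs instead, and the fans spin up trying to close the gap – burning power and pushing hot air toward the cold aisle.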
Good SLAs Make Great Neighbors
Your SLA likely says that the colocation facility will supply air to your IT equipment within the ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) recommended range of 64.4°F to 80.6°F (18°C to 27°C). SLAs are contracts, with both parties accepting some responsibility. The colocation provider must deliver an ideal environment for your IT equipment, but ultimately you need to install that equipment properly to help achieve those conditions – both for your servers’ performance and to be a “good neighbor.” After all, if the shoe were on the other foot, with a neighboring cage forcing temperatures above the SLA standard, you wouldn’t want to pay the price for their poor air flow management.
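If you want to verify that you’re holding up your end, it’s easy to check rack inlet readings against that recommended range. A minimal sketch; the sensor names and readings below are made up, and a real deployment would pull them from your monitoring or DCIM system:

```python
# Check rack inlet temperatures against the ASHRAE recommended range
# (18-27°C / 64.4-80.6°F). Sensor names and readings are hypothetical.

ASHRAE_LOW_C, ASHRAE_HIGH_C = 18.0, 27.0

def f_to_c(temp_f: float) -> float:
    return (temp_f - 32.0) * 5.0 / 9.0

# Hypothetical inlet readings; some probes report °F, some °C.
inlet_readings_c = {
    "rack-01-top": f_to_c(78.0),
    "rack-01-mid": 22.5,
    "rack-01-bot": f_to_c(81.5),  # over range: hot air recirculating?
}

for sensor, temp_c in inlet_readings_c.items():
    status = "OK" if ASHRAE_LOW_C <= temp_c <= ASHRAE_HIGH_C else "OUT OF RANGE"
    print(f"{sensor}: {temp_c:.1f}°C  {status}")
```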
You’ve put a lot of time and thought into your choice of colocation provider. Now that you’ve made that decision, don’t forget the basics when move-in day comes. You, your provider and your neighbors will all reap the benefits – maximum uptime, easier scale-up and lower costs. These basic steps also cut energy usage, making your operation that much greener.