Five keys for data center energy savings

Retrofitting a data center is about managing limitations and trade-offs. Decision-makers have to consider physical limits (such as the weight a floor will support and how much cooling equipment can fit into an existing space). Then there’s infrastructure to think about: It would be difficult to swap out an old uninterruptible power supply (UPS) for a brand-new one. Such restrictions have an impact on energy efficiency too: Existing UPSes generally operate at 85 percent efficiency, whereas the newest ones are in the range of 97.5 percent. To reach the highest efficiency numbers, you’d need to change your entire data center architecture, which is impractical for most companies.
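To make those UPS figures concrete, here is a rough back-of-envelope sketch; the 200 kW IT load and $0.10/kWh electricity rate are illustrative assumptions, not figures from the article.

```python
# Rough comparison of UPS losses at the 85% and 97.5% efficiencies cited above.
# The 200 kW IT load and $0.10/kWh rate are illustrative assumptions.

def annual_ups_loss_kwh(it_load_kw: float, efficiency: float) -> float:
    """Energy dissipated in the UPS per year for a given IT load."""
    input_kw = it_load_kw / efficiency   # power the UPS must draw to deliver the load
    loss_kw = input_kw - it_load_kw      # the difference is lost in the UPS
    return loss_kw * 24 * 365            # kWh per year

old = annual_ups_loss_kwh(200, 0.85)
new = annual_ups_loss_kwh(200, 0.975)
print(f"85% UPS loses   {old:,.0f} kWh/yr")
print(f"97.5% UPS loses {new:,.0f} kWh/yr")
print(f"Savings at $0.10/kWh: ${(old - new) * 0.10:,.0f}/yr")
```

Even without replacing the UPS, running the numbers this way shows where the losses sit before you weigh a retrofit against a rebuild.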

Retrofitting a data center to make it more energy efficient has its restrictions, but doing so can be less costly than rebuilding an entire facility. To weigh the variables and achieve energy cost savings, you need to know what’s broken. Here are five tips for determining the efficiency of your data center and making it as green as can be.

1. Get to know your data center.

An energy efficiency assessment from someone who specializes in data centers should be a priority, says Neil Rasmussen, CTO of American Power Conversion (APC), a provider of data center power and cooling equipment. IBM, EYP Mission Critical, Syska Hennessy, APC and Hewlett-Packard offer such services.

HP recently added Thermal Zone Mapping to its assessment offering. This service uses heat sensors and mapping analysis software to pinpoint problem areas in the data center and helps you adjust things as needed, says Brian Brouillette, vice president of HP Mission Critical Network and Education Services. For example, the analysis looks at the organization of equipment racks, how densely the equipment is populated, and the flow of hot and cold air through different areas of the space. It’s important to place air-conditioning vents properly so that cool airflow keeps equipment running properly, without wasting energy, says Brouillette.

2. Manage the AC: Not too cold, not too hot, but just right.

Energy efficiency often starts with the cooling system. “Air conditioners are the most power-hungry things in the data center, apart from the IT equipment itself,” says Rasmussen. If your data center is running at 30 percent efficiency, that means for every watt going into the servers, two are being wasted on the power and cooling systems, he says. To reduce wasted energy, one of the simplest and most important things you can do is turn on the AC economizers, which act as temperature sensors in the data center. According to Rasmussen, 80 percent of economizers are not used, just as IT administrators often turn off the power management features in PCs. It’s also important to monitor the effects of multiple air-conditioning systems attached to a data center; sometimes, Rasmussen says, two AC systems can be “out of calibration,” one sensing that humidity is too high and the other sensing it’s too low. Their competition, like a game of cooling tennis, can waste energy.
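Rasmussen’s one-watt-in, two-watts-wasted arithmetic can be sketched as follows; the 30 percent figure is the only input, and the function name is my own.

```python
# Sketch of the efficiency arithmetic above: at 30 percent data center
# efficiency, only 0.3 of each facility watt reaches the IT equipment.

def overhead_watts_per_it_watt(dc_efficiency: float) -> float:
    """Watts spent on power and cooling for each watt delivered to IT."""
    total_per_it_watt = 1 / dc_efficiency   # facility watts drawn per IT watt
    return total_per_it_watt - 1            # overhead: everything that isn't IT

# ~2.3 facility watts of overhead per IT watt -- roughly the "two wasted"
# watts Rasmussen describes.
print(overhead_watts_per_it_watt(0.30))
```

The same function shows why efficiency gains compound: at 50 percent efficiency the overhead drops to one watt per IT watt.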

Richard Siedzick, director of computer and telecommunications services at Bryant University, uses such features in his data center. “If the temperature rises to a certain level, the AC in that rack will ramp up, and when it decreases, it will ramp down.” The result is a data center climate that few are used to. Instead of being met with an arctic blast at the door, Siedzick says people have told him his data center is too warm. That’s not actually the case: AC economizers help cooling stay where it is needed, rather than where it is not. And that means increased efficiency and monetary savings. “We estimate we've seen a 30 percent reduction in energy [in part, due to more efficient cooling] and that translates into $20,000.” Siedzick says other precision controls, such as humidity sensors, are used in the data center as well.
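Siedzick’s figures imply a rough baseline worth checking against your own utility bill; the calculation below simply works backward from the 30 percent reduction and $20,000 savings he cites.

```python
# Back-of-envelope on the Bryant University figures: a 30 percent
# reduction worth $20,000 implies a baseline energy bill of roughly
# $67,000 a year.

reduction = 0.30      # fraction of energy spend eliminated
savings = 20_000      # dollars saved per year

baseline = savings / reduction
print(f"Implied baseline energy spend: ${baseline:,.0f}/yr")
print(f"Spend after retrofit: ${baseline - savings:,.0f}/yr")
```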

3. Place equipment in the right spot.

Most data center floors are raised and tiled. Vented tiles, which provide ventilation from beneath the floor, should be located near the air inlets of IT equipment, not near the exhaust. Since the exhaust areas (where hot air comes out) run hotter than the inlets, placing vented tiles correctly makes the AC units run more efficiently. Also, make sure you have the right number of vented tiles in your data center: too many or too few, and efficiency goes down.

4. Mind the gaps. Eliminate nooks and crannies.

Many racks in data centers contain gaps, either as a result of extra space or equipment that has been removed. Whatever the reason, gaps make airflow unnatural, and that’s bad for efficiency. “The exhaust air can go back through the intakes of the equipment, which makes you have to run the AC colder,” says Rasmussen. The answer: blanking panels. Installing these panels in server rack cabinets is a simple way to make the airflow in a data center more efficient.

Many people forget to install blanking panels, even though server manuals from OEMs mandate their use. But Rasmussen says they are inexpensive (sold 100 to the box, in some cases) and easy to install.

5. Can it get hotter in here?

Once you’ve done everything listed above, check to see if you can run the air-conditioning at a higher temperature. Rasmussen says that most units are set at around 55 degrees Fahrenheit, and some get as low as 45 degrees. The lower the temperature, the less efficient your data center is. “You should run that AC as hot as you can without the servers overheating,” Rasmussen says. He says 68 degrees is a good target, but unless you are operating a brand-new data center with a top-notch design, you are unlikely to hit such a number.

If you follow the rules above, Rasmussen says it’s likely you can increase the temperature to 55 or 60 in a less-than-new building.
