Companies looking for a green data center model should take a look at the new facility Internap built in Somerville, Mass., just outside Boston, which is so environmentally efficient the local power company wrote it a rebate check for $453,000.
A renovated warehouse that most recently housed a 5,000-member church, the colocation facility is optimized to economically meet the cooling, humidity and power demands common to all data centers.
Internap expects the data center to save another $400,000 every year by using less power than it would have without the green design, says Mike Higgins, vice president and general manager of Internap's data center services. That helps the company's bottom line and keeps down the rates it charges customers, he says.
To meet increasing demand for power in data centers, the facility is designed to deliver an average of 150 watts per square foot, a significant boost over the 60 watts per square foot that was the widely used specification a few years ago.
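As a back-of-the-envelope illustration, the square footage and watts-per-square-foot figures cited in this article translate into total design capacity as follows (the calculation itself is illustrative, not Internap's):

```python
# Rough comparison of design power density, using the article's figures:
# 16,300 sq ft of usable data-center space, 150 W/sq ft vs. the older
# 60 W/sq ft spec. Illustrative arithmetic only.

USABLE_SQFT = 16_300   # usable data-center space built so far
OLD_DENSITY_W = 60     # watts per square foot, older spec
NEW_DENSITY_W = 150    # watts per square foot, this facility

old_kw = USABLE_SQFT * OLD_DENSITY_W / 1000
new_kw = USABLE_SQFT * NEW_DENSITY_W / 1000

print(f"60 W/sq ft design:  {old_kw:,.0f} kW")   # 978 kW
print(f"150 W/sq ft design: {new_kw:,.0f} kW")   # 2,445 kW
```

That 2.5x jump in deliverable power is what drives the heavy emphasis on cooling efficiency described below.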
Data center power consumption in the United States doubled between 2000 and 2006, according to the Environmental Protection Agency, so paying attention to efficient use of power helps address wider concerns about the ability to provide enough electricity to meet general demand. Gartner says that by 2011, power consumption could double again from 2005 levels if businesses don't take steps to make data centers more efficient.
Efficiency and reliability at the Internap data center are key to Carbonite, the first customer to rent space in the facility. Carbonite is a venture-backed start-up that sells cloud-based backup services to consumers and soon to small businesses.
The design of the Internap site gave Carbonite confidence that its environmental needs – power, cooling, humidity – would be met, says Rob Rubin, vice president of engineering for Carbonite. Heat and humidity have a direct impact on how long disk drives last, Rubin says; the better those factors are controlled, the longer the company's storage disks survive. That has a direct impact on Carbonite's bottom line.
The 45,000-square-foot Internap building has 16,300 square feet of usable data-center space built so far and has a separate section ready to build out into a second space of the same size, according to Karl Robohm, principal with Transitional Data Services, which consulted on the project.
One key to efficiency was deciding well ahead of time what equipment to install and only then turning building design over to architects and engineers, says Robohm. That meant taking the time to find which gear would best serve efficiency goals. “We didn't pay any more money, we just went out and did some research,” he says.
The first step was finding the six rooftop cooling units that keep the data center space at 70 degrees Fahrenheit, plus or minus 2 degrees. Robohm looked at gear from six vendors but found only one whose units met the rebate specifications of the power company, NSTAR, which pays rebates to customers that use energy-saving infrastructure in new construction.
Internap saved money by buying the units itself rather than having the contractor buy them at a markup, he says. That also ensured that they would be on hand so construction wouldn't grind to a halt waiting for them to be delivered.
The units, which have traditional cooling coils, were installed with the idea of also adding an alternative water-cooled chiller plant later, says John Willard, president of Complete Energy Solutions, another consultant on the project. That way Internap can use whichever cooling method is more efficient given the outside temperature. “It's green today, but it can get even greener in the future,” Willard says.
The simple choice of roof color saves money as well. A white roof reflects more sunlight, so it stays cooler and doesn't radiate heat into the building that would then have to be pumped back out by the cooling system.
Also in the cooling mix are enormous vents called economizers that can let in outside air as part of the effort to keep temperatures down in the data center space. The economizer can also be used to eject hot air directly out of the building rather than cooling it through the rooftop units. But proper use of economizers requires careful calculations, Robohm says.
Cool outside air has relatively low humidity, and low humidity encourages the buildup of static electricity that can wipe out servers. So use of the economizers has to be coordinated with humidification inside the building to keep static charges down, Willard says.
To handle this calculation, the facility has an energy management system (EMS) that he calls the brains of the building. The system's sensors measure inside and outside temperature, inside and outside relative humidity and air pressures within the cooling ducts and the space below the data center floors where the cool air is delivered. “You have to have the whole building thinking as one,” Willard says.
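The article doesn't describe the EMS's actual control logic, but the coordination it describes – open the economizer only when outside air is cooler than the setpoint, and run humidifiers when that air is too dry – could be sketched roughly like this. The function name, thresholds and rules below are hypothetical illustrations, not Internap's system:

```python
# Simplified sketch of an economizer decision, loosely modeled on the
# coordination described in the article. All thresholds and the function
# itself are hypothetical, not the facility's actual EMS logic.

def use_economizer(outside_temp_f: float,
                   outside_rh_pct: float,
                   setpoint_f: float = 70.0,
                   min_rh_pct: float = 35.0) -> tuple[bool, bool]:
    """Return (open_economizer, run_humidifiers).

    Outside air is only useful for free cooling when it is colder than
    the setpoint; if that air is also dry, humidifiers must run to keep
    indoor relative humidity above the static-discharge floor.
    """
    open_economizer = outside_temp_f < setpoint_f
    run_humidifiers = open_economizer and outside_rh_pct < min_rh_pct
    return open_economizer, run_humidifiers

# A cold, dry winter day: free cooling, but humidification is needed.
print(use_economizer(40.0, 20.0))   # (True, True)
# A hot summer day: keep the vents closed, use mechanical cooling.
print(use_economizer(85.0, 50.0))   # (False, False)
```

A real EMS would weigh many more inputs – duct and underfloor pressures, enthalpy, equipment staging – which is why Willard describes it as the building “thinking as one.”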
Keeping inside humidity from dropping below 40%, plus or minus 5%, falls to ultrasonic humidifiers that generate a cool mist. The conventional alternative is steam canisters that heat water to produce steam that humidifies the air.
The savings of the cool humidification system are dramatic: it uses 93% less energy than the steam gear, Willard says. At another Internap data center, switching from steam humidifiers that drew between 90 and 135 kilowatts to the ultrasonic technology cut that draw to 9 kilowatts, he says.
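The quoted figures are roughly self-consistent: measured against the high end of the steam humidifiers' draw, the 9-kilowatt ultrasonic load works out to about a 93% reduction. A quick check, using only the numbers from the article:

```python
# Consistency check of the quoted humidification savings
# (all kilowatt figures are from the article).

steam_high_kw = 135   # upper end of the steam humidifiers' draw
steam_low_kw = 90     # lower end
ultrasonic_kw = 9     # draw after switching to ultrasonic units

high_end = (steam_high_kw - ultrasonic_kw) / steam_high_kw * 100
low_end = (steam_low_kw - ultrasonic_kw) / steam_low_kw * 100

print(f"Savings vs. high end: {high_end:.0f}%")   # 93%
print(f"Savings vs. low end:  {low_end:.0f}%")    # 90%
```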
The ultrasonic gear doesn't use energy to heat the water; the mist is sprayed out at the temperature the water arrives from the tap. The mist actually absorbs some of the heat generated by electronic gear in the data center, and because it is mist rather than steam, it adds no heat of its own.
To improve the efficiency of humidity control, the entire inside of the building was sealed by plugging up obvious air spaces and spraying a vapor barrier on the walls so moisture doesn't escape through them. “You want the air to go where you want it to go,” Robohm says.
Internap wants the cool air pushed from the roof to a three-foot space below the data center floor, where air pressure forces it up through vents in the flooring. The vents are evenly spaced in rows that allow placement of standard-sized equipment cabinets between them, creating cold aisles.
The front sides of the servers face these cold aisles and their fans draw the cool air across their heat-generating components. The heated air is pushed out the back to hot aisles that have return gratings in the ceiling directly over them. The gratings lead to a six-foot space above the ceiling where the hot air is ejected or sucked into the rooftop units for cooling.
The Internap facility further reduces heat generated in the building through the use of harmonic-mitigating transformers, which run cooler than conventional transformers by canceling the harmonic currents produced by the AC-to-DC power supplies in computing gear.
Even the lighting in the facility is low-power and turns on with motion sensors. If customers want more light for their space within the data center, they can add it themselves. The company plans to install even more efficient lighting that will earn it another rebate from the power company and cut lighting power by 10% to 15%, Willard says.
The overall design makes possible a wide-open data center floor free of the heat- and humidity-control gear that would otherwise generate heat within the space, Robohm says. Freestanding cooling units would also eat up floor space for their footprints plus the buffer zones around them.
Internap is retrofitting some of its other eight data centers with technologies used in the new facility, but it is difficult to make wholesale changes in operating data centers full of customer gear, Higgins says. The lessons learned in Somerville will guide the company as it builds future sites, he says.