July 2, 2010
Vol. 32 Issue 14
Page 43 in print issue
‘Greening’ The Data Center
Add Some Teeth To Energy-Saving Measures
The EPA reported a few years ago that data centers in the United States consumed about 60 billion kWh (kilowatt-hours), or about 1.5% of all U.S. electricity consumption, in 2006. Since then, the EPA says, data center power consumption is on track to double by 2011 to more than 100 billion kWh, for a total energy bill of $7.4 billion annually.
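The EPA figures above hang together arithmetically. A quick back-of-the-envelope check in Python, using only the numbers quoted in the article, shows the totals they imply:

```python
# Back-of-the-envelope check of the EPA figures quoted above.
# The inputs are the article's numbers; the outputs are derived.

dc_2006_kwh = 60e9           # data center consumption in 2006 (kWh)
share_2006 = 0.015           # 1.5% of total U.S. consumption
projected_kwh = 100e9        # projected consumption by 2011 (kWh)
annual_bill = 7.4e9          # projected annual energy bill ($)

# Implied total U.S. electricity consumption in 2006 (kWh)
us_total = dc_2006_kwh / share_2006           # 4 trillion kWh

# Implied average electricity price behind the $7.4 billion figure
price_per_kwh = annual_bill / projected_kwh   # about $0.074/kWh

print(f"Implied U.S. total (2006): {us_total / 1e12:.1f} trillion kWh")
print(f"Implied average price: ${price_per_kwh:.3f}/kWh")
```

The implied 4 trillion kWh national total and roughly 7-cent average electricity price are both in line with U.S. figures from that period, so the projection is internally consistent.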
However, data center managers are not powerless to reverse this trend and are often in the position to use technology to slash power consumption while boosting server and network efficiencies. Here are some ways to make your data center more environmentally efficient, which, if applied on an industry-wide scale, would offset the EPA’s less-than-optimistic predictions about data center power consumption in the United States. In addition to covering ways to reduce power, some of the tips show how data centers can be more self-contained or reduce their environmental footprints in other ways.
Some new applications that are supposed to make data centers “greener” might seem unusual at the outset, but they offer promising results and could merit serious consideration in the future. For example, HP has experimented with using energy released by cow manure to power large data centers. The HP researchers say a farm of 10,000 dairy cows can produce enough manure to power a 1-megawatt data center as well as meet other power requirements for the farm.
The researchers have also demonstrated that heat released from data centers can be used, when combined with cow manure, to generate methane gas, which in turn can serve as a potential power source. The ability to apply the excess heat from the data center to manure to produce more energy in itself represents “the key thing” compared to just using cow manure for data center energy, says Bill Kosik, energy and sustainability director for critical facilities services at HP (www.hp.com). “This is like the Holy Grail in data centers for what to do with the waste heat,” Kosik says.
Still, it will be a while before farmers and data centers have access to affordable infrastructure to make such a project happen, Kosik says. “To a certain degree, this system does not represent a huge practicality yet,” he says.
Another alternative energy source that can be applied in the future is geothermal energy from volcanic activity, which powers turbines that in turn generate electricity for the data center; this is already being done in Iceland, Kosik says.
“There are no fossil fuels used or CO2 emissions [associated with geothermal energy],” Kosik says. “There are also huge geothermal resources all over the United States.”
Use CFD Modeling
Although it’s often considered an advanced tool for many small to medium-sized enterprise data centers, CFD (computational fluid dynamics) modeling can go a long way to ensure that hot and cold air channels are flowing in the directions they should so that ultimately more efficient airflow management will translate into power savings and more ecologically friendly practices. “A cooling assessment, specifically a [CFD] analysis, can help data center managers leverage the current cooling capabilities and predict future data center requirements,” says Ben Kissell, service solutions manager at Emerson Network Power’s Liebert Services (www.liebert.com). “A CFD can pinpoint the airflow’s stream throughout the data center as well as locate hot spots and other cooling challenges. Using the data supplied by the CFD model, data center managers can also identify key strategies such as redundancy or high-density cooling to ensure optimum data center availability.”
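Commercial CFD packages solve full coupled airflow and heat-transfer equations, which is well beyond a magazine sidebar. But the core idea Kissell describes, iteratively solving a field over the room to find where heat concentrates, can be illustrated with a drastically simplified sketch. The grid size, temperatures, and tolerance below are illustrative assumptions only, not values from any CFD product:

```python
# A toy illustration of the iterative field-solving at the heart of
# CFD tools. Real CFD packages model airflow as well as heat; this
# sketch only relaxes a 2D steady-state temperature grid (Laplace's
# equation) with one fixed "hot spot" and cool walls. All values are
# illustrative assumptions.

def relax_temperature(n=20, hot=95.0, wall=68.0, tol=1e-3):
    # Initialize an n x n room held at the wall temperature.
    grid = [[wall] * n for _ in range(n)]
    hot_cell = (n // 2, n // 2)        # a rack acting as a heat source
    grid[hot_cell[0]][hot_cell[1]] = hot

    while True:
        max_change = 0.0
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                if (i, j) == hot_cell:
                    continue           # the heat source stays fixed
                # Each cell relaxes toward the average of its neighbors.
                new = (grid[i - 1][j] + grid[i + 1][j] +
                       grid[i][j - 1] + grid[i][j + 1]) / 4.0
                max_change = max(max_change, abs(new - grid[i][j]))
                grid[i][j] = new
        if max_change < tol:           # converged to steady state
            return grid

field = relax_temperature()
# Cells near the hot spot settle warmer than cells near the walls,
# which is the kind of hot-spot map a real CFD assessment produces.
```

A real assessment adds airflow velocity, rack layouts, and perforated-tile placement on top of this kind of solver, which is why it can pinpoint hot spots and cooling challenges before they become outages.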
Turn Down The AC
For many data centers, cooler means better. However, walking into a data center that is kept too cool probably represents a waste of energy. Instead, running data centers at temperatures close to maximum thresholds translates into a much more ecologically friendly alternative—and it also leads to huge power savings. Data center managers also have more leeway to raise temperatures because ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) maximum temperature ranges are higher than what they were a couple of years ago. “Many data centers are overcooled where ambient temperatures well below 75 F are maintained,” says Nik Simpson, an analyst for the Burton Group. “ASHRAE recommends something closer to 80 F.”
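The payoff from raising the setpoint can be estimated with a simple rule of thumb. The roughly-4%-of-cooling-energy-per-degree-Fahrenheit figure used below is an assumption for illustration, not a number from the article; actual savings depend heavily on the facility and its cooling plant:

```python
# Rough estimate of cooling-energy savings from raising the setpoint.
# The 4%-per-degree-F rule of thumb is an illustrative assumption,
# not a figure from the article; real savings vary by facility.

def cooling_savings(current_f, target_f, pct_per_degree=0.04):
    """Estimated fraction of cooling energy saved by raising the setpoint."""
    degrees_raised = target_f - current_f
    return degrees_raised * pct_per_degree

# Raising an overcooled 75 F room toward the ~80 F ASHRAE guidance:
savings = cooling_savings(75, 80)
print(f"Estimated cooling-energy savings: {savings:.0%}")
```

Even under more conservative assumptions than the 4% figure, a five-degree bump is a meaningful cut to one of the largest line items in a data center's power bill.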
Check The UPS Units
It may turn out that a data center’s UPS units are consuming far more power than they should. Replacing and/or adding UPS units can go a long way to save energy, thus dramatically reducing a data center’s environmental footprint.
“Old UPS equipment may be contributing as much as 20% to your power bill, particularly if the UPS is 10 or more years old,” Simpson says. “Monitoring the power distribution network to understand where the losses are is important. For example, there’s no sense in spending $50,000 to put in more efficient PDUs to gain 3% when you could upgrade the UPS and gain 15%.”
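Simpson's point about prioritizing the biggest loss is easy to put in dollar terms. The annual power bill below is a hypothetical figure chosen for illustration; the 3% and 15% gains are the ones he cites:

```python
# Comparing the two upgrades Simpson describes: $50,000 of more
# efficient PDUs for a 3% gain versus a UPS upgrade for a 15% gain.
# The annual power bill is a hypothetical figure for illustration.

annual_power_bill = 500_000.0    # hypothetical facility power bill ($)

pdu_gain = 0.03                  # efficiency gain from new PDUs
ups_gain = 0.15                  # efficiency gain from a UPS upgrade

pdu_savings = annual_power_bill * pdu_gain   # dollars saved per year
ups_savings = annual_power_bill * ups_gain   # dollars saved per year

print(f"PDU upgrade saves ${pdu_savings:,.0f}/year")
print(f"UPS upgrade saves ${ups_savings:,.0f}/year")
```

On that hypothetical bill, the UPS upgrade returns five times the savings of the PDU project, which is exactly why monitoring the distribution network to find the losses should come before spending.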
by Bruce Gain
Best Tip: Use Nature’s Air
It sounds almost too good to be true, but for some data centers, ambient air coolers and humidifier controls are not even necessary during most of the year when outside air can be used for cooling. During cold days in the fall and winter in most areas in the United States, for example, why not just devise a system to pipe in cool air from outside to maintain temperatures and humidity levels within the ASHRAE standard guidelines? Such a scenario offers major environmental benefits, of course, in addition to electricity costs saved.
The possibility of using outside air is increasingly feasible, while data centers exist today that are designed to run 90% of the year with no mechanical cooling, says Bill Kosik, energy and sustainability director for critical facilities services at HP (www.hp.com). “This is a next wave [in data center cooling],” he says. “The [outside] temperatures, as well as the humidity outside, are a big, big part of saving energy for cooling.”
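The control logic behind an air-side economizer of the kind Kosik describes is straightforward: admit outside air whenever it falls within an acceptable envelope, and fall back to mechanical cooling otherwise. The temperature and humidity bounds below only approximate the ASHRAE recommended ranges and are assumptions for illustration:

```python
# Sketch of an air-side economizer decision: use outside air when it
# falls within an acceptable envelope, otherwise run mechanical
# cooling. The bounds approximate the ASHRAE recommended ranges and
# are illustrative assumptions, not a published control spec.

def use_outside_air(temp_f, relative_humidity):
    """Return True if outside air alone can cool the data center."""
    temp_ok = 65.0 <= temp_f <= 80.0
    humidity_ok = 20.0 <= relative_humidity <= 60.0
    return temp_ok and humidity_ok

# A cool, dry fall day qualifies; a hot, humid afternoon does not.
print(use_outside_air(68, 45))   # True
print(use_outside_air(92, 70))   # False
```

A production economizer also weighs enthalpy and dew point rather than dry-bulb temperature alone, but the decision structure is the same.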
Most Practical Tip: Put Memory Modules Back Into The Market
Memory modules are often replaced and discarded. Unfortunately, many modules end up in landfills and add to environmental waste. However, as an alternative, memory modules can be sold to third parties that then sell the components for further use. “Data centers are always upgrading the memory modules in the server racks, and the excess memory that is left over may sit in a closet [or end up later in landfills]. If that memory is sold as a [working component], it can be reused, and the money from the sale could be used for projects lacking budgets at the corporation,” says Jeff Bittner, president of SMS Memory Module Assembly (www.smsassembly.com). “This is a greener way of disposing of the memory. If [one] waits too long, the memory will become obsolete and end up in a landfill or recycled before its time.”
Bonus Tips
Use point cooling for hot spots. Instead of blindly cooling the entire data center to keep the hot spots below a certain range, targeted cooling is much more environmentally efficient. “There is a tendency to address hot spots by setting the overall temperature lower,” says Nik Simpson, an analyst for the Burton Group. “It’s better to introduce in-row/in-rack cooling solutions to deal with hot spots and leave the overall temperature at a higher setting.”
Use liquid cooling. Though liquid cooling is nothing new and direct liquid cooling of racks might be difficult to implement, at the end of the day, it is a more environmentally friendly way to cool racks. “Water or other refrigerant liquids are much more effective at removing heat than air is,” Simpson says.
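Simpson's claim about liquids comes down to volumetric heat capacity: per unit volume, water absorbs far more heat than air for the same temperature rise. The calculation below uses standard physical constants for water and room-temperature air:

```python
# Why liquids beat air at heat removal: volumetric heat capacity
# (density * specific heat) compares how much heat a unit volume of
# each fluid absorbs per degree of temperature rise. Constants are
# standard textbook values at room temperature.

air_density = 1.2        # kg/m^3
air_cp = 1005.0          # J/(kg*K), specific heat of air
water_density = 1000.0   # kg/m^3
water_cp = 4186.0        # J/(kg*K), specific heat of water

air_vol_cap = air_density * air_cp         # J per m^3 per K
water_vol_cap = water_density * water_cp   # J per m^3 per K

ratio = water_vol_cap / air_vol_cap
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume than air")
```

That several-thousandfold advantage is why a modest chilled-water loop can replace enormous volumes of moving air, even though plumbing a rack for direct liquid cooling is harder to retrofit.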