November 23, 2007
Vol. 29 Issue 47
Page(s) 26 in print issue
Cooling Down The Data Center
A Multipronged Approach Will Get The Job Done
Component makers continue to develop more energy-efficient processors, power supplies, and other components that let servers consume less power and run cooler. Yet server thermal problems aren't going away anytime soon. Server downtime, data loss, and destroyed components due to overheating are potential nightmares that IT admins will continue to face.
SMEs also often face stricter cost constraints than larger companies when it comes to investing in cooling solutions. Yet SMEs cannot afford to skip proper cooling, either; thermal issues can destroy equipment and data. The good news is that there are several steps and technologies SMEs can use to keep data center thermal issues from getting out of control.
"Maintaining a suitable environment for information technologies is arguably the No. 1 problem facing data center and computer room managers today," says Michael Petrino, vice president of PTS Data Center Solutions (www.ptsdcs.com). "Dramatic and unpredictable critical load growth has levied a heavy burden on the cooling infrastructure of these facilities, making intelligent, efficient design crucial to maintaining an always-available data center."
A Cool Environment
It's a simple concept: Server room temperature control largely involves directing cool and warm air where each should go. Cool air needs to feed the servers' components, while hot air needs to be piped out of the server room. CRAC (computer room air-conditioning) systems should supply the cool air fed into the data center and carry away the hot air removed from it.
One way to channel cool air efficiently is the hot aisle/cold aisle approach, in which cold and hot airflow alternates between server rows. With this system, cooled air is piped into the cold aisles. The servers are positioned so their front air intakes face the cold aisles, while the backs of the servers, where fans expel the hot air, face the hot aisles, from which the heated air is eventually removed from the data center through the CRAC.
For many admins, especially those with several rows of rack servers, hot aisle/cold aisle systems are a fundamental element of their data center environment. "[Hot aisle/cold aisle layouts] are not only useful, they are essential to the proper and efficient cooling of servers and equipment in data centers, regardless of the data center design," says Jeff Lowenberg, vice president of facilities for The Planet (www.theplanet.com), which offers IT infrastructure services for Web hosting and other applications.
Hot aisle/cold aisle layouts can also accommodate data centers that do not necessarily have the latest cooling systems in place, Petrino says. "[Hot aisle/cold aisle layouts] work with traditional perimeter and row cooling, as well as newer in-row cooling deployments," he says. "This approach also applies in raised-floor and nonraised-floor environments."
But even with an ultra-efficient hot aisle/cold aisle layout in place, precious cool air is often misdirected or wasted. This wasted cool air, known as bypass airflow, is a very common data center problem. One way to mitigate it is to install blanking panels and seal cable outlets.
"[Bypass airflow] can be found at every cable penetration through the raised floor that is not properly sealed with a brush-grommet," Petrino says. "Blanking panels offer separation to cause the cold air to remain in the cold aisle and to reduce air mixing."
Directing airflow so that the hottest air is channeled back to the air inlet of the CRAC units does a lot to boost the overall efficiency of a data center's cooling processes, Lowenberg says. "Sealing all areas—really anything and everything—to eliminate bypass airflow ensures all cold air is directed into the cold aisles, which is the only place in the data center it is needed," he adds.
The Monitoring Factor
A monitoring system should serve as a key indicator of the efficiency of a data center's airflow and the temperature of the servers' intake and exhaust airflow. Lowenberg says his firm measures the air inlet temperature of its servers and adjusts the CRAC units so the air temperature is no colder than necessary.
"The [CRAC unit] settings are vital. In a data center, the most important measurement is the temperature of the air entering the servers, so temperature monitoring and control are essential," Lowenberg says. "If it's too warm, equipment will not function properly, which leads to increased failures and shortens the life expectancy of the equipment. If it's too cold, our company is wasting money."
Once a system is in place that monitors each row and cabinet cooling zone, it is important to note that server temperatures are by no means static, which makes monitoring an ongoing process. "A given is that once effective cooling performance is established for a particular load profile, it will change rapidly," Petrino says. "It is important to compile trending data for all environmental parameters for the site such that moves, adds, and changes can be executed quickly."
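The monitoring logic described above, tracking per-zone inlet temperatures, watching trends rather than single readings, and flagging air that is too warm (reliability risk) or too cold (wasted money), can be sketched in a few lines. This is a minimal illustration, not any vendor's product; the zone names, thresholds, and window size are hypothetical, and real setpoints should come from your equipment's specified inlet range.

```python
from collections import deque
from statistics import mean

# Hypothetical thresholds; actual setpoints depend on the equipment's
# specified inlet range (the article's point: no colder than necessary).
TOO_WARM_F = 80.0   # above this, reliability and equipment life suffer
TOO_COLD_F = 65.0   # below this, cooling money is being wasted
WINDOW = 5          # readings retained per zone for trending

history = {}        # zone name -> recent inlet-temperature readings

def record(zone, temp_f):
    """Log one inlet-temperature reading; return an alert string or None."""
    readings = history.setdefault(zone, deque(maxlen=WINDOW))
    readings.append(temp_f)
    avg = mean(readings)   # trend over the window, not a single spike
    if avg > TOO_WARM_F:
        return f"{zone}: avg inlet {avg:.1f}F too warm - check CRAC/airflow"
    if avg < TOO_COLD_F:
        return f"{zone}: avg inlet {avg:.1f}F too cold - raise the setpoint"
    return None

# Simulated readings for two hypothetical cabinet cooling zones
for temp in (75, 78, 81, 84, 86):
    alert = record("cold-aisle-A", temp)   # trending upward
for temp in (70, 71, 70, 69, 70):
    record("cold-aisle-B", temp)           # stable, no alert

print(alert)  # cold-aisle-A's trailing average has drifted past the warm limit
```

Averaging over a window rather than alerting on each reading is one simple way to build the trending data Petrino describes, so that a brief spike during a move, add, or change doesn't trigger a false alarm.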
Hunting Dust Bunnies
Proper data center cleaning, which removes particles and contaminants, is necessary to prevent a meltdown of server components. Left unchecked, dust particles from the floor and elsewhere will eventually blanket components and trap heat inside servers. Dust buildup also clogs the data center's air filters, blocking heat from escaping. But you don't want to rely on your building's custodian to clean the server room with a mop and bucket. This is where specialized data center cleaning service providers come into play.
Removing dust and other contaminant buildup does not involve much more than selecting a competent and reliable service provider; it is simply a matter of scheduling regular data center cleaning by a qualified third party.
Kevin Vickery, president of data center cleaning service provider ProSource Mission Critical Services (www.team-prosource.com), says, "As heat obviously continues to be a major issue, it does tie in to how people should be [using professional services] for data center cleaning, while those who are not yet doing it should be doing it."
by Bruce Gain
Cool Products |
Atlas Sales & Rentals
The Classic and Office Pro cooling products accommodate capacities from 1 to 5 tons and are geared for computer rooms as well as offices. Some models offer 18,000 BTUs of cooling power.
(510) 713-3313; www.atlassales.com
The EJ-CF-168 offers 80mm cooling fans spinning at 3,500rpm for heat dissipation; its noise level is 35dBA, and it draws 450mA at 5V.
The Websensor EM01B, which connects directly to an Ethernet network, monitors temperature and relative humidity in server rooms; it can also check battery supply voltages of UPS units.
Rackmount Solutions offers a temperature and overheating monitor that managers can read either remotely or locally; the system can be mounted separately or built in to a 19-inch rackmount filler panel.
(972) 272-6631; www.rackmountsolutions.net
Cooling Tips |
Think about a hot aisle/cold aisle layout as soon as rackmount servers are installed in a data center.
Mitigate bypass airflow by installing blanking panels and sealing cable outlets.
Installing a proper monitoring system is crucial for gauging the efficiency and temperature of the servers' intake and exhaust airflow.
Data centers require professional server room cleaning services at least once a year to prevent components from overheating due to dust and contaminant buildup.