The data center industry is getting hotter in more ways than you may think. “Cooling can require up to nearly 40% of a data center's total power load,” observes Dave Sterlace, strategic account manager, data centers, for power systems firm Hitachi Energy. “It makes a great target for a more sustainable approach.”
Most data centers rely on mechanical air cooling, such as fans, refrigeration systems, and dehumidifiers, to keep their hardware resources from overheating. “This is unsustainable from an environmental, cost, and performance perspective,” says Joe Capes, chief executive officer of cooling systems developer LiquidStack.
As much as half of a data center's total power use can come from air cooling alone, Capes notes. “Rising energy costs make this [approach] increasingly expensive, and the powerful processors required for today's data-intensive technologies … simply generate too much heat for air cooling to handle.”
Forced air isn't particularly efficient at removing heat, and is limited to about 25 kilowatts per rack, Sterlace says. The approach has worked well so far, but with the arrival of artificial intelligence and other high-performance computing technologies with design loads of up to 75 kilowatts per rack, it makes sense to consider a better heat conductor. “The biggest advance we see coming in cooling is to go to liquid cooling media,” he says.
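A back-of-envelope heat-balance calculation shows why air struggles at those densities. The sketch below (assumed textbook constants and an assumed 10 °C air temperature rise, not figures from the article) estimates the airflow a rack would need at 25 kW versus 75 kW:

```python
# Airflow needed to carry away rack heat with air:
# Q = rho * cp * V_dot * dT  =>  V_dot = Q / (rho * cp * dT)
RHO_AIR = 1.2    # kg/m^3, air at roughly 20 C (assumed)
CP_AIR = 1005.0  # J/(kg*K), specific heat of air (assumed)

def airflow_m3s(rack_kw: float, delta_t_c: float) -> float:
    """Volumetric airflow (m^3/s) required to remove rack_kw of heat
    when the air warms by delta_t_c degrees passing through the rack."""
    return rack_kw * 1000.0 / (RHO_AIR * CP_AIR * delta_t_c)

for kw in (25, 75):
    flow = airflow_m3s(kw, delta_t_c=10.0)
    print(f"{kw} kW rack: {flow:.1f} m^3/s (~{flow * 2119:.0f} CFM)")
```

Tripling the rack load triples the required airflow, and fan power grows faster than flow, which is the practical ceiling Sterlace describes.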
Liquid immersion cooling, which lowers hardware heat by submersion in a thermally conductive dielectric liquid, will replace air cooling in many data centers by necessity, Capes predicts. “This will significantly reduce carbon emissions and water usage, while enabling the high performance compute our digital society requires -- all at a lower cost.”
Liquid immersion isn't the only alternative to air cooling. Depending on a data center's size, load, and location, options include direct evaporative and direct-to-chip cooling technologies. Both typically yield a lower annualized power usage effectiveness (PUE) -- the ratio of total facility power to IT equipment power -- than traditional cooling systems, says Sean O'Shea, an associate principal at engineering firm Syska Hennessy.
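PUE itself is a simple ratio, and a quick sketch makes the cooling-overhead argument concrete. The facility and IT loads below are illustrative assumptions, not measurements from any operator quoted here:

```python
# Power Usage Effectiveness: total facility power / IT equipment power.
# 1.0 is the theoretical ideal; cooling and power-distribution overhead
# push real facilities above it.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the PUE ratio for a given facility load and IT load."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# If cooling nearly matches the IT load (the air-cooled worst case
# Capes describes), PUE approaches 2.0:
print(pue(total_facility_kw=1900, it_equipment_kw=1000))  # 1.9
# A more efficient cooling plant on the same IT load:
print(pue(total_facility_kw=1200, it_equipment_kw=1000))  # 1.2
```

Lowering annualized PUE means the same IT work is done with less total energy drawn from the grid.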
Direct evaporative cooling systems are most often used in large-scale data centers. “They control air supply temperature and humidity, serving IT equipment cabinets by using evaporative pads -- mesh fed by a water supply,” O'Shea explains.
Direct-to-chip cooling technology absorbs heat at each server's CPU via a metal plate attached to the top of the chip. “The heat is transferred from the chip to the plate via a thermal material,” O'Shea says. Water or a dielectric fluid is circulated through the metal plates, absorbing the chip's heat and ejecting it to the atmosphere via cooling towers, dry coolers, or other means. “Since this method cools the chip only, additional air cooling for the remaining heat is required.”
Reducing the amount of energy needed to cool data center equipment is key to advancing environmental sustainability, says Nic Kilby, hosting operations engineer at Zengenti, a web content software management firm.
Zengenti's sophisticated web content management system requires storing large amounts of data. “Just recently, we've undertaken a project that involved future-proofing our data center, based at our headquarters in Ludlow (England), to make it run most efficiently now and also in the future,” Kilby says. “By cooling our on-site data center efficiently, we've managed to cut electricity use by 25% to date -- we hope to continue to improve as technology develops.”
Kilby notes that enterprises can achieve similar results through various means, including using efficient cooling systems, implementing hot and cold aisle containment, and tapping into outside air for natural cooling. “It's all about finding what works best for your data center, taking into account location and temperature,” he explains. “We’ve made the most of the cool, windy Shropshire weather.”
It sounds counterintuitive, yet a growing number of data center operators have learned that simply raising the data hall temperature can yield lower energy usage with no impact on IT performance, Sterlace says.
The data center community is beginning to move above the operating temperatures recommended by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), notes Drew S. Thompson, associate vice president, data centers and mission critical facilities solutions, at engineering firm Black & Veatch.
Facebook parent Meta, for example, has raised its inlet cooling supply temperature to 95 degrees Fahrenheit, Thompson says. “The bigger the delta operating temperature -- the difference between the cooling supply temperature and the return temperature -- the more energy saved,” he says. Pushing the heat envelope even further, IT server manufacturers are preparing to offer high temperature-tolerant servers as the demand for such hardware increases, Thompson adds.
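The delta-T argument follows directly from the heat-transfer equation: for a fixed heat load, the coolant flow required scales inversely with the temperature difference, and moving less coolant means less fan or pump work. The sketch below assumes a chilled-water loop and an illustrative 500 kW load; none of the numbers come from Meta or the article:

```python
# For a fixed load Q: Q = m_dot * cp * dT  =>  m_dot = Q / (cp * dT)
CP_WATER = 4186.0  # J/(kg*K), specific heat of water (assumed)

def water_flow_kg_s(load_kw: float, delta_t_c: float) -> float:
    """Mass flow of water (kg/s) needed to absorb load_kw of heat
    while warming by delta_t_c degrees."""
    return load_kw * 1000.0 / (CP_WATER * delta_t_c)

# Same 500 kW load: doubling the delta-T halves the required flow.
print(round(water_flow_kg_s(500, 6), 1))   # ~19.9 kg/s
print(round(water_flow_kg_s(500, 12), 1))  # ~10.0 kg/s
```

Since pump power in an ideal loop falls roughly with the cube of flow, even a modest widening of the delta-T compounds into substantial energy savings.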
There's no silver bullet when choosing an effective data center cooling technology. “Each data center and client is different,” Thompson says. “Also, typically, when a cooling technology saves energy it (often) uses more water, so true sustainability would need to look at all the resources used.”
As enterprises design next-generation energy-efficient data centers, they will need to involve all relevant stakeholders, including the data center IT management team, the facilities team, design engineers, and general contractors, Thompson says.