AI Day: HPE with 100% Fanless Direct Liquid Cooling Systems Architecture

Delivers benefits including a 37% reduction in cooling power required per server blade compared to hybrid direct liquid cooling alone.

Hewlett Packard Enterprise (HPE) announced the industry's first 100% fanless direct liquid cooling systems architecture to enhance the energy and cost efficiency of large-scale AI deployments.

The company introduced the innovation at its AI Day, held at one of its state-of-the-art AI systems manufacturing facilities. During the event, the firm showcased its expertise and leadership in AI across enterprises, sovereign governments, service providers and model builders.

Industry’s first 100% fanless direct liquid cooling system
While the efficiency of next-gen accelerators has improved, power consumption continues to rise with AI adoption, outstripping traditional cooling techniques.

Organizations running large AI workloads will need to do so more efficiently. The most effective way to cool next-gen AI systems is through direct liquid cooling, of which HPE is a pioneer. This critical cooling technology has enabled the firm's systems to deliver 7 of the top 10 supercomputers on the Green500 list, which ranks the world's most energy-efficient supercomputers.

Building on this expertise, the company's 100% fanless direct liquid cooling architecture brings the cost and energy efficiency benefits that sovereign AI deployments are already enjoying to a broader set of organizations building large-scale generative AI.

"As organizations embrace the possibilities created by generative AI, they also must advance sustainability goals, combat escalating power requirements, and lower operational costs," said Antonio Neri, president and CEO, HPE. "The architecture we unveiled today uses only liquid cooling, delivering greater energy and cost-efficiency advantages than the alternative solutions on the market. In fact, this direct liquid cooling architecture has the potential to yield a 90% reduction in cooling power consumption as compared to traditional air-cooled systems. HPE's expertise deploying the world's largest liquid-cooled IT environments and our market leadership spanning several decades put us in an excellent position to continue to capture AI demand."

The system architecture is built on 4 pillars:

  • 8-element cooling design that includes liquid cooling for the GPU, CPU, full server blade, local storage, network fabric, rack/cabinet, pod/cluster and coolant distribution unit (CDU)
  • High-density and performance system design, complete with rigorous testing, monitoring software, and on-site services to support successful deployment of these sophisticated compute and cooling systems
  • Integrated network fabric design for massive scale integrating lower-cost and lower-power connections
  • Open system design to offer flexibility of choice in accelerators

The 100% fanless direct liquid cooling architecture delivers benefits including a 37% reduction in cooling power required per server blade compared to hybrid direct liquid cooling alone. This reduces utility costs, carbon emissions and data center fan noise. In addition, because systems using this architecture can support greater server cabinet density, they consume less floor space.
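For context, here is a minimal sketch of the arithmetic behind the two cited figures, the 37% per-blade saving versus hybrid direct liquid cooling and the 90% saving versus traditional air cooling. The article gives only percentages, so the baseline wattages below are illustrative assumptions, not HPE specifications.

```python
# Rough arithmetic behind the cited cooling-power savings.
# Baseline figures are hypothetical, chosen only to illustrate the percentages.

def cooling_power_after_reduction(baseline_w: float, reduction: float) -> float:
    """Cooling power remaining after a fractional reduction (e.g. 0.37 = 37%)."""
    return baseline_w * (1.0 - reduction)

# Assumed baselines (not from HPE):
hybrid_dlc_per_blade_w = 200.0    # hybrid direct liquid cooling, per server blade
air_cooled_system_w = 10_000.0    # traditional air-cooled system

# Reductions cited in the announcement:
fanless_per_blade_w = cooling_power_after_reduction(hybrid_dlc_per_blade_w, 0.37)
fanless_system_w = cooling_power_after_reduction(air_cooled_system_w, 0.90)

print(f"Per blade: {hybrid_dlc_per_blade_w:.0f} W -> {fanless_per_blade_w:.0f} W "
      f"(37% less than hybrid direct liquid cooling)")
print(f"System:    {air_cooled_system_w:.0f} W -> {fanless_system_w:.0f} W "
      f"(90% less than traditional air cooling)")
```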

Leadership and market opportunity
At AI Day, Neri; Fidelma Russo, EVP and GM, hybrid cloud, and CTO; and Neil MacDonald, EVP and GM, server, discussed how the HPE portfolio comprises the critical building blocks of networking, storage and hybrid cloud to deliver on the promise of AI.

Resources:
Blog: Showcasing our AI leadership and expertise at HPE AI Day, by Neri
Blog: Liquid cooling: a cool approach for AI, by Jason Zeiler, liquid cooling product manager, HPE
