Hewlett Packard Enterprise (HPE) has unveiled a groundbreaking development in cooling technology for artificial intelligence (AI) systems: a 100% fanless direct liquid cooling architecture. The innovative solution aims to significantly improve the energy and cost efficiency of large-scale AI deployments, addressing the growing data center power consumption driven by advanced AI workloads.
HPE announced the new cooling system during its AI Day, an event hosted at one of its state-of-the-art manufacturing facilities dedicated to AI systems. The event showcased HPE's leadership in AI technology to a global audience of businesses, governments, service providers, and model developers.
The new cooling system builds on HPE's established expertise in energy-efficient supercomputing: seven of the top ten systems on the Green500 list, which ranks the world's most energy-efficient supercomputers, were delivered by HPE. The company's direct liquid cooling technology has played a crucial role in achieving that standing.
Balancing Sustainability with AI Demands
As AI continues to evolve, it demands increasingly powerful hardware to process larger datasets and more complex models. While next-generation accelerators have improved energy efficiency, the sheer scale of AI deployments has outpaced the capacity of traditional cooling methods, making more advanced solutions essential for effective system operation. HPE's direct liquid cooling approach is designed to meet this challenge by reducing the energy and operational costs associated with cooling AI systems, helping organizations balance their sustainability goals with the need for robust AI infrastructure.
HPE's 100% fanless direct liquid cooling system marks a significant step forward in this effort. According to Antonio Neri, President and CEO of HPE, the new design can reduce cooling power usage by up to 90% compared to conventional air-cooled systems. "As organizations embrace the possibilities created by generative AI, they must also advance sustainability goals, combat escalating power requirements, and lower operational costs," said Neri. "Our fanless direct liquid cooling system offers superior energy and cost efficiency compared with alternative systems on the market."
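To give a sense of what a reduction of up to 90% in cooling power could mean at facility scale, the back-of-the-envelope sketch below uses purely hypothetical figures (a 1 MW IT load and a 0.4 cooling-overhead ratio assumed for air cooling); these numbers are illustrative assumptions, not data from HPE's announcement.

```python
# Hypothetical estimate of facility-level savings from cutting cooling
# power by up to 90%. All input figures are illustrative assumptions,
# not HPE-published data.

it_load_kw = 1000.0        # assumed IT load of the AI cluster (1 MW)
air_cooling_ratio = 0.40   # assumed cooling power as a fraction of IT load (air-cooled)

air_cooling_kw = it_load_kw * air_cooling_ratio
fanless_cooling_kw = air_cooling_kw * (1 - 0.90)  # the "up to 90%" reduction claim

# Rough power usage effectiveness (PUE) counting only cooling overhead
pue_air = (it_load_kw + air_cooling_kw) / it_load_kw
pue_fanless = (it_load_kw + fanless_cooling_kw) / it_load_kw

print(f"Air-cooled cooling power:  {air_cooling_kw:.0f} kW (PUE ~ {pue_air:.2f})")
print(f"Fanless DLC cooling power: {fanless_cooling_kw:.0f} kW (PUE ~ {pue_fanless:.2f})")
```

Under these assumed figures, cooling overhead would drop from roughly 400 kW to about 40 kW; actual savings depend on the workload, facility design, and climate.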
The architecture of the new cooling system rests on four key pillars. First, it employs an eight-element cooling design that uses liquid to cool critical components, including the coolant distribution unit (CDU), CPU, GPU, server blades, and more. Second, it incorporates a high-density, high-performance system design that is rigorously tested and monitored to support advanced computing workloads. Third, an integrated network fabric design supports large-scale AI deployments while reducing the power consumed by network links. Finally, an open system architecture gives businesses flexibility in choosing accelerators, enhancing scalability and customization.
The 100% fanless liquid cooling system offers distinct advantages over hybrid cooling solutions. Notably, it reduces the cooling power required per server blade by 37%, cuts noise pollution by eliminating the need for data center fans, and lowers carbon emissions and utility costs. The design also allows for higher server cabinet densities, letting data centers save valuable floor space.
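To put the 37% per-blade figure in context, a similar hypothetical calculation shows how the savings add up across a cabinet. The blade count and baseline per-blade cooling power below are assumptions chosen for illustration, not figures from HPE.

```python
# Hypothetical per-cabinet estimate based on the claimed 37% reduction in
# cooling power per server blade versus hybrid cooling. Blade count and
# baseline wattage are illustrative assumptions.

blades_per_cabinet = 64            # assumed blades in a dense AI cabinet
hybrid_cooling_w_per_blade = 250.0 # assumed cooling power per blade with hybrid cooling

fanless_w_per_blade = hybrid_cooling_w_per_blade * (1 - 0.37)
saved_kw_per_cabinet = blades_per_cabinet * (hybrid_cooling_w_per_blade - fanless_w_per_blade) / 1000

print(f"Fanless cooling per blade: {fanless_w_per_blade:.0f} W")
print(f"Savings per cabinet:       {saved_kw_per_cabinet:.2f} kW")
```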
As the use of AI grows, HPE's fanless liquid cooling system offers a forward-looking solution that not only meets AI's rising computing demands but also supports environmental targets, making it a fitting choice for businesses looking to deploy AI effectively at scale.