
AWS announces liquid cooling for AI servers ahead of re:Invent 2024


AWS is making significant changes to its data center strategy, emphasizing liquid cooling technology to meet the rising demand for AI workloads. The announcement came ahead of Amazon’s main cloud computing conference, AWS re:Invent 2024, which takes place in Las Vegas.

Liquid Cooling For AI Servers

Adding liquid cooling to AWS’s AI servers is a major component of the company’s new approach. The company will use this technology to cool both its custom Trainium processors and Nvidia accelerators. AWS highlights that its forthcoming Trainium2 chips and AI supercomputing solutions, such as the NVIDIA GB200 NVL72, will be cooled by liquid systems. This should improve the performance and efficiency of intensive AI workloads while also providing a flexible cooling solution for a range of systems.

A Multimodal Cooling Design

AWS is also introducing a multimodal cooling design that combines air and liquid cooling. While certain servers, such as those used for networking and storage, may not need liquid cooling, others will benefit from the added efficiency. This flexible design can serve both traditional computing and AI workloads.

Simplified Server Design for Higher Efficiency

In addition to liquid cooling, AWS is simplifying the electrical and mechanical design of its servers and racks. The modifications will boost infrastructure availability to 99.9999%. The simplified design also reduces the number of racks that can be affected by electrical problems by 89%, resulting in more dependable and efficient data centers. These adjustments are expected to involve wider use of DC power, which reduces energy losses during conversion stages.
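For context, here is a quick back-of-the-envelope calculation (a minimal sketch, not an AWS tool) showing how much downtime per year a given availability target allows, including the 99.9999% figure cited above:

```python
# Rough check: maximum downtime per year implied by an availability percentage.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def allowed_downtime_minutes(availability_pct: float) -> float:
    """Return the maximum downtime per year (in minutes) for a given availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for target in (99.99, 99.999, 99.9999):
    print(f"{target}% availability -> {allowed_downtime_minutes(target):.2f} min/year")

# 99.9999% ("six nines") works out to roughly half a minute of downtime
# per year, about 31.5 seconds.
```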

Energy-Efficient and Sustainable

Prasad Kalyanaraman, AWS’s Vice President of Infrastructure Services, emphasized that these developments will improve energy efficiency while also supporting the growth of generative AI applications. The modular architecture of these systems also enables AWS to retrofit existing data centers with liquid cooling. The changes are designed to reduce AWS’s carbon footprint and strengthen its sustainability efforts.

Encouraging Future Growth

AWS’s new multimodal cooling system and upgraded power delivery systems are intended to support a sixfold increase in rack power density over the next two years, with a further threefold rise planned thereafter. The company is also using AI to optimize rack placement, enabling more efficient power utilization throughout its data centers.
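If the threefold rise compounds on the sixfold step, the two targets together imply a roughly 18x increase. The sketch below illustrates the scale; the 20 kW baseline is a hypothetical figure chosen for the example, not an AWS-published number:

```python
# Illustrative only: cumulative effect of the stated rack power density increases.
# The baseline value is an assumption for the sake of the example.

baseline_kw = 20.0      # hypothetical starting rack power density (kW per rack)
near_term_factor = 6    # sixfold increase over the next two years
later_factor = 3        # further threefold rise planned thereafter

after_two_years = baseline_kw * near_term_factor
after_follow_on = after_two_years * later_factor

print(f"Baseline:       {baseline_kw:.0f} kW per rack")
print(f"After 6x step:  {after_two_years:.0f} kW per rack")
print(f"After 3x step:  {after_follow_on:.0f} kW per rack (18x overall)")
```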

Collaborating with Nvidia

According to Ian Buck, VP of Hyperscale and HPC at Nvidia, liquid cooling will help manage the power demands of AI workloads more efficiently. The collaboration between AWS and Nvidia aims to ensure that demanding AI workloads run with strong performance and energy efficiency.
With these enhancements, AWS lays the groundwork for more powerful, energy-efficient, and sustainable AI computing infrastructure.


