Managing AI’s Power Needs and Sustainability Goals

June 7th, 2024

This Industry Viewpoint was authored by Iceotope’s Dr. Kelley Mullick

Artificial intelligence (AI) is transforming our interactions with technology. From personalized streaming recommendations to autonomous vehicles, it is pervasive and rapidly evolving, and requires support from the digital infrastructure industry to continue its meteoric adoption. This raises a critical question: How are data center operators managing the impact of AI on their operations?

The initial focus has been on capacity, with wholesale space in major global data center markets becoming scarce due to the "land grab" by cloud providers to support AI workloads. However, AI's strain on power infrastructure, and the need to meet sustainability objectives at the same time, is an increasingly urgent problem. Headlines across Europe, Asia and the US underscore the tension between power, sustainability and data center growth. The International Energy Agency (IEA) projects that global electricity demand from data centers, AI and cryptocurrency could double by 2026, necessitating immediate and decisive action.

This power consumption surge presents significant challenges for data center operators striving to maintain efficiency, sustainability and total cost of ownership (TCO). AI's energy-intensive nature drives up the carbon footprint of data centers, amplifying environmental sustainability concerns. Cloud Service Providers (CSPs) are particularly focused on optimizing TCO as they navigate the implications of AI on their operations. Similarly, telco operators in Europe and Asia prioritize improving TCO and sustainability while relying on data centers to support AI-driven services.

The power consumption of AI hardware is expected to remain high, with estimates suggesting that AI could account for up to 7.5% of the United States’ projected electricity demand by 2030. Nvidia’s announcement of its 1200W Blackwell GPU, designed for real-time generative AI on trillion-parameter large language models, underscores the need for advanced cooling solutions. As compute density increases, along with the overall rising thermal design power of IT equipment, sustainable cooling solutions are essential.

Liquid cooling systems offer a more efficient way of dissipating heat than traditional air cooling methods. By circulating a coolant directly over the hottest components, liquid cooling rapidly transfers heat away, maintaining optimal operating temperatures for AI systems. As chips get hotter, data center operators must future-proof their infrastructure investments for 1000W CPUs and GPUs and beyond. Choosing technologies that can meet the demands of future processor and chip roadmaps will be key to not only effectively cooling hotter chips, but driving down data center energy consumption in tandem.

The average data center power usage effectiveness (PUE) is about 1.8, according to the National Renewable Energy Laboratory. Switching from air cooling to liquid cooling can reduce that figure significantly, since cooling can account for up to 80% of the non-IT energy a data center requires. With liquid cooling using only about 20% of the energy of air cooling, a conversion can take a data center's PUE from 1.8 down to roughly 1.3.
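The arithmetic behind that 1.8-to-1.3 reduction can be sketched briefly. PUE is total facility energy divided by IT energy, so a PUE of 1.8 implies 0.8 units of overhead per unit of IT load. The figures below follow the article's assumptions (cooling is 80% of overhead; liquid cooling uses 20% of air cooling's energy); the function name and structure are illustrative, not a standard formula from any tool.

```python
# Back-of-envelope PUE model, per the assumptions in the text above.
# PUE = total facility energy / IT energy, so overhead = PUE - 1.

def pue_after_liquid_cooling(pue_air=1.8, cooling_share=0.8, liquid_factor=0.2):
    overhead = pue_air - 1.0              # non-IT energy per unit of IT energy
    cooling = overhead * cooling_share    # cooling's portion of that overhead
    other = overhead - cooling            # power distribution, lighting, etc.
    # Liquid cooling replaces the cooling term at a fraction of the energy.
    return 1.0 + other + cooling * liquid_factor

print(round(pue_after_liquid_cooling(), 2))  # → 1.29, i.e. roughly 1.3
```

The "other" overhead (power distribution losses and so on) is assumed unchanged by the cooling conversion, which is why the result lands near 1.3 rather than closer to 1.0.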

Furthermore, as chip wattage rises, liquid cooling is promising for handling greater power density. Testing of single-phase liquid cooling technology, such as Precision Liquid Cooling, has shown it can exceed the perceived 1000W limit of competing cooling approaches.

Initial testing showed that single-phase liquid cooling maintained a constant thermal resistance at a given flow rate as power increased from 250W to 1000W. More excitingly, a second round of testing found consistent thermal resistance up to 1500W – a threshold not yet met within the industry. These results demonstrate that single-phase liquid cooling technology is indispensable for effectively managing the escalating thermal demands of AI workloads in data centers.
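"Constant thermal resistance" has a concrete meaning worth unpacking: thermal resistance is the component's temperature rise above the coolant inlet divided by the power dissipated, so if it stays constant as power climbs, chip temperature rises only linearly and predictably. The sketch below illustrates this with hypothetical numbers; the resistance and inlet temperature are assumptions for illustration, not Iceotope test data.

```python
# Illustration of constant thermal resistance across rising chip power.
# theta = (T_chip - T_coolant_in) / power, in °C per watt.

theta = 0.04         # assumed constant thermal resistance at a fixed flow rate, °C/W
t_coolant_in = 32.0  # assumed coolant inlet temperature, °C

for power_w in (250, 500, 1000, 1500):
    t_chip = t_coolant_in + theta * power_w   # linear rise when theta is constant
    print(f"{power_w:5d} W -> chip temperature {t_chip:.1f} °C")
```

If thermal resistance instead rose with power, as it tends to when a cooling method saturates, chip temperatures would climb faster than linearly and hit thermal limits well before 1500W.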

Adopting liquid cooling technology is more than a solution; it will soon become a requirement. It enhances operational efficiency, reduces energy consumption and aligns with emerging sustainability standards. While much of the market has yet to reach 1500W operation, it is poised to do so soon. Liquid cooling efficiently dissipates heat from high computational power and denser hardware configurations, addressing the thermal challenges of AI and optimizing performance, energy efficiency and hardware reliability. It is an essential solution for AI workloads and the key to unlocking their future potential.

The intersection of AI, power consumption and sustainability presents challenges and opportunities. Data center operators must adopt advanced cooling technologies, like liquid cooling, to ensure they can support AI’s growing demands while maintaining efficiency and sustainability. As AI continues to evolve, so too must our strategies for managing its impact on our digital infrastructure.

Dr. Kelley Mullick is the Vice President of Technology Advancement and Alliances at Iceotope Technologies where she is responsible for spearheading the advancement of technology initiatives and fostering strategic alliances with leading Original Equipment Manufacturers (OEMs) and technology partners. As a dynamic and results-oriented systems engineer, Dr. Mullick’s primary focus is to drive the evolution of Iceotope’s liquid cooling technology, ensuring its continued innovation and relevance in a rapidly evolving industry landscape.
