Data centers: The race to power AI - The McKinsey Podcast Recap
Podcast: The McKinsey Podcast
Published: 2025-12-25
Duration: 13 minutes
Guests: Jesse Noffsinger, Pankaj Sachdeva
Summary
The episode explores the growing demand for data centers driven by AI advancements, highlighting the challenges and strategies related to energy consumption and infrastructure development.
What Happened
The episode opens with a discussion of the rapid growth of gen AI, which is expected to create $4 trillion in value by 2030 and depends on data centers to physically house and run its underlying technology. Jesse Noffsinger points out that AI data centers require significantly more power than traditional facilities, necessitating innovations in cooling technologies and chip architecture to manage this demand efficiently.
Pankaj Sachdeva notes that the global data center supply currently stands at 70 gigawatts but is projected to reach 220 gigawatts within five years. This massive increase is primarily driven by AI's computational needs, prompting a shift of data centers to locations with available power, such as Wisconsin and Oklahoma, as traditional tier-one markets reach their power capacity.
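The supply figures above imply a steep compounded growth rate, which a quick calculation makes concrete. This is an illustrative sketch only: the 70 GW and 220 GW figures come from the episode, but the assumption of smooth annual growth is ours.

```python
# Implied compound annual growth rate (CAGR) for global data center
# supply rising from 70 GW to 220 GW over five years.
# Figures are from the episode; smooth annual growth is an assumption
# made here purely for illustration.
start_gw = 70.0   # current global supply, per the episode
end_gw = 220.0    # projected supply in five years, per the episode
years = 5

cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 26% per year
```

In other words, hitting the projection would require supply to grow by roughly a quarter every year for five years running, which helps explain the scramble for power-rich locations described above.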
The conversation further explores how AI's structural and architectural demands are reshaping data centers, with a shift towards liquid cooling to handle the increased heat from power-hungry operations. The transition from training-intensive environments to inference-based applications is also highlighted as a significant trend over the next five years.
Meeting the growing power needs requires infrastructure not only for delivering energy but also for generating it. Jesse emphasizes the need for substantial grid development to support AI's rising power demands, noting that building a new data center can take up to 24 months, with infrastructure delays pushing timelines further.
To address these challenges, the episode discusses the importance of investing in clean, high-ROI energy projects, such as large-scale wind and solar farms, as well as emerging technologies like geothermal and next-generation nuclear reactors. These innovations are seen as crucial for meeting mid- to long-term energy demands.
The episode concludes with insights on how enterprises, suppliers, and investors should navigate the evolving data center and AI landscape. Pankaj stresses the need for durable business models and flexibility to adapt to shifting demand and supply patterns, as well as the importance of partnerships across the ecosystem to tackle these large-scale challenges.
Key Insights
- The global data center supply is projected to increase from 70 gigawatts to 220 gigawatts within five years, driven by AI's growing computational demands.
- Data centers are increasingly adopting liquid cooling systems to manage the heat generated by AI operations, and workloads are expected to shift from training-intensive to inference-based applications over the next five years.
- Building new data centers can take up to 24 months, with additional delays possible due to the need for substantial grid infrastructure development to meet AI's power demands.
- Investing in large-scale wind and solar farms, as well as geothermal and next-generation nuclear reactors, is considered necessary to meet mid- to long-term energy demands for data centers.