Bloom Energy Corporation

04/16/2025 | Press release | Distributed by Public on 04/17/2025 04:32

How Are Data Centers and the Energy Sector Powering AI?

Power by the Numbers

How Much Power Does a Data Center Use?

Data centers already account for an estimated 1-2% of global electricity use, and demand is rising fast. In the U.S., power consumption is projected to grow by 83 terawatt-hours (TWh) in 2025, the equivalent of powering 7.7 million homes. Data centers are a key driver of this increase, with AI accelerating the trend. By 2030, AI data center power consumption could reach 8-12% of total U.S. electricity demand, up from 3-4% today.
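The homes-equivalent figure can be sanity-checked with simple arithmetic. This sketch assumes an average U.S. household consumption of roughly 10,800 kWh per year (close to the EIA's published residential average; the exact value is an assumption here):

```python
# Back-of-envelope check of "83 TWh = 7.7 million homes".
# Assumption: average U.S. home uses ~10,800 kWh/year.
projected_growth_twh = 83          # projected U.S. consumption growth in 2025 (TWh)
kwh_per_home_per_year = 10_800     # assumed average annual household use (kWh)

growth_kwh = projected_growth_twh * 1e9   # 1 TWh = 1e9 kWh
homes_equivalent = growth_kwh / kwh_per_home_per_year

print(f"{homes_equivalent / 1e6:.1f} million homes")  # ~7.7 million
```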

How Much Power Does AI Use?

AI workloads, particularly those for training large models, are pushing power needs even higher. Running a single large AI model can consume as much energy as 120 U.S. homes use in a year. High-performance computing (HPC) clusters, often used for AI applications, require significantly more power than traditional data center infrastructure. The average power density per rack is expected to rise from 20 kW to 50 kW by 2027, putting more strain on cooling systems and the already burdened utility grid.
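To see what that density jump means at the facility level, consider a hypothetical 1,000-rack data hall (the rack count is an illustrative assumption; the densities are from the figures above):

```python
# Illustrative sketch: facility-level impact of rising rack power density.
# Assumption: a hypothetical 1,000-rack data hall.
racks = 1_000
density_today_kw = 20   # average rack density today (kW)
density_2027_kw = 50    # projected average rack density by 2027 (kW)

it_load_today_mw = racks * density_today_kw / 1_000
it_load_2027_mw = racks * density_2027_kw / 1_000

print(f"IT load: {it_load_today_mw:.0f} MW -> {it_load_2027_mw:.0f} MW")
# The same floor space would need 2.5x the power and cooling capacity.
```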

The challenge isn't just about generating enough power; it's also about delivering it. The U.S. is facing transmission bottlenecks, with grid interconnection delays slowing deployment. Over the next five years, 55 GW of new data center IT capacity is expected to come online in the U.S., ten times the average power capacity of New York City. As AI adoption grows, the data center industry must rethink how to scale power infrastructure while improving efficiency.

Strategies for Powering AI

As AI adoption accelerates, traditional power grids are struggling to keep pace with rising demand. New data centers often face long wait times for grid connections, delaying deployment and increasing costs. To stay ahead, more operators are turning to on-site power generation, which provides faster access to electricity, greater reliability, and improved sustainability.

On-site power reduces reliance on overburdened utility grids, which can take years to expand capacity. This approach is gaining traction: according to the 2025 Data Center Power Report, 30% of all data center sites are expected to use on-site power by 2030, up from just 13% in early 2024. The shift is happening rapidly: in 2024 alone, more on-site power projects were announced than in the previous four years combined, totaling 8.7 GW globally, with 4.8 GW expected before 2030.

1. Microgrids

As the utility grid struggles to keep up with the pace of deployment, AI requires power solutions that deliver reliability, scalability, and sustainability. On-site systems reduce the reliance on utility grids, which are often constrained by long lead times for capacity expansion. This speed to power is critical in scaling AI data centers without waiting years for grid upgrades. For companies deploying generative AI models or high-performance computing, delays in power availability can translate into missed opportunities and higher costs.

Microgrids enhance on-site power generation by adding flexibility and resilience, delivering localized, reliable power through the integration of distributed energy resources with battery storage and smart controls. They also ensure AI-ready data centers remain operational during grid outages, protecting mission-critical AI applications from disruptions.

2. Smarter Cooling Systems

Cooling accounts for 30-40% of total data center power use. This figure rises significantly in AI facilities where dense racks generate more heat. Traditional air cooling systems cannot keep pace with these demands, leading to inefficiencies and higher energy use.

Liquid cooling systems offer a solution. By delivering coolant directly to servers, these systems reduce energy consumption by up to 30% compared to air cooling. Combined with renewable energy sources, liquid cooling helps improve data center energy efficiency while meeting the high demands of AI workloads.
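Combining the two figures above gives a rough sense of the facility-wide payoff. This sketch assumes cooling sits at 35% of total facility power (the midpoint of the 30-40% range) and that liquid cooling cuts cooling energy by the full 30%; both are simplifying assumptions:

```python
# Rough estimate of facility-wide savings from switching to liquid cooling.
# Assumptions: cooling is 35% of total facility power (midpoint of the
# 30-40% range); liquid cooling reduces cooling energy by 30%.
cooling_share = 0.35
liquid_cooling_reduction = 0.30

facility_savings = cooling_share * liquid_cooling_reduction
print(f"Facility-wide energy savings: {facility_savings:.1%}")  # ~10.5%
```

Even under these simplified assumptions, cooling improvements alone recover roughly a tenth of a facility's total energy budget.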

3. Renewable Energy and Hybrid Systems

Renewable energy is becoming a critical component of data center power strategies. Many organizations now co-locate their data centers near solar farms, wind farms, or hydropower plants to integrate renewable electricity directly. However, intermittent renewables alone cannot provide the reliability and rapid time-to-power that AI applications require.

Hybrid systems that combine renewables with on-site generation and storage address both needs. This approach reduces emissions while meeting the scale and speed demands of AI workloads.

Reliable, Efficient Energy for Powering AI Data Centers

Bloom Energy offers on-site power generation solutions specifically designed to meet the growing demands of AI data center infrastructure. As AI drives higher data center power requirements, ensuring reliable, scalable, and sustainable power will be essential.

Bloom's solid oxide fuel cells generate electricity directly on-site, reducing dependence on the grid and minimizing exposure to power interruptions. These fuel cells are highly efficient and flexible, capable of running on natural gas, hydrogen, or biogas. This fuel versatility enables data centers to reduce their carbon footprint while maintaining the reliability needed to support high-performance computing (HPC) and AI workloads.

Unlike grid-sourced electricity, Bloom's solution produces fewer emissions and avoids the inefficiencies associated with power transmission and distribution. By decentralizing energy production, Bloom is empowering AI data centers to achieve energy efficiency while keeping pace with rapid infrastructure expansion.

Bloom's on-site Energy Server systems ensure consistent, resilient energy delivery. Combined with the ability to integrate with renewable energy sources, Bloom's technology provides a forward-looking solution for data centers striving to manage growth with sustainability.

Watch Bloom's Vice President of Global Data Centers, Jeff Barber, discuss fuel cells, microgrids, and evolving energy solutions in this podcast.

Balancing AI Innovation with Sustainable Energy Solutions

The rapid rise of generative AI models and AI workloads brings both opportunities and responsibilities. While these technologies promise significant advancements in areas like automation, healthcare, and IT, they also present pressing challenges in data center energy efficiency, power availability, and sustainability. Bloom plays a critical role in addressing these challenges, ensuring reliable, scalable, and sustainable power solutions for AI applications.

With the demand for AI data center power growing with no signs of slowing down, forward-thinking strategies will be essential to balancing growth with sustainability. Solutions that prioritize efficiency, reliability, and renewable integration will define the future of AI infrastructure.

Powering AI FAQs

  1. Why are data center power requirements increasing?
    The rise of high-performance computing (HPC) and AI workloads demands more computational power per server rack, far surpassing traditional IT systems. This increase stresses data center infrastructure, cooling systems, and the utility grid, requiring innovative solutions.
  2. How can AI data centers improve energy efficiency?
    To address rising data center energy efficiency challenges, operators are implementing smarter cooling systems such as liquid cooling, integrating renewable energy sources, and adopting on-site power solutions. These strategies ensure reliable power while reducing emissions and grid dependency.
  3. What role does on-site power generation play in powering AI?
    On-site power generation provides reliable and scalable electricity directly at AI data centers. By reducing dependence on the grid, these solutions improve speed to power and support the high energy demands of AI workloads while enabling greater data center energy efficiency and sustainability.