Dell Technologies Inc.

09/16/2025 | Press release | Distributed by Public on 09/16/2025 08:24

The AI Energy Challenge: What CIOs Need to Know

tl;dr: AI's biggest challenge is energy. Training and inference workloads consume massive power, and CIOs must focus on efficiency, flexibility and aligned measurement to maximize performance per watt.

The biggest challenge AI faces right now isn't algorithms or data. It's energy. The more intelligence we demand from machines, the more electricity those machines demand from us. That's the reality I call the AI energy challenge, and it's one that CIOs and business leaders must take seriously.

In a recent AI & Us discussion with Dr. Jeremy Kepner of MIT and Ty Schmitt of Dell Technologies, we unpacked the scope of this challenge and what leaders can do to manage it. Along the way, I even joked that if the lights dimmed at a Boston restaurant, I'd know who to blame: Dr. Kepner's supercomputing center at MIT. But as Schmitt reminded me, maybe those same systems would be quietly improving my dinner through a smarter supply chain.

My point is that this isn't an abstract conversation. The impact of AI power demand is already rippling through daily life.

Training, inference and their appetite for energy

AI runs on deep neural networks, and those networks are hungry. There are two phases to keep in mind:

  • Training, where vast datasets are processed to create models. This is both computationally and energetically expensive.
  • Inference, where those trained models are put to work answering questions, making predictions or generating content. Individually lighter, but collectively massive at scale.

Both phases consume extraordinary amounts of energy. Traditional "large" data centers once ran at 20-50 megawatts. Today, AI training facilities can require hundreds of megawatts. A 100 MW facility alone uses as much electricity as about 80,000 U.S. homes, based on EIA household consumption data. That's a good chunk of power.
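The homes comparison is easy to sanity-check yourself. A minimal back-of-the-envelope sketch, assuming a household figure of roughly 10,800 kWh per year (in line with recent EIA residential averages, but an assumption here, not a number from this article):

```python
# Back-of-the-envelope check: how many average U.S. homes does a
# 100 MW facility running continuously match over a year?
# The per-household figure is an assumed EIA-style average.

FACILITY_MW = 100
HOURS_PER_YEAR = 8_760
KWH_PER_HOME_PER_YEAR = 10_800  # assumed average annual U.S. household use

# Convert MW to kW, then multiply by hours to get kWh per year.
facility_kwh_per_year = FACILITY_MW * 1_000 * HOURS_PER_YEAR
equivalent_homes = facility_kwh_per_year / KWH_PER_HOME_PER_YEAR

print(f"{equivalent_homes:,.0f} homes")  # on the order of 80,000
```

The result lands near 81,000 homes, consistent with the "about 80,000" figure above; the exact number shifts with the household average you assume.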

It's not only about cooling systems

When the topics of energy and AI come up, people tend to jump straight to solutions, thinking something like "just add liquid cooling and we'll be fine." Not quite. Cooling matters, but efficiency is about much more than machinery.

The real gains come when internal silos come down. Facilities, IT, real estate and data science teams often optimize in isolation. But when those groups align on performance, resiliency and cost, things get interesting. In fact, "I think there is more rapid innovation in technology in the last two years associated with power and cooling than in the previous 25 years," Schmitt said. That's the kind of acceleration CIOs should pay attention to.

Liquid cooling helps, just not for everyone

Liquid cooling is no longer a fringe technology. Dell, for instance, is already on its fourth generation of liquid-cooled solutions, and it's becoming essential for high-density AI workloads. But not everyone needs it. Some inference-heavy deployments still make sense with air or hybrid approaches.

The real issue is flexibility. As Dr. Kepner noted, customers need standardization in liquid cooling, so investments made today don't become expensive paperweights tomorrow.

"Long gone is this aspect of a 25- or 30-year asset designed to a point and then filled up."

- Ty Schmitt, VP & Fellow, Dell Technologies

Prioritize flexibility instead of a 25-year plan

The old idea of designing a data center as a 25-year asset is gone. AI evolves too quickly. Flexibility is imperative for survival. As Schmitt put it, "Long gone is this aspect of a 25- or 30-year asset designed to a point and then filled up." He added that building systems with flexibility and options "will pay dividends" as workloads evolve:

  • Modular increments (2 MW, 5 MW, 10 MW) let organizations scale with confidence
  • Flexible design lowers both CAPEX and OPEX
  • Training staff in performance engineering makes every run and every watt count

Think of it this way: build for adaptability, not for permanence. What works today may look outdated in five years, or even five months.

Aligning KPIs from load side to line side

Measurement is another challenge that's often overlooked. IT teams may be chasing one set of metrics while the facilities team is after another. Without alignment, efficiency gains disappear.

The key is to balance the load side (workloads, models, applications) and the line side (power delivery, cooling). Every watt should be tied to performance, and every dollar tied to efficiency. Or as I like to put it, you can't optimize what you don't measure.
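One way to make that alignment concrete is a single KPI that both teams share. A minimal sketch, using PUE (facility power divided by IT power) as the line-side input; the function name, numbers, and metric choice are illustrative assumptions, not metrics from the article:

```python
# A shared KPI spanning load side and line side: useful work per
# facility watt. Names and numbers here are illustrative assumptions.

def facility_perf_per_watt(throughput_inf_per_s: float,
                           it_power_w: float,
                           pue: float) -> float:
    """Inferences per second per facility watt.

    Load side supplies throughput and IT power draw; line side
    supplies PUE (facility power / IT power). Multiplying IT power
    by PUE charges each workload for cooling and power delivery too.
    """
    facility_power_w = it_power_w * pue
    return throughput_inf_per_s / facility_power_w

# Same cluster under two cooling designs: line-side efficiency
# gains show up directly in the shared metric.
air_cooled = facility_perf_per_watt(50_000, it_power_w=400_000, pue=1.5)
liquid_cooled = facility_perf_per_watt(50_000, it_power_w=400_000, pue=1.15)

print(f"air:    {air_cooled:.4f} inf/s per W")
print(f"liquid: {liquid_cooled:.4f} inf/s per W")
```

Because the denominator includes line-side overhead, a cooling upgrade and a model optimization both move the same number, which is exactly the alignment the section argues for.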

Why this matters for all industries

It's tempting to dismiss all of this as a niche concern for CIOs and data center architects. But these energy-hungry systems aren't just making better chatbots or recommendation engines. They're enabling breakthroughs in medicine, materials science and global supply chains.

You can watch the full conversation I had with Ty and Jeremy here.
