Tecogen Chillers for Artificial Intelligence (AI) Data Centers
We believe it is only a matter of time before Tecogen gains traction in the data center cooling market. For the past 20 years, utilities in the US have had excess power. Now, as Artificial Intelligence (AI) data centers race to expand, there is a power shortage.
A large data center developer told us they were buying 80 electric chillers every 6 weeks (roughly 700 chillers a year). If this developer used Tecogen's chillers instead, they could free up 230 megawatts (MW) of additional power for AI chips.
This is because cooling systems are designed for the worst case: the hottest day at full AI load. As a result, 30% or more of a data center's power is allocated to electric chillers, leaving less power for AI chips.
Tecogen's chillers run on natural gas. A data center that cools with natural gas instead of electricity has roughly 30% more power for AI chips, which directly increases revenue and profitability.
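To make the arithmetic behind these figures explicit, here is a minimal sketch. The per-chiller electrical draw is inferred from the quoted totals rather than supplied by the developer, and the 30% cooling overhead is read as a share of the IT load; both are illustrative assumptions.

```python
# Minimal sketch of the figures above; the per-chiller draw and the reading of
# the 30% overhead are illustrative assumptions, not developer-supplied data.

chillers_per_6_weeks = 80
chillers_per_year = chillers_per_6_weeks * 52 / 6            # ~693, i.e. roughly 700

freed_power_mw = 230                                         # figure quoted above
implied_mw_per_electric_chiller = freed_power_mw / 700       # ~0.33 MW per chiller

# Reading the 30% figure as cooling overhead on top of the IT load:
it_load_mw = 100
cooling_load_mw = 0.30 * it_load_mw                          # ~30 MW for electric chillers
grid_connection_mw = it_load_mw + cooling_load_mw            # 130 MW connection required

# With natural gas chillers, the same 130 MW connection can all go to AI chips,
# which is the "every 100MW data center could be a 130MW data center" claim below.
it_load_with_gas_cooling_mw = grid_connection_mw             # 130 MW

print(round(chillers_per_year), round(implied_mw_per_electric_chiller, 2),
      grid_connection_mw, it_load_with_gas_cooling_mw)
```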
Tecochill is the ace in the hole for data center owners.
The following summarizes answers to some frequently asked questions about Tecogen's chillers in AI data centers:
•Why use Tecogen's chillers in a data center?
•How many Tecogen chillers per data center?
•How do we plan to close bigger projects?
•Why are our chillers unique?
Why use Tecogen's chillers in a data center?
Every 100MW data center could be a 130MW data center if it used Tecogen chillers.
Case 1 - Utility Power + Tecochill
In many cases, the data center has some utility power but could always use more. Limited power means limited revenue.
The data center owner has three options:
1.Make the data center smaller, but this means lower revenue.
2.Build a small power plant and buy an electric chiller.
3.Buy Tecochill - no power plant is needed, so the payback is compelling.
Tecochill is also simple to install. We have seen multiple chiller projects installed in 6 months or less, while a similar-sized power project can take a year or more.
Case 2 - Off-grid + Tecochill
In other cases, the data center is off-grid and has its own power plant. These are usually projects greater than 100MW.
In AI data centers, the chips can spike to full power multiple times a minute. This means the cooling system also needs to keep up.
Therefore, these data centers use a mix of chillers.
•Baseload chillers run at steady load, sometimes driven by steam from the power plant.
•Peaking chillers, usually electric, ramp up and down to handle spikes in cooling.
These peaking electric chillers rob the data center of available power for IT. Using Tecochill natural gas chillers means more power for IT.
If the power plant uses gas turbines, cooling the air entering the turbine with a Tecochill can also increase power plant efficiency by 15% or more.
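To illustrate why inlet cooling matters, gas turbine output falls as the inlet air warms; a commonly cited rule of thumb is a loss on the order of 0.5-0.9% of rated output for each degree Celsius above the 15 °C ISO rating point. The coefficient and temperatures in the sketch below are illustrative assumptions, not Tecogen or turbine-manufacturer figures.

```python
# Illustrative only: the derate coefficient and temperatures are assumed values.
derate_per_deg_c = 0.007          # assume ~0.7% of rated output lost per deg C above ISO
iso_inlet_c = 15.0                # ISO rating temperature
hot_day_inlet_c = 35.0            # assumed hot-day ambient

# Output lost on a hot day relative to the ISO rating point
lost_fraction = derate_per_deg_c * (hot_day_inlet_c - iso_inlet_c)   # ~0.14

# Chilling the inlet air back toward 15 C on a 35 C day recovers roughly that
# much output, in the same ballpark as the figure cited above.
print(f"~{lost_fraction * 100:.0f}% of rated output recovered by inlet cooling")
```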
Case 3 - Retrofit of existing data centers
Tecogen's chillers can also be used to retrofit existing data centers. As the race to add new data centers heats up, upgrading existing data centers is attractive as it reduces construction time. However, many existing data centers were originally built for cloud storage so they may not have sufficient power for the latest AI chips. Tecochill can be used to free up power, allowing for a successful retrofit.
How many Tecogen chillers per data center?
A 100 MW data center using AI chips needs 30,000 tons of cooling, which would require 100 Tecochill dual power source chillers or 60 Tecochill DTx chillers.
According to CBRE's H1 2025 data center report, there is presently 5,242 megawatts (MW) of AI data center capacity under construction.
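For reference, the chiller counts above imply the following unit capacities and cooling-per-megawatt ratio; the last line simply scales that ratio against the CBRE figure and is our arithmetic, not a CBRE estimate.

```python
# Implied ratios from the figures above (rounded sketch).
cooling_tons_per_100_mw = 30_000
tons_per_mw_it = cooling_tons_per_100_mw / 100                       # 300 tons of cooling per MW of IT

tons_per_dual_power_source_chiller = cooling_tons_per_100_mw / 100   # ~300 tons per unit
tons_per_dtx_chiller = cooling_tons_per_100_mw / 60                  # ~500 tons per unit

# Scaling the same ratio to CBRE's under-construction figure (our extrapolation, not CBRE's):
cbre_under_construction_mw = 5_242
pipeline_cooling_tons = cbre_under_construction_mw * tons_per_mw_it  # ~1.57 million tons
print(f"{pipeline_cooling_tons:,.0f} tons of cooling implied by the construction pipeline")
```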
How do we plan to close bigger projects?
We believe the barrier to sales today is perceived risk. The decision-maker needs to feel comfortable buying our natural gas chillers instead of electric chillers from larger manufacturers.
Addressing perceived technology risk
Our Vertiv partnership has given us instant credibility with potential customers.
The next step is a proof-of-concept project. We started working on small projects last year and expect these to close in H2 2025. The first letter of intent (LOI) has already come through and will hopefully become a purchase order (PO) later this year.
We have also quoted other projects such as modular data centers. In these applications, the customer can assess our chillers with limited risk.
Then we can secure a bigger project such as a 50MW or a 100MW data center.
We see new leads every week. Many have compelling reasons to choose Tecochill over the other options.
Faster schedules to secure orders
Data center owners choose vendors who cut construction time.
This is why we designed the dual power source chiller as a self-contained unit. It can be installed on a roof or outside a data center with minimal piping.
This helps the developer cut design and on-site construction time.
To cut delivery time and improve factory output, we are:
•negotiating with contract manufacturers to build sheet metal assemblies to reduce time on our factory floor, and
•adding test cells and more assembly areas to increase the number of chillers we can build per year.
Why are our chillers unique?
If Tecochill can gain traction in the market, we will have a long runway of projects because Tecochill is hard to replicate.
Refined in grueling conditions
•20 million hours of chiller operation.
•Many critical cooling sites with no backup.
•One chiller has run 145,000 hours as the only source of cooling at an ice rink in NY.
•Experience in stringent emissions zones such as California, where our equipment can be tested at any time without warning.
Patented technologies
With our patented Ultera system, our engine emissions are comparable to those of fuel cells. Our hybrid chiller uses patented controls to manage power between the engine and the grid.
A support system and 30 years of engine expertise
But technology alone is not enough. Many of our engines run 24/7, the equivalent of half a million miles a year. Doing this reliably requires a complex support system: trained service technicians, a robust supply chain, an inventory of parts, and 30 years of in-house engine expertise. This will be difficult for competitors to replicate.