
ON.energy Unveils AI UPS: The World’s First Medium-Voltage Power System Designed for AI Data Centers
ON.energy™ has announced the launch of AI UPS™, a groundbreaking innovation that marks a new era in data center energy infrastructure. As the world’s first medium-voltage uninterruptible power supply (UPS) purpose-built for AI-driven data centers, AI UPS redefines how next-generation compute campuses connect to and interact with the electrical grid. The system is designed to provide complete facility resilience while maintaining grid stability, setting a new benchmark for energy reliability in the age of artificial intelligence.
The Power Challenge of AI Data Centers
AI data centers represent the most power-hungry and demanding compute environments in history. These facilities, often exceeding hundreds of megawatts, consume more power than small cities and are pushing existing electrical infrastructure to its limits. Traditional UPS systems were never engineered to handle the rapid power fluctuations, fast ramp rates, and voltage transients created by high-intensity GPU workloads.
As AI models grow exponentially in size and complexity, data centers must draw and adjust power faster than conventional systems can accommodate. This mismatch between energy demand and system capability has become a critical threat to uptime and grid reliability. For utilities and developers, the challenge lies in building facilities that are not only energy-intensive but also grid-safe and operationally flexible.
Introducing a Dynamic Power Layer
ON.energy’s AI UPS is designed to act as a dynamic medium-voltage power layer that sits directly between the grid and the compute facility. Unlike conventional UPS systems that only provide backup power, this grid-interactive system actively stabilizes and balances energy flow between the two, ensuring continuous operation even during extreme load events.
Its proprietary architecture scales from megawatts to gigawatts, supporting both grid-connected and off-grid environments. The system absorbs rapid power fluctuations caused by GPU-intensive workloads, delivering seamless performance while safeguarding grid integrity. AI UPS also enhances voltage and frequency ride-through capabilities, which helps accelerate interconnection approvals and shorten deployment timelines for new data centers.
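To make the buffering idea concrete, the sketch below models how a storage layer between a volatile GPU load and the grid can cap the ramp rate the utility actually sees. This is a toy illustration under stated assumptions, not ON.energy's control logic; all function names and figures are hypothetical.

```python
# Hypothetical sketch: a power buffer limits the step-to-step change in grid
# draw, absorbing (charging) or supplying (discharging) the difference so the
# utility sees a gentle ramp instead of the raw GPU load swings.

def smooth_grid_draw(load_mw, max_ramp_mw_per_step):
    """Clamp each change in grid draw to the ramp limit; the buffer covers the gap."""
    grid = [load_mw[0]]
    buffer_flow = [0.0]  # positive: buffer discharges, negative: buffer charges
    for load in load_mw[1:]:
        prev = grid[-1]
        delta = max(-max_ramp_mw_per_step, min(max_ramp_mw_per_step, load - prev))
        draw = prev + delta
        grid.append(draw)
        buffer_flow.append(load - draw)
    return grid, buffer_flow

# A spiky training-cluster load profile: idle -> burst -> idle (MW per interval)
load = [50, 50, 180, 180, 60, 200, 200, 55]
grid, flow = smooth_grid_draw(load, max_ramp_mw_per_step=30)
print(grid)  # grid-side draw now changes by at most 30 MW per interval
```

In this simplified model the grid-side profile never moves more than the ramp limit per interval, which is the property that eases voltage and frequency ride-through and, by extension, interconnection review.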
By functioning as both a protective and an enabling layer, ON.energy’s technology bridges the gap between energy stability and computational performance—a combination that traditional power systems have struggled to achieve.
Redefining Grid-Safe Data Center Design
With the AI UPS, ON.energy has set a new industry standard for how large-scale AI infrastructure connects to the grid. The company’s approach ensures that data centers not only maintain uptime resilience but also contribute to grid safety and energy efficiency.
According to Alan Cooper, Co-Founder and CEO of ON.energy, the launch represents a fundamental shift in how AI infrastructure is developed and integrated:
“The global race to build AI training capacity may be the most important infrastructure challenge of our time. Managing the massive, volatile load profiles of GPU clusters without risking grid stability is the bottleneck that defines how quickly these facilities can come online. AI UPS is the new standard for how to interconnect AI data centers to the grid—faster, safer, and more resilient.”
ON.energy’s system directly addresses one of the largest barriers to AI expansion: the difficulty of interconnecting high-load data centers to constrained grid networks. By acting as a buffer between the facility and the utility, AI UPS minimizes disturbance risks and allows new projects to move from permitting to operation much faster than before.
Turning Power Infrastructure into a Revenue Asset
Beyond reliability and resilience, ON.energy’s AI UPS also represents a financial innovation for the data center industry. Traditional UPS systems are passive assets—necessary for uptime but non-revenue-generating. In contrast, ON.energy’s grid-interactive design enables operators to participate in energy markets or behind-the-meter optimization programs, transforming a cost center into a revenue-generating asset.
With its ability to store, discharge, and balance power dynamically, the AI UPS can be leveraged for grid services, frequency regulation, and peak shaving, offering data centers a new way to monetize their infrastructure. ON.energy estimates that facilities could earn millions of dollars annually per 100 MW of IT load, depending on market participation and operational strategy.
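As a rough illustration of how peak shaving alone can translate into revenue, the back-of-envelope calculation below values a reduction in monthly peak demand against a demand charge. The tariff figures and function name are illustrative assumptions, not ON.energy estimates.

```python
# Hypothetical back-of-envelope: demand charges are typically billed on the
# monthly peak (in kW), so shaving the peak saves that charge every month.

def annual_peak_shaving_value(peak_reduction_mw, demand_charge_per_kw_month):
    """Annual savings from reducing the billed monthly peak demand."""
    kw = peak_reduction_mw * 1000  # convert MW to kW for the tariff rate
    return kw * demand_charge_per_kw_month * 12

# Assumed example: shave 10 MW off the monthly peak at $15 per kW-month
value = annual_peak_shaving_value(10, 15.0)
print(f"${value:,.0f} per year")  # $1,800,000 per year
```

Even this single service, under modest assumed tariffs, lands in the millions of dollars annually for a large facility; frequency regulation and other grid services would stack on top, consistent with the estimate above.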
Ricardo de Azevedo, Co-Founder and CTO of ON.energy, emphasized the transformative nature of this design:
“We engineered AI UPS to set a clear technical standard for what grid-safe data center interconnection looks like. It’s a scalable foundation for the next era of AI infrastructure—one that protects both the data center and the grid.”
Building the Future of AI Power Infrastructure
The debut of ON.energy’s AI UPS comes as the global demand for AI training capacity accelerates at an unprecedented pace. Hyperscale operators and technology companies are investing heavily in GPU clusters, liquid cooling, and power-intensive compute networks—all of which require a new level of electrical resilience.
ON.energy’s system is now being deployed across multiple U.S. hyperscale campuses, where it supports the rapid expansion of AI facilities with medium-voltage resilience and grid-safe architecture. These deployments mark the beginning of a broader shift toward energy-aware compute infrastructure, where data centers actively participate in balancing and stabilizing the grids they depend on.
By delivering both technical performance and economic value, AI UPS positions ON.energy at the forefront of a new class of energy-technology companies enabling the AI revolution. Its introduction highlights a growing understanding across industries: the success of artificial intelligence will depend not only on algorithms and hardware but also on the power systems that sustain them.