Training large AI models consumes massive power—who will be the next-gen AI “power manager”?

Gartner’s analysis of the top 10 strategic technology trends for 2026 reveals a fundamental shift in the technology arena and sends a clear message to technology decision-makers: we are entering a highly interconnected world radically reshaped by artificial intelligence (AI). In this world, computing power is the new oil, and power supply and cooling systems are the critical lifelines that keep this oil field from running dry or bursting into flames.

According to Stanford’s 2023 AI Index Report, a single training run of the large language model GPT-3 consumed 1,287 megawatt-hours of electricity (at a typical EV efficiency of roughly 0.25 kWh per mile, enough energy to drive an electric car some five million miles) and emitted 552 metric tons of carbon dioxide. Such massive energy consumption not only saddles companies with high electricity bills and potential carbon tax costs, increasing operational burdens, but also intensifies carbon emission pressures at the societal level. In this battle over energy efficiency and cost, the answer lies in the often overlooked technological innovations and architectural upgrades of power supply systems.
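The equivalences above are quick back-of-the-envelope figures. A sketch of the arithmetic, where the EV efficiency (0.25 kWh/mile) and the average U.S. household consumption (~10.7 MWh/year) are assumptions, not figures from the report:

```python
# Back-of-the-envelope: put 1,287 MWh of training energy into everyday terms.
TRAINING_MWH = 1_287
training_kwh = TRAINING_MWH * 1_000            # 1,287,000 kWh

EV_KWH_PER_MILE = 0.25                         # assumed typical EV efficiency
ev_miles = training_kwh / EV_KWH_PER_MILE      # ~5.1 million miles

HOUSEHOLD_KWH_PER_YEAR = 10_700                # assumed average U.S. household
household_years = training_kwh / HOUSEHOLD_KWH_PER_YEAR  # ~120 home-years

print(f"{ev_miles:,.0f} EV miles, or {household_years:.0f} household-years")
```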

The “deadly crossroads” of performance and power consumption: Why has the power supply become AI’s bottleneck?

The race for computing power in AI chips has reached a fever pitch, but the extrapolated trajectory of Moore’s Law reveals an unsettling pattern: every leap in performance is accompanied by an even steeper rise in power consumption.

We are close to hitting the “power wall”: simply piling up transistors no longer yields proportional returns but instead turns data centers into “power hogs.” This is not just a cost problem; it is a physical bottleneck to further scaling.

That liquid cooling has become mandatory rather than optional is a direct symptom of this silent crisis. Yet beyond heat dissipation lies an even more pressing and fundamental challenge: the power supply. An inefficient or unstable power supply unit (PSU) can significantly degrade the performance of a multi-million-dollar AI computing cluster, or even bring it down in an instant.

Decoding the next-generation data center: A paradigm shift from “power supply” to “smart power”

In future AI data centers, power systems will no longer be relegated to a minor role but rather constitute the core smart infrastructure. This shift requires achieving three major transitions:

From centralized to distributed: Traditional centralized power supply is like flood irrigation—it results in significant transmission line losses and cannot satisfy the requirements of rack-level, fine-grained power management. Distributed, modular power supply at the rack level and even the chip level is the future trend, enabling direct power delivery to computing units in a more precise and efficient manner.
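The transmission-loss argument comes down to simple physics: conduction loss in a feed is I²R, and for a fixed delivered power the current falls as the distribution voltage rises, so loss scales with 1/V². A minimal sketch with hypothetical numbers (a 12 kW rack load and a 2 mΩ feed resistance are assumptions for illustration):

```python
# Illustrative only: why higher-voltage, rack-level distribution cuts losses.
# Conduction loss in a feed is P_loss = I^2 * R, with I = P / V, so for a
# fixed delivered power the loss scales with 1 / V^2.
def feed_loss_watts(power_w: float, bus_voltage_v: float, resistance_ohm: float) -> float:
    current_a = power_w / bus_voltage_v
    return current_a ** 2 * resistance_ohm

RACK_POWER_W = 12_000      # hypothetical rack load
FEED_RESISTANCE = 0.002    # hypothetical 2 mOhm busbar/cable resistance

loss_12v = feed_loss_watts(RACK_POWER_W, 12, FEED_RESISTANCE)   # 2000 W wasted
loss_48v = feed_loss_watts(RACK_POWER_W, 48, FEED_RESISTANCE)   # 125 W wasted
print(loss_12v, loss_48v, loss_12v / loss_48v)  # 4x the voltage -> 16x less loss
```

Quadrupling the bus voltage cuts feed losses sixteenfold, which is one reason rack-level and chip-level power delivery keeps conversion stages close to the load.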

From general-purpose to specialized: AI workloads have an extremely wide dynamic range; jumps from idle to full load can occur within milliseconds. Using a general-purpose PSU is like asking a sedan to tow a truck: it is simply out of its league. An AI-specialized PSU must handle extremely high peak loads with ultra-fast dynamic response, like a professional race car engine for data centers.
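One way to see why millisecond-scale load steps are so demanding is to size the bulk capacitance that must hold the rail up while the converter’s control loop catches up. A minimal sketch in which every number is hypothetical (a 500 A load step, 10 µs of response latency, 50 mV of allowed droop):

```python
# Illustrative sizing: bulk capacitance rides through a load step while the
# converter responds. Under a constant-current approximation, the rail droop
# is dV = dI * t / C, so the required capacitance is C = dI * t / dV.
def required_capacitance_f(step_current_a: float,
                           response_time_s: float,
                           allowed_droop_v: float) -> float:
    return step_current_a * response_time_s / allowed_droop_v

STEP_A = 500        # hypothetical idle-to-full-load current step
RESPONSE_S = 10e-6  # hypothetical 10 us converter response latency
DROOP_V = 0.05      # hypothetical 50 mV allowed rail droop

c_farads = required_capacitance_f(STEP_A, RESPONSE_S, DROOP_V)
print(f"{c_farads * 1e3:.0f} mF of bulk capacitance")  # prints "100 mF ..."
```

Even with a fast control loop, a large step demands substantial energy storage at the point of load, which is why dynamic response is a headline specification for AI-class PSUs.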

From passive to active: The smart power supply will feature in-depth integration with AI algorithms to achieve predictive power management. By analyzing historical data on workloads, a smart power supply system can schedule and allocate power resources in advance, replacing passive responses with active optimization. Consequently, the system can maximize energy efficiency while ensuring performance.
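As a rough illustration of what “predictive” means here, the following is a minimal sketch of an assumed workflow, not any vendor’s actual feature: forecast the next interval’s load from weighted recent history, then provision capacity with a safety margin instead of reacting after the load has already jumped.

```python
# Minimal sketch of predictive power budgeting (assumed workflow):
# forecast the next interval's load from recent samples, weighting newer
# samples more heavily, and provision capacity with a headroom factor.
from collections import deque

class PredictivePowerBudget:
    def __init__(self, window: int = 8, headroom: float = 1.2):
        self.history = deque(maxlen=window)  # recent load samples (watts)
        self.headroom = headroom             # safety margin over the forecast

    def observe(self, load_w: float) -> None:
        self.history.append(load_w)

    def next_budget_w(self) -> float:
        if not self.history:
            return 0.0
        # Linearly weight samples so the newest counts the most.
        weights = range(1, len(self.history) + 1)
        forecast = sum(w * s for w, s in zip(weights, self.history)) / sum(weights)
        return forecast * self.headroom

budget = PredictivePowerBudget()
for sample_w in [4000, 5000, 7000, 9000]:  # a ramping AI workload, in watts
    budget.observe(sample_w)
print(f"provision ~{budget.next_budget_w():.0f} W for the next interval")
```

A production system would use far richer models, but the shape is the same: observe, forecast, and allocate ahead of demand.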

The battle for power density: How 12 kW is redefining the ceiling for PSUs

In AI data centers, space is money and efficiency is survival. Increasing power density—that is, delivering greater and more stable power output within a smaller form factor—has become the core focus of technological breakthroughs.

Behind these breakthroughs lies an all-out race involving materials science, semiconductor technology, and topology design. A notable example is the reference design of the 12 kW AI Cloud PSU by onsemi, a partner of Avnet. This industry-leading solution represents a significant breakthrough.

This solution adopts advanced semiconductor materials such as silicon carbide (SiC), which can withstand higher voltages, temperatures, and switching frequencies. With these materials, the solution pushes power conversion efficiency to new heights and significantly reduces energy waste during the conversion process.

Its 12 kW rated power is approximately 50% higher than the current mainstream solutions on the market, meaning that a single rack can support a more intensive deployment of computing power. More importantly, while increasing power density, it maintains excellent conversion efficiency (with rates typically exceeding 96%), directly reducing substantial electricity costs and cooling burdens.
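The efficiency figure translates directly into heat. A quick calculation, taking 96% from the paragraph above and assuming 94% as a comparison baseline (the baseline is an assumption for illustration, not a published spec):

```python
# Conversion loss for a PSU delivering p_out_w at a given efficiency:
# P_loss = P_out * (1 / eta - 1).
def conversion_loss_w(p_out_w: float, efficiency: float) -> float:
    return p_out_w * (1 / efficiency - 1)

P_OUT = 12_000  # 12 kW delivered to the rack
loss_96 = conversion_loss_w(P_OUT, 0.96)  # 500 W of waste heat
loss_94 = conversion_loss_w(P_OUT, 0.94)  # ~766 W of waste heat (assumed baseline)
print(f"{loss_96:.0f} W vs {loss_94:.0f} W -> {loss_94 - loss_96:.0f} W saved per PSU")
```

Each point of efficiency saved is paid twice over: once as electricity not consumed, and again as heat the cooling system never has to remove.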

This solution is not just another product iteration but also sets a new performance benchmark for the power supply architecture of future ultra-large-scale AI data centers.

From supplying components to enabling ecosystems

Isolated hardware innovations are no longer sufficient to address the extreme power challenges posed by AI computing. The key lies in building end-to-end capability covering everything from the power grid to the chip, which requires technology solution providers to evolve from component suppliers into ecosystem enablers. As a leading global technology distributor and solutions provider, Avnet helps customers dramatically shorten product launch cycles with mature reference designs. It also delivers supply assurance, visibility, and agility through a comprehensive portfolio of services and solutions, precisely meeting customers’ needs for differentiation. These capabilities let customers focus on AI business and algorithm innovation rather than reinventing low-level power engineering.

In 2026, technology leaders must understand this well: The smartest future must be built upon the most solid energy foundations.
