PowerColor's Edge AI aims to significantly reduce GPU power consumption without a major hit to frame rates

Many PC enthusiasts dislike the fact that top-end GPUs consume large amounts of power, and they experiment with all manner of ways to reduce it, including undervolting, frame rate caps, and lowering the maximum power limit. Graphics card vendor PowerColor is trying a slightly different approach with a system called Edge AI, which uses an NPU to manage power usage in games without significantly affecting performance.

PowerColor's Computex booth was showing a demonstration of this work in progress. While we didn't get a chance to see the system in action ourselves (there was a great deal to try and see at the event), tech site IT Home and X user Harukaze5719 were able to get photos of the setup, which consisted of two computers running Final Fantasy XV.

PowerColor's engineers wired an external NPU to the AMD graphics card in one of the computers and programmed it to manage the GPU's power consumption during rendering. Exactly what is going on behind the scenes is unclear at the moment, but an NPU (Neural Processing Unit) is a processor specialized for the mathematical operations involved in AI routines.

My guess is that the NPU is running a neural network that takes in indicators such as GPU load, voltage, temperature, and aspects of the game being rendered, and modifies the GPU voltage so that power consumption is significantly reduced on average.
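To make that guess concrete, here is a minimal sketch of the kind of control loop I have in mind. Nothing in it comes from PowerColor: the telemetry fields, the tiny network, and the voltage-offset output are all illustrative assumptions.

```python
# Hypothetical sketch of the control loop I'm guessing Edge AI implements.
# None of these names come from PowerColor; the telemetry values, the tiny
# network, and the voltage-offset output are illustrative assumptions only.

import math
import random

def read_gpu_telemetry():
    # Placeholder: a real implementation would poll the driver/VRM sensors.
    return {
        "load": random.uniform(0.5, 1.0),        # fraction of GPU busy time
        "voltage": random.uniform(0.90, 1.10),    # current core voltage (V)
        "temperature": random.uniform(55, 85),    # core temperature (C)
        "frame_time_ms": random.uniform(8, 12),   # per-frame render time
    }

def tiny_policy_network(inputs, weights, biases):
    # One hidden layer with tanh activations -- far simpler than whatever
    # PowerColor actually runs, but enough to show the shape of the idea.
    hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(weights["hidden"], biases["hidden"])]
    out = sum(w * h for w, h in zip(weights["out"], hidden)) + biases["out"]
    return math.tanh(out) * 0.05  # clamp the output to a +/-50 mV offset

def apply_voltage_offset(offset_v):
    # Placeholder: on real hardware this would write to the voltage controller.
    print(f"requesting core voltage offset of {offset_v * 1000:+.1f} mV")

# Dummy, untrained weights purely so the sketch runs end to end.
weights = {
    "hidden": [[random.uniform(-0.5, 0.5) for _ in range(4)] for _ in range(6)],
    "out": [random.uniform(-0.5, 0.5) for _ in range(6)],
}
biases = {"hidden": [0.0] * 6, "out": 0.0}

for _ in range(3):  # a real loop would run continuously alongside the game
    t = read_gpu_telemetry()
    features = [t["load"], t["voltage"], t["temperature"] / 100,
                t["frame_time_ms"] / 16.7]
    apply_voltage_offset(tiny_policy_network(features, weights, biases))
```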

In the Final Fantasy XV demo, the PC without the NPU ran the game at 118 fps, with the graphics card consuming 338 W to achieve it; the setup with the NPU and GPU combined consumed 261 W at 107 fps. That's a 23% reduction in power consumption for a 9% reduction in frame rate.
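For anyone who wants to check the arithmetic, those percentages follow directly from the four figures above:

```python
# Quick check of the figures reported from the demo.
baseline_w, edge_ai_w = 338, 261
baseline_fps, edge_ai_fps = 118, 107

power_saving = (baseline_w - edge_ai_w) / baseline_w        # ~0.228 -> ~23%
fps_penalty = (baseline_fps - edge_ai_fps) / baseline_fps   # ~0.093 -> ~9%

print(f"Power saved: {baseline_w - edge_ai_w} W ({power_saving:.0%})")
print(f"Frame rate lost: {baseline_fps - edge_ai_fps} fps ({fps_penalty:.0%})")
```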

PowerColor's demo stand claims that Edge AI will improve performance, which the demo itself contradicts; if you are introducing a new technology, you want to be sure it does what it says before you wave it around in public. But even discounting that minor marketing blunder, the whole concept of Edge AI appears to have considerable potential.

Reducing graphics card power consumption has multiple benefits, including less heat dumped into the room you're gaming in, lower overall PC power consumption, and longer-lasting components on the card itself. All of these would be worth the relatively small reduction in performance.

At present, Edge AI requires an external NPU wired to various points on the graphics card, but an integrated NPU could do the same job, if only by monitoring voltage and temperature. The trouble is that most NPUs today are built into laptop-focused chips, such as AMD's new Ryzen AI series, Intel's Core Ultra series, and Qualcomm's Snapdragon X processors, which are rarely paired with discrete graphics cards.

The neural networks that Edge AI relies on could, in theory, be run on the GPU itself, but unlike NPUs, GPUs are not designed to do that kind of work using as little power as possible.

The 77 W reduction in GPU power seen in the Final Fantasy XV demo would likely be much smaller (and the fps drop likely greater) if the routines were run on the GPU instead.

I don't believe PowerColor plans to release graphics cards with NPUs on the circuit board. Instead, it is probably readying Edge AI for a time when NPUs are routinely integrated into desktop CPUs from various vendors. If so, it is one of the few uses of AI that I genuinely look forward to seeing in action.

And if PowerColor succeeds, no doubt every other graphics card manufacturer will want to replicate it. As long as all of these systems remain optional, that would be a positive step forward for the entire GPU industry.
