Nvidia's CEO believes that millions of AI GPUs will reduce power consumption, not increase it


The benefits of AI are up for debate. One thing we can all be sure of, however, is that a large server farm packed with hundreds or thousands of high-end AI GPUs, each consuming hundreds of watts, draws an awful lot of power. Right?

Not according to Jensen Huang, Nvidia's CEO and the man behind the tagline "the more you buy, the more you save" in reference to his company's stratospherically expensive AI chips. Perhaps inevitably, Huang has a similar take on the power consumption of the latest AI models, most of which run on Nvidia hardware.

Speaking in the Q&A following his Computex keynote, Huang's first point was that Nvidia's GPUs do their calculations much faster and more efficiently than any other GPU. As he put it, you want to "accelerate everything." That saves you money, but it also saves you time and power.

Next, he distinguished the training of AI models from inference, and how the latter can provide a dramatically more efficient way to perform certain computational tasks.

"Generative AI is not about training," he says. "The goal is not training, the goal is inference. When you infer, the amount of energy used is dramatically lower compared to other ways of doing computing. For example, the climate simulation of Taiwan I showed you uses 3,000 times less power. Not 30% less, 3,000 times less. And this happens in one application after another."

Huang also pointed out that AI training can be done anywhere and does not have to happen close to population centers. As he put it, "AI does not care where it goes to school."

"The world does not have enough power near the population. But there is a lot of excess energy in the world. The amount of energy coming from the sun alone is incredible. But it is in the wrong place. So you can set up a power plant or data center in an unpopulated place and train the model there. Then you move the model for inference close to the people: into pockets, mobile phones, PCs, data centers, cloud data centers," Huang says.

You can see his point. But still, let's just say Huang's take is somewhat selective. Currently, it's hard to pinpoint exactly how much energy is being used to train and run inference on different AI models. The World Economic Forum recently reported that AI-related electricity consumption is increasing at a rate of 26% to 36% per year, which by 2028 would put it on par with the electricity consumption of a small country like Iceland.

Now, Huang's argument is that this will be offset, or more than offset, by reduced consumption elsewhere. But that only holds for tasks that would otherwise have been done without AI. The problem is that it's often difficult to tell the difference.

Whether you use AI to write code, draw up a business plan, or create content, it can be hard to distinguish the impact AI has had on the production process. Maybe AI let you edit those photos much faster and more efficiently. Or maybe you just added an extra layer of polish to your video production at the end.

Or maybe you're doing something with AI that was completely impossible without it, in which case every joule of energy you use is a net addition to the AI energy-use ledger. Similarly, with AI you can simply do more because you can do things faster. Each individual task may be more efficient, but you move on to the next task much sooner and end up using more energy as a result.

Huang's comments about the availability of energy are also rather selective. Sure, there may be more than enough solar power to do everything we want many times over, in a completely sustainable way. But in practice, that's mostly not where our power currently comes from.

Like his favorite "the more you buy, the more you save" line, there's obviously a strong element of "well, he would say that" here. But broadly speaking, it's hard to dismiss the idea that AI's energy footprint is a problem. Whatever Huang claims, it's just as hard to imagine that AI will push down net energy usage any time soon.

That day may come. Ironically, with the help of AI, we may engineer more efficient ways of doing things that get us there. Who knows, maybe AI will even help us work out how to make fusion power truly practical. But for now, just as AI start-ups presumably don't feel like they're saving money when they have to pay Nvidia $1 million for its AI GPUs, it's not easy to square the race to build all these massive AI server farms with claims of reduced power consumption.
