GeForce RTX 4000 “Lovelace” power draw might exceed 800 W

Nvidia GPU power consumption might go up even more than expected

GPU TDPs have become a bit unhinged lately. Having breached the 300 W barrier and then jumped up to 320–350 W, Nvidia is now even developing a 450 W card in the GeForce RTX 3090 Ti. However, this might be nothing in comparison with what is about to come. It seems that not even the scary 600 W number that we heard about earlier is the final destination. Nvidia’s next-gen is about to set the bar even higher (and not in a good way).

News regarding the upcoming graphics card generation has emerged from the Twitter-dwelling leaker known as Greymon55, who has lately had pretty good sources. He currently has both good and bad news. The good news you might have heard already: according to his information, the new GeForce RTX 4000 with the Lovelace architecture might arrive as soon as September. Greymon55 even recommends not buying any current-generation graphics cards at this point, given how close and how superior the new GPU generation is going to be.

More: Next GPU generation Nvidia GeForce with Lovelace 5nm and up to 144SM?

The bad news is that, according to some sources, Nvidia is planning on using very high TDPs. The highest models should reportedly be based on the AD102 chip (the designation is “ADxxx” because the architecture is named after Ada Lovelace), and Greymon55 states that according to his sources, three models might exist, since three different power draw levels are being quoted for the chip.

And those TDPs are allegedly planned to be 450 W, 600 W and even 850 W. We have been prepared for the first two by previous leaks that claimed the possibility of a 550 W TDP. However, a PC GPU with an 850 W power draw is something completely unexpected (such a power consumption almost ranks up with smaller electric kettles).

The mentioned power consumption figures might belong to the models based on the AD102 chip: the GeForce RTX 4080 (which, according to Greymon55, should be based on this chip) would have 450 W, the GeForce RTX 4080 Ti would get 600 W, and finally the GeForce RTX 4090 model would go up to 850 W. According to Greymon55, these are not final specifications and changes might occur.

Fortunately, there is still some hope that this won’t happen. According to another accomplished leaker, Kopite7kimi, who has been the single most reliable source these past few years, such data is to be considered unconfirmed “rumors” for now. Nevertheless, he also says he has heard that the TDPs could be 450, 600 and 800 W for the “80”, “80 Ti” and “90” models. So on one hand, these power consumptions are not yet confirmed by the most reliable sources Kopite7kimi has access to; on the other hand, such numbers are being mentioned somewhere. So the chance that this not exactly ideal scenario will become real is, unfortunately, not zero. But at least there is some hope that this news is something we’ll have to take back in the future… According to Kopite7kimi, the original source of this news has roughly 50:50 reliability.

According to Greymon55, these brutal TDPs should luckily only be a problem in the higher segments, with the lower performance tiers still sticking to GPUs with reasonable power consumption. Reportedly, these might be graphics cards like the GeForce RTX 4060 and RTX 4070 and the Radeon RX 7600 and 7700 XT (provided the naming scheme does not change); these categories should stay at more manageable power draw levels.

Nvidia GeForce RTX 3090 Ti as shown at CES 2022, said to have a 450 W TDP, which is to be cooled by a standard air cooler in the Founders Edition style, though a three-slot-thick one (Source: Nvidia)

An unfortunate trend…

However, if you want a higher-performance graphics card, dealing with 600 W or 800 W won’t be pleasant. Increased electricity costs might become a very noticeable problem, because we are talking about hundreds of watts; this is not the old “are 50 or 100 more watts still acceptable?” debate that has been common with CPUs and GPUs in the past. With a power draw exceeding 600 W, even many PSUs might start to experience problems. There will be many users whose current PSU capacity is, for example, just 750 W or 850 W (keep in mind it has to account for CPU power, conversion losses on VRMs and other components in the PC too).
Furthermore, the PSU should not be driven to full capacity, because GPUs as well as CPUs usually don’t have a constant power demand; the current being drawn oscillates, with peaks that can far exceed the TDP of the GPU. For this reason and others, it is always better to overprovision your PSU capacity a bit. Greymon55 has joked that you should prepare a 1500 W PSU, which is actually not *that* far-fetched.
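To illustrate why, here is a minimal back-of-the-envelope sizing sketch in Python. The CPU and component draws, the transient multiplier and the headroom factor are all illustrative assumptions for the sake of the example, not measured figures:

```python
# Rough PSU sizing estimate for a hypothetical build around a rumored 850 W GPU.
# All numbers below are illustrative assumptions, not measurements.

GPU_TDP = 850            # rumored top-end Lovelace TDP, in watts
CPU_TDP = 250            # assumed high-end CPU draw under load
OTHER = 75               # assumed drives, fans, RAM, motherboard
TRANSIENT_FACTOR = 1.2   # assumed short GPU power spikes above TDP
HEADROOM = 1.1           # assumed margin so the PSU never runs at its limit

sustained = GPU_TDP + CPU_TDP + OTHER
peak = GPU_TDP * TRANSIENT_FACTOR + CPU_TDP + OTHER
recommended = peak * HEADROOM

print(f"Sustained system draw: ~{sustained:.0f} W")
print(f"Estimated peak draw:   ~{peak:.0f} W")
print(f"Recommended PSU:       ~{recommended:.0f} W")
```

With these assumed numbers, the estimate lands right around 1480 W, which is why the 1500 W joke does not read as much of a joke.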

Last but not least, a power consumption of 600 W, not to mention 850 W, will be a major challenge to cool, considering these are double to triple the wattages that used to be normal for graphics cards in the past. Handling the heat alone would be a challenge, not to speak of doing so while maintaining a tolerable noise level.

The competing Radeon cards are unfortunately also said to take part in this power consumption inflation, at least in the high-end models. Contrary to Nvidia, AMD intends to use a chiplet GPU consisting of multiple silicon dies, which might further increase power consumption due to the chip interconnects, which become very power-hungry at higher bandwidths. However, current rumors seem to lean towards the next-gen Radeons (RX 7000?) still having somewhat lower TDPs than the GeForce RTX 4000 cards. Nevertheless, there will still be an increase above the TDPs of the Radeon RX 6000 cards. And even if AMD manages to stop at 450 W, it would still be a scary shift in power consumption.

It would be ideal if the market simply refused to accept these excessive power consumption standards and forced manufacturers to return to more reasonable power limits.

Sources: VideoCardz, Greymon55, Kopite7kimi

English translation and edit by Karol Démuth, original text by Jan Olšan, editor for Cnews.cz


