Discussion Nvidia Blackwell in Q4-2024 ?

Page 17 - AnandTech Forums

MrTeal

Diamond Member
Dec 7, 2003
3,584
1,743
136
Throwing my guess for the 5090 cooler configuration. It's my only guess for how they can maximize the airflow even further
View attachment 99566
That seems a lot more reasonable for a 2- or 3-board design than whatever that WCCFTech image is supposed to represent. Using the 4090 Ti prototype design would be interesting, though.

I'm actually quite interested to see what they do here. It should make waterblock designs quite interesting.
 

Saylick

Diamond Member
Sep 10, 2012
3,361
7,050
136
Throwing my guess for the 5090 cooler configuration. It's my only guess for how they can maximize the airflow even further
View attachment 99566
Makes sense. Essentially finding other ways to leverage the through-heatsink approach. 30- and 40-series FE cards already use one through-heatsink fan, so it's natural to try two. The issue is that this leaves a floating PCB between the two fans, so you need another PCB for the IO bracket and maybe another for power. Maybe the structure of the card comes from a bracket that ties all the fans and PCBs together.
 
  • Like
Reactions: Mopetar

MoogleW

Member
May 1, 2022
65
29
61
Will it be less power hungry?
Look for approximately 450-520W, especially with increased clocks and/or increased per-SM throughput. The interesting question is whether GB203 lands at 320W or 350W.

But that would still come on the back of increased efficiency.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,260
5,257
136
IMO there shouldn't be consumer level GPUs above 250W. At least I don't want electric stoves.

So previous AMD generation would stop at 6800. No 6800XT or above. Current generation would stop at 7700XT.

Previous generation NVidia would stop at 3070. No 3070 Ti or above. Current generation would stop at 4070 Super.

That would suck. Don't deny other people more powerful options, just because you don't want them...
 
  • Like
Reactions: DeathReborn

MoogleW

Member
May 1, 2022
65
29
61
IMO there shouldn't be consumer level GPUs above 250W. At least I don't want electric stoves.
People have proven to care more about performance than about absolute power draw, especially when efficiency improvements can be converted into higher performance than was possible on older architectures.
 
  • Like
Reactions: Tlh97

ToTTenTranz

Member
Feb 4, 2021
103
152
86
People have proven to care more about performance than absolute power draw.

Where were those people when the RX 480 performed above the GTX 1060 while being priced below it?
If that's the majority of buyers, then reviewers weren't really representing them.


But somehow, as soon as we got RDNA2 vs. Ampere then all of a sudden power efficiency wasn't that important anymore.
Humm... 🤔
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,260
5,257
136
Where were those people when the RX 480 performed above the GTX 1060 while being priced below it?
If that's the majority of buyers, then reviewers weren't really representing them.


But somehow, as soon as we got RDNA2 vs. Ampere then all of a sudden power efficiency wasn't that important anymore.
Humm... 🤔

The people that care more about performance aren't the ones buying ~$200 cards.

Plus the obvious, that most people just want NVidia cards.
 

tajoh111

Senior member
Mar 28, 2005
304
320
136
Where were those people when the RX 480 performed above the GTX 1060 while being priced below it?
If that's the majority of buyers, then reviewers weren't really representing them.


But somehow, as soon as we got RDNA2 vs. Ampere then all of a sudden power efficiency wasn't that important anymore.
Humm... 🤔

Upon release, the RX 480 lost to the GTX 1060.


It wasn't even priced well enough to offset the performance gap. Add in the worse power efficiency, and it's no wonder the RX 480 was outsold by the GTX 1060.
 
  • Like
Reactions: Curious_Inquirer

jpiniero

Lifer
Oct 1, 2010
14,805
5,429
136
I think at the low end, power consumption is a bit of an issue, since it can force a more expensive power supply.

The 5080 and 5090 are not low end, lolz.
 

CouncilorIrissa

Senior member
Jul 28, 2023
235
813
96
You can always set power limits
You're still paying for overbuilt VRMs, your-momma-sized cooling setups and anti-sagging brackets (lmao) that the GPU wouldn't otherwise need if it was clocked at a sane level and not at the plateau of the v/f curve.

So previous AMD generation would stop at 6800. No 6800XT or above. Current generation would stop at 7700XT.

Previous generation NVidia would stop at 3070. No 3070 Ti or above. Current generation would stop at 4070 Super.

That would suck. Don't deny other people more powerful options, just because you don't want them...
No, for RDNA3 just drop the clocks slightly below those of the 7900 GRE (a 260W-TDP GPU, and a poor bin at that).

It's even better for Ada: the laptop 4090, which has the same config as the desktop 4080, can be configured with a TDP of up to 150W, meaning there's 100W of headroom to use AD102 instead of AD103 and maybe even raise clocks a little.

Those in need of 350W+ monstrosities should pay for GALAX HOF series or whatever.

v/f curve placement of consumer GPUs has been dumb for a while now.
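
The v/f-curve argument comes down to dynamic power scaling roughly as C·V²·f: the last few hundred MHz sit where each step needs a sizeable voltage bump, so backing off saves power much faster than it costs performance. A toy sketch of that shape — the (MHz, volts) pairs below are invented purely for illustration, not real Ada or RDNA3 telemetry:

```python
# Dynamic power scales roughly as P ~ C * V^2 * f (C = switched capacitance).
# These (frequency, voltage) pairs are made up to illustrate the shape of a
# v/f curve; real tables are per-chip and read out with vendor tools.
vf_curve = [
    (2000, 0.80),  # MHz, volts: efficient region of the curve
    (2500, 0.95),
    (2800, 1.10),  # plateau: large voltage cost per extra MHz
]

def rel_power(mhz: int, volts: float) -> float:
    """Relative dynamic power; the capacitance constant cancels in ratios."""
    return volts ** 2 * mhz

p_plateau = rel_power(2800, 1.10)
p_backed_off = rel_power(2500, 0.95)

clock_loss = 1 - 2500 / 2800
power_saved = 1 - p_backed_off / p_plateau
print(f"backing off: clock -{clock_loss:.0%}, power -{power_saved:.0%}")
```

With these toy numbers, giving up about 11% of clock saves about a third of the power — the trade every undervolting thread rediscovers.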
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,179
3,147
136
www.teamjuchems.com
You're still paying for overbuilt VRMs, your-momma-sized cooling setups and anti-sagging brackets (lmao) that the GPU wouldn't otherwise need if it was clocked at a sane level and not at the plateau of the v/f curve.


No, for RDNA3 just drop the clocks slightly below those of the 7900 GRE (a 260W-TDP GPU, and a poor bin at that).

It's even better for Ada: the laptop 4090, which has the same config as the desktop 4080, can be configured with a TDP of up to 150W, meaning there's 100W of headroom to use AD102 instead of AD103 and maybe even raise clocks a little.

Those in need of 350W+ monstrosities should pay for GALAX HOF series or whatever.

v/f curve placement of consumer GPUs has been dumb for a while now.

Truly, looking at cards from like 10 years ago - the chonkiest of the chonkers - you're like... wow, these are baby cards now.

The GTX 980 Ti I had until recently was a baller EVGA Superclocked premium card, and LMAO, the 2060S I had at the same time used a pretty similar cooler. Never mind the ludicrous 6800 that I shoehorned into a full ATX case that barely fit it (XFX, wtf?), and that's without getting into the 3080s, 3090 Ti, and other cards I've handled recently.

Remember this widely panned "hair dryer"!?!? :D NEEDS A FULL MOLEX POWER CABLE, OH NOES. I think I had a few cards from that period that used the floppy-drive power connector? lol.

1716924624370.png

I bought a 5900 Ultra with a much more modern-looking cooler cheap in ~2005, and it ran solid on my OCZ 500W PSU with adjustable pots :D

Many cards sold today could have sub-200W power budgets and still be pretty awesome. We've just gotten to the point where they're still competing with the last generation of cards that (AMD and Nvidia, at least) were overvolted and overclocked by default for hardly any reason, and the vendors seem weird about backing down.
 

jpiniero

Lifer
Oct 1, 2010
14,805
5,429
136
Many cards sold today could have sub-200W power budgets and still be pretty awesome. We've just gotten to the point where they're still competing with the last generation of cards that (AMD and Nvidia, at least) were overvolted and overclocked by default for hardly any reason, and the vendors seem weird about backing down.

Well... the 4090 laptop (which has the same specs as the 4080... minus the clock speed) has about the same performance in Timespy as the 4070 Super. Would you be happy paying $1200+ for that?
 

CouncilorIrissa

Senior member
Jul 28, 2023
235
813
96
Well... the 4090 laptop (which has the same specs as the 4080... minus the clock speed) has about the same performance in Timespy as the 4070 Super. Would you be happy paying $1200+ for that?
The 4070S would be proportionally slower if clocked at a sane level, so relative performance would remain the same. It would also help avoid normalising games optimised for GPUs that pull a fifth of a vacuum cleaner's power.
 
  • Like
Reactions: Tlh97 and blckgrffn

branch_suggestion

Senior member
Aug 4, 2023
278
599
96
Where were those people when the RX 480 performed above the GTX 1060 while being priced below it?
If that's the majority of buyers, then reviewers weren't really representing them.
That comparison slightly favoured NV until VRAM limitations and FineWine kicked in.
The RX 470/570 vs. GTX 1050 Ti comparison was always extremely AMD-favoured, and well...
 

coercitiv

Diamond Member
Jan 24, 2014
6,341
12,598
136
Well... the 4090 laptop (which has the same specs as the 4080... minus the clock speed) has about the same performance in Timespy as the 4070 Super. Would you be happy paying $1200+ for that?
The 4090 laptop is a 120W-TDP part. Increase power to the levels discussed above (200-250W) and let's see how the 4070 Super fares against it. Also, quoting 100-150W mobile products based on the same silicon used in 300W+ desktop cards only goes to show how these chips could easily be tamed under 250W for a rather acceptable performance penalty (acceptable only to some, as the discussion shows).

Personally, I would be happy with factory profiles for both scenarios: performance and optimized power consumption. Some cards have Silent BIOS options, but more often than not they focus on lowering fan speeds at the expense of temps, with power staying very close to the performance profile. Power customization can still be done in software, but only a BIOS switch is set-and-forget.
 
Last edited:

gdansk

Platinum Member
Feb 8, 2011
2,454
3,323
136
I think 250-300W is a good upper limit.
I ran Steel Nomad on my RTX 4090 at 250W. It scored 7307, which is about 10% above the average for an RTX 4080 Super.
I think people would be okay with that level of performance if the cost savings from a cooler sized for 200W less, and from the lower-speed memory required, were passed on to consumers.

At 200W, however, it drops to 5600. Not great; sub-linear.
At 300W, it's about 8700. Good, but still slightly less than linear.
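
Those three data points make the scaling easy to quantify. A quick sketch normalizing score-per-watt against the 250W run, using only the figures quoted above:

```python
# Steel Nomad scores at three power limits, as reported above for an RTX 4090.
scores = {200: 5600, 250: 7307, 300: 8700}

base_w = 250
for watts in sorted(scores):
    perf_ratio = scores[watts] / scores[base_w]   # performance vs. 250W run
    power_ratio = watts / base_w                  # power vs. 250W run
    # ratio < 1.0 means worse perf-per-watt than the 250W operating point
    efficiency = perf_ratio / power_ratio
    print(f"{watts}W: {scores[watts]} pts, perf/W vs 250W = {efficiency:.2f}")
```

By these figures, 250W is actually the perf-per-watt sweet spot of the three points: the drop to 200W costs relatively more performance than it saves power, matching the "not great, sub-linear" observation, and 300W is slightly less efficient again.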
 
Last edited:
  • Like
Reactions: blckgrffn

Saylick

Diamond Member
Sep 10, 2012
3,361
7,050
136
Looks like GB202 might be an A100 of sorts, with a split L2 on a single monolithic die. If so, it suggests that future high-end GPUs could be MCM à la B100.
Kopite confirms this:

Also, a 2-slot, 2-fan cooler? I understand that it will be more efficient if it's got two through-heatsink fans, but those fans have got to be 140mm+ or something in order to be 2 slots thick and still dissipate 450W.
 
  • Like
Reactions: Mopetar

Mopetar

Diamond Member
Jan 31, 2011
7,979
6,365
136
Although there is a lot of appeal to a slim card, a BFGPU that's not just for show is pretty awesome as well.

Most gamers who buy top-end GPUs aren't doing anything with their other PCIe slots, so in the future I want to see something that just plugs into three of them, with two slots used for stabilization and for enabling a truly impressive cooling apparatus.
 

MoogleW

Member
May 1, 2022
65
29
61
The people that care more about performance aren't the ones buying ~$200 cards.

Plus the obvious, that most people just want NVidia cards.
Those people care very much about performance. For them, the most important factor is performance/$.