Discussion: Nvidia Blackwell in Q4 2024?


MoogleW

Member
May 1, 2022
67
29
61
Soo... what do you think the odds are of Blackwell hitting clocks in the high 3 GHz range? I was messing around with a projected spec list, and I think it only works if the clocks are in the mid-to-high 3 GHz range. Either that, or they will have to add more CUDA cores per SM, which I doubt, since I'm guessing that any changes to the SM will mainly target AI and possibly RT.
The AI hardware is only a small part of the SM, and Nvidia has already spent two generations vastly improving the tensor and RT cores while also improving game performance.

Outside of raw tensor-core throughput, other AI-oriented improvements will benefit gaming as well, because they focus on increased compute and memory performance. As with Ampere, rather than chasing very high clocks, I believe they will target a 15-25% uplift at the same clock and the same number of SMs.

As for clocks, I expect at best 20% higher clocks as long as die size remains relatively constant, so an official 3.2 GHz boost clock with actual clocks hitting 3.3-3.4 GHz. If the SMs are untouched and they focus only on the inter-SM connections, then maybe even higher clocks (not sure); Nvidia's simulations will tell them which approach gets the most performance per unit of die area.

So a theoretical 172-SM 5090 at a 3.2 GHz official clock would land at 141 TFLOPs, or 132 TFLOPs at 3 GHz.

The Blackwell A6000 would then be 190 SMs at 2.5 GHz, so 121 TFLOPs.
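
For reference, a minimal sketch of the arithmetic behind those TFLOPs figures, assuming the Ada-style 128 FP32 CUDA cores per SM and 2 FLOPs per core per clock (FMA); the helper name is just for illustration:

```python
# Peak FP32 throughput estimate: SMs x cores/SM x 2 FLOPs/clock x clock.
# Assumes the Ada-style 128 FP32 CUDA cores per SM carries over to Blackwell.
def fp32_tflops(sms: int, clock_ghz: float, cores_per_sm: int = 128) -> float:
    return sms * cores_per_sm * 2 * clock_ghz / 1000  # GFLOPS -> TFLOPS

print(fp32_tflops(172, 3.2))  # ~140.9 TFLOPs, the theoretical 172-SM 5090
print(fp32_tflops(172, 3.0))  # ~132.1 TFLOPs
print(fp32_tflops(190, 2.5))  # ~121.6 TFLOPs, the 190-SM Blackwell A6000 guess
```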
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,914
5,485
136
So this is what I have for a prediction:

GB202 - 12*8*2 = 192 (Titan B, 5090 Ti)
GB203 - 7*8*2 = 112 (5090, 5080)
(AD103 as is, with faster memory)
GB205 - 4*6*2 = 48 (5070, 5070 Ti)
GB206 - 3*4*2 = 24 (5060)
GB207 - 3*3*2 = 18 (Mobile only, hopefully. I suspect this will be 96-bit, and there's no guarantee 3 GB chips will be cheap enough by then.)
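
For what it's worth, a minimal sketch of how those SM totals fall out, assuming each entry reads as GPCs × TPCs per GPC × SMs per TPC (the hierarchy NVIDIA has used in recent generations):

```python
# SM totals from the predicted die configs above: GPCs x TPCs/GPC x SMs/TPC.
predicted = {
    "GB202": (12, 8, 2),  # Titan B, 5090 Ti
    "GB203": (7, 8, 2),   # 5090, 5080
    "GB205": (4, 6, 2),   # 5070, 5070 Ti
    "GB206": (3, 4, 2),   # 5060
    "GB207": (3, 3, 2),   # mobile only, per the post above
}
for die, (gpcs, tpcs, sms) in predicted.items():
    print(die, gpcs * tpcs * sms, "SMs")  # 192, 112, 48, 24, 18
```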
 

jpiniero

Lifer
Oct 1, 2010
14,914
5,485
136
Micron's roadmap says that GDDR7 should become available in the first half of next year. So Blackwell gaming cards next year seem plausible, but we're likely talking the end of the year.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,107
136
So this is what I have for a prediction:

GB202 - 12*8*2 = 192 (Titan B, 5090 Ti)
GB203 - 7*8*2 = 112 (5090, 5080)
(AD103 as is, with faster memory)
GB205 - 4*6*2 = 48 (5070, 5070 Ti)
GB206 - 3*4*2 = 24 (5060)
GB207 - 3*3*2 = 18 (Mobile only, hopefully. I suspect this will be 96-bit, and there's no guarantee 3 GB chips will be cheap enough by then.)
Still pretty early for guesses, as there will be a new architecture. The balance of streaming processors (CUDA), RT 'cores', and Tensor/AI units is unknown. Anyway, 2024 is a plausible time frame, which makes me wonder why some say Blackwell won't be out till 2025 (I assume early 2025).
 

jpiniero

Lifer
Oct 1, 2010
14,914
5,485
136
The balance of streaming processors (CUDA), RT 'cores', and Tensor/AI units is unknown.

I still think it's going to be AI, AI, and more AI, with a chance of some RT improvement. And I think GB202 is mainly intended to be a GDDR7 AI server part and will be priced like it.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,107
136
I still think it's going to be AI, AI, and more AI, with a chance of some RT improvement. And I think GB202 is mainly intended to be a GDDR7 AI server part and will be priced like it.

Well, that's where the profits are right now.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,441
2,927
136
I also made a table of specs.
24 Gbit (3 GB) GDDR7 modules were used, except for GB206/207, which use 24 Gbit GDDR6.
|               | GB202          | GB203         | GB204/5       | GB206         | GB207         |
|---------------|----------------|---------------|---------------|---------------|---------------|
| SM            | 192            | 128           | 80            | 54            | 36            |
| TPC           | 96             | 64            | 40            | 27            | 18            |
| GPC           | 12 (8 TPC/GPC) | 8 (8 TPC/GPC) | 5 (8 TPC/GPC) | 3 (9 TPC/GPC) | 2 (9 TPC/GPC) |
| CUDA          | 24,576         | 16,384        | 10,240        | 6,912         | 4,608         |
| TMU           | 768            | 512           | 320           | 216           | 144           |
| ROP           | 256            | 160           | 112           | 72            | 48            |
| L2 cache (MB) | 128            | 96            | 64            | 48            | 32            |
| Memory width  | 448-bit        | 320-bit       | 256-bit       | 192-bit       | 128-bit       |
| Memory speed  | 32 Gbps        | 31 Gbps       | 27 Gbps       | 24 Gbps       | 24 Gbps       |
| Bandwidth     | 1792 GB/s      | 1240 GB/s     | 864 GB/s      | 576 GB/s      | 384 GB/s      |
| VRAM          | 42 GB          | 30 GB         | 24 GB         | 18 GB         | 12 GB         |

Finally, enough VRAM. :D

P.S. AD107 was used as the baseline. GB207 is 1.5x AD107 everywhere except the memory subsystem and L2.
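
A minimal sketch of the bandwidth math behind the table, assuming bandwidth is simply bus width × per-pin data rate ÷ 8:

```python
# Memory bandwidth: bus width (bits) x per-pin speed (Gbps) / 8 bits per byte.
def bandwidth_gb_s(bus_width_bits: int, speed_gbps: float) -> float:
    return bus_width_bits * speed_gbps / 8

print(bandwidth_gb_s(448, 32))  # GB202: 1792 GB/s
print(bandwidth_gb_s(320, 31))  # GB203: 1240 GB/s
print(bandwidth_gb_s(192, 24))  # GB206: 576 GB/s
```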
 
Last edited:
  • Like
Reactions: Mopetar and Tlh97

jpiniero

Lifer
Oct 1, 2010
14,914
5,485
136
See, the reason I am expecting GB205/206/207 to have a huge decrease in SMs is that I expect NV to cut those die sizes to make up for the big increase in costs from N3E and GDDR7. And with cache no longer scaling, a lot of the die isn't getting smaller.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,441
2,927
136
See, the reason I am expecting GB205/206/207 to have a huge decrease in SMs is that I expect NV to cut those die sizes to make up for the big increase in costs from N3E and GDDR7. And with cache no longer scaling, a lot of the die isn't getting smaller.
And how do you plan to compensate for it?
18 SM GB207 vs 24 SM AD107 -> +33% per-SM performance needed (-25% SMs)
24 SM GB206 vs 36 SM AD106 -> +50% needed (-33% SMs) [in gaming, the gap between AD106 and AD107 is only 24-27%]
48 SM GB205 vs 60 SM AD104 -> +25% needed (-20% SMs)

You would need a 25-30% higher clock speed just to be on par with the previous generation; that would mean 3.4-3.6 GHz.
Then you also need extra performance on top of that, because it's a new generation.
You still need more SMs, or SMs with more CUDA cores in them.
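
A minimal sketch of that break-even arithmetic; the ~2.75 GHz Ada baseline clock is an illustrative assumption, not a figure from the thread:

```python
# Clock needed to offset a per-SM count deficit, assuming identical per-SM,
# per-clock throughput. The Ada baseline clock here is only illustrative.
ada_boost_ghz = 2.75

for deficit in (0.25, 0.30):  # the 25-30% shortfall cited above
    print(f"+{deficit:.0%} -> ~{ada_boost_ghz * (1 + deficit):.2f} GHz")
# prints ~3.44 GHz and ~3.58 GHz, i.e. the 3.4-3.6 GHz range mentioned above
```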
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,914
5,485
136
You would need a 25-30% higher clock speed just to be on par with the previous generation; that would mean 3.4-3.6 GHz.

Yeah. That's what I think the target is. That might end up being too high... but I'm sure there will be DLSS4 for them to push regardless.

In any case I would not expect much from GB206 or 7 on desktop.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,441
2,927
136
Yeah. That's what I think the target is. That might end up being too high... but I'm sure there will be DLSS4 for them to push regardless.

In any case I would not expect much from GB206 or 7 on desktop.
I don't think there is anything new that DLSS4 can offer to compel people to buy the next gen with the same performance.

With your specs, GB206/207 would at best perform like the previous-gen AD106/107.
The same is true for GB205 vs AD104, if it has only 48 SMs.
The only chips with a significant performance gain would be GB202/203.
I know I am nitpicking, but this doesn't make sense.
 

jpiniero

Lifer
Oct 1, 2010
14,914
5,485
136

Seems Mr. Leaker guy is now claiming he made a mistake and GB202 is more likely 384-bit.
 
  • Like
Reactions: Mopetar

Ajay

Lifer
Jan 8, 2001
16,094
8,107
136

Seems Mr. Leaker guy is now claiming he made a mistake and GB202 is more likely 384-bit.
That makes more sense at least.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
With GDDR7 modules coming in either 2GB or 3GB sizes I can easily see them initially offering 24 GB on the top part, with the possibility of 36 GB down the line with a Ti/Titan.

However I am skeptical of them using 3GB modules in the 5080 and lower parts, and with the not-so-good rumors about the competition, VRAM capacity may be the same in those parts.
 

Saylick

Diamond Member
Sep 10, 2012
3,460
7,515
136
With GDDR7 modules coming in either 2GB or 3GB sizes I can easily see them initially offering 24 GB on the top part, with the possibility of 36 GB down the line with a Ti/Titan.

However I am skeptical of them using 3GB modules in the 5080 and lower parts, and with the not-so-good rumors about the competition, VRAM capacity may be the same in those parts.
Knowing that AI training just loves large VRAM capacity (to store larger models), I'm not so sure Nvidia will put out a 36 GB model lest it eat into their other products.
 

CakeMonster

Golden Member
Nov 22, 2012
1,446
585
136
I was about to say, get ready to get shafted by NV on VRAM on the high end too. Well, not really with regards to gaming, but if you're into AI as a hobby, it would have been really nice if the ceiling of 24GB had been lifted after the 3090 and 4090...
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Knowing that AI training just loves large VRAM capacity (to store larger models), I'm not so sure Nvidia will put out a 36 GB model lest it eat into their other products.
Eh, I bet there is a significant market for xx90 cards among buyers who are more interested in them for AI purposes than just gaming. NVIDIA could easily pull the same trick that they did with the announcement of the 4080 16 GB and the "4080 12 GB", which was later rebranded to the 4070 Ti after the backlash.
 

Saylick

Diamond Member
Sep 10, 2012
3,460
7,515
136
Eh, I bet there is a significant market for xx90 cards among buyers who are more interested in them for AI purposes than just gaming. NVIDIA could easily pull the same trick that they did with the announcement of the 4080 16 GB and the "4080 12 GB", which was later rebranded to the 4070 Ti after the backlash.
If there is a 36GB model, I fully expect it to be exorbitantly more expensive than what the gaming performance uplift would suggest. Nvidia will definitely make consumers pay AI margins if they know people who buy a certain SKU are doing so for AI reasons.
 

jpiniero

Lifer
Oct 1, 2010
14,914
5,485
136
However I am skeptical of them using 3GB modules in the 5080 and lower parts, and with the not-so-good rumors about the competition, VRAM capacity may be the same in those parts.

Depends on the price difference between the 2 GB and the 3 GB modules. They could cut the bus width to 'compensate', and/or offer both but charge more for the 3 GB version.

In any case, I have a feeling that GB205's bus width will be less than AD104's 192-bit, and there's a decent chance that GB207's will be 96-bit.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
If they use 3 GB modules, I expect them to be restricted to the xx80 and xx90 cards only. That would mean 36 GB for the 5090 (384-bit bus), 24 GB for the 5080 (256-bit bus), 16 GB for the 5070/Ti (256-bit bus), and 12 GB for the 5060/Ti (192-bit bus), with the 70 series and below using 2 GB modules.

This is the most optimistic outcome, IF NVIDIA are generous.
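
A minimal sketch of the capacity arithmetic behind that line-up, assuming one module per 32-bit channel:

```python
# VRAM capacity: one GDDR7 module per 32-bit channel, at 2 GB or 3 GB per module.
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    return (bus_width_bits // 32) * module_gb

print(vram_gb(384, 3))  # 5090: 36 GB
print(vram_gb(256, 3))  # 5080: 24 GB
print(vram_gb(256, 2))  # 5070/Ti: 16 GB
print(vram_gb(192, 2))  # 5060/Ti: 12 GB
```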