Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


jpiniero

Lifer
Oct 1, 2010
I only expect that for the 4090. GDDR6X might go onto the 4070, but I doubt it. And I expect/fear relatively low memory quantities again.

Might depend on how much power the 18 Gbps GDDR6 draws versus the 19-21 Gbps GDDR6X.

I expect GDDR7 at some point but it's sounding like it won't make it in time for the first products.
 

jpiniero

Lifer
Oct 1, 2010
Speaking of the 4070, knowing nVidia the first two products will be from AD103. What I won't say is whether they will actually be called the 4080 and 4070. Basically 82 and 68 SMs, same as the 3090 and 3080... just much (much) higher boost clocks. Oh, and 450 W TBP.
 

Aapje

Golden Member
Mar 21, 2022
I expect GDDR7 at some point but it's sounding like it won't make it in time for the first products.

I expect GDDR7 once Samsung gets good yields on their 5nm. Samsung's customers are fleeing to TSMC, so that tells you how soon they expect that to happen.
 

Timmah!

Golden Member
Jul 24, 2010
Speaking of the 4070, knowing nVidia the first two products will be from AD103. What I won't say is whether they will actually be called the 4080 and 4070. Basically 82 and 68 SMs, same as the 3090 and 3080... just much (much) higher boost clocks. Oh, and 450 W TBP.

If the 4080 is 84 SMs while the 4090 chip is the rumored 144 (cut to, say, 136 or 132), then it will be a far worse product than the 3080 was. Actually even worse than the 1080 was relative to the Pascal Titan. I kinda doubt they'd do it this way, given the competition they are getting.
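For reference, a quick back-of-the-envelope check of those cut-down ratios (illustrative only: the Ampere and Pascal SM counts are public specs, while the 144-SM AD102 figure is just the rumor being discussed here):

```python
# Rough comparison of how cut-down each x80 card is/was versus its generation's big die.
# Ampere/Pascal SM counts are public specs; the 144-SM AD102 number is only a rumor.
ratios = {
    "RTX 3080 (68 SMs) vs full GA102 (84 SMs)":          68 / 84,
    "GTX 1080 (20 SMs) vs Titan X Pascal (28 SMs)":      20 / 28,
    "Rumored 4080 (84 SMs) vs rumored AD102 (144 SMs)":  84 / 144,
}
for name, r in ratios.items():
    print(f"{name}: {r:.0%}")

# The ~70% SM increase mentioned later in the thread: rumored AD102 (144) vs GA102 (84).
print(f"AD102 vs GA102 SM count: {144 / 84 - 1:.0%} more SMs")
```

That works out to roughly 58% of the big die for the rumored 4080, versus about 81% for the 3080 and about 71% for the 1080, which is the gap being described.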
 

jpiniero

Lifer
Oct 1, 2010
If the 4080 is 84 SMs while the 4090 chip is the rumored 144 (cut to, say, 136 or 132), then it will be a far worse product than the 3080 was. Actually even worse than the 1080 was relative to the Pascal Titan. I kinda doubt they'd do it this way, given the competition they are getting.

There are probably three AD102 products, maybe four. Like 140-128-96. So AD102 takes on the chiplet models while AD103 takes on the monolithic RDNA3 one.
 

Aapje

Golden Member
Mar 21, 2022
There are probably three AD102 products, maybe four. Like 140-128-96. So AD102 takes on the chiplet models while AD103 takes on the monolithic RDNA3 one.

I think that you are very much confused. The LinkedIn leak pretty much confirmed that RDNA 3 will launch with two chiplet models (Navi 31 & 32) and one monolithic model (Navi 33). The rumor mills very consistently predict the entire Ada lineup to be monolithic.
 

jpiniero

Lifer
Oct 1, 2010
I think that you are very much confused. The LinkedIn leak pretty much confirmed that RDNA 3 will launch with two chiplet models (Navi 31 & 32) and one monolithic model (Navi 33). The rumor mills very consistently predict the entire Ada lineup to be monolithic.

That's what I am saying. There's a reason nVidia increased the SM count by 70%. I won't say it will be enough, but that's the intention.
 

Timmah!

Golden Member
Jul 24, 2010
I see a bunch of the first 3090 Ti's being presented (EVGA, Colorful) and seemingly all of them have 3.5-slot coolers? Do you think this is gonna be standard with 4090s, to cool those 400+ W cards? Cause I want to have 2 cards (probably keep my current 3090 and replace the other one (2080 Ti) with a 4090), but I can't fit more than 2.5-slot ones into my Define R5. This is worrying.
 

jpiniero

Lifer
Oct 1, 2010
I see a bunch of the first 3090 Ti's being presented (EVGA, Colorful) and seemingly all of them have 3.5-slot coolers? Do you think this is gonna be standard with 4090s, to cool those 400+ W cards? Cause I want to have 2 cards (probably keep my current 3090 and replace the other one (2080 Ti) with a 4090), but I can't fit more than 2.5-slot ones into my Define R5. This is worrying.

I'm sure there will be AIO coolers. Yeah having to use beefier coolers is going to drive up the cost too.
 

Aapje

Golden Member
Mar 21, 2022
I see a bunch of the first 3090 Ti's being presented (EVGA, Colorful) and seemingly all of them have 3.5-slot coolers? Do you think this is gonna be standard with 4090s, to cool those 400+ W cards? Cause I want to have 2 cards (probably keep my current 3090 and replace the other one (2080 Ti) with a 4090), but I can't fit more than 2.5-slot ones into my Define R5. This is worrying.

Yes, of course. All that wattage needs to be cooled somehow.

I would suggest custom water cooling for your situation. Put all that heat from both GPUs and the CPU into a water loop and then cool it all together with one 360 radiator. Your case supports that (if you take out the 5.25 bays).
 

Timmah!

Golden Member
Jul 24, 2010
Yes, of course. All that wattage needs to be cooled somehow.

I would suggest custom water cooling for your situation. Put all that heat from both GPUs and the CPU into a water loop and then cool it all together with one 360 radiator. Your case supports that (if you take out the 5.25 bays).

I already have a 360 radiator mounted up top; it's an Alphacool Eisbaer and it cools the CPU. I am not really that keen on a custom loop, mainly because it's more expense on top of an already uber-expensive card (and my budget is not unlimited; this is not a case of 'if you can afford this, you can afford that as well'), and then I am not experienced enough to put it together myself (I can replace GPUs or RAM, add more fans and basic stuff like that, but setting up a custom loop, that's a completely different thing). I guess I would manage with a manual, but I would rather not. Additionally, the rest of my HW is like 5 years old at this point - if I am ever making an investment like that, it will be as part of a completely new rig.

Anyway, I guess I could get a watercooled version and put the radiator in front - I currently have 2x fans in there blowing air in. I suppose it should fit in there - a 240 mm/280 mm radiator, that is. But it's going to be like 3000 euros, I can see that already. If profanities were allowed, I would feel like swearing now :)
 

Ajay

Lifer
Jan 8, 2001
I expect GDDR7 once Samsung gets good yields on their 5nm. Samsung's customers are fleeing to TSMC, so that tells you how soon they expect that to happen.
Don’t confuse Samsung's memory business with their logic division. Samsung is the leading DRAM manufacturer for a reason.
 

Aapje

Golden Member
Mar 21, 2022
and my budget is not unlimited; this is not a case of 'if you can afford this, you can afford that as well'

What is your actual use case? Rendering? Do you actually need that 24 GB of VRAM? Otherwise a 4080 or 7800/7900 might be a better idea. Keep in mind that the rumor mill claims that AMD's next gen is exceptional, so I would definitely wait for that, to see if this is true. There might be great value in their platform if consumers still buy largely based on past performance and AMD makes good on the rumors.

@Ajay

Samsung's memory division might be coming up with great new designs, but if their fabs can't produce those, we still won't get them.

GDDR7 is going to require a shift to a smaller production process, just like DDR5 memory that is actually better than DDR4 presumably does.
 

JasonLD

Senior member
Aug 22, 2017
Samsung's memory division might be coming up with great new designs, but if their fabs can't produce those, we still won't get them.

GDDR7 is going to require a shift to a smaller production process, just like DDR5 memory that is actually better than DDR4 presumably does.

Samsung's memory is based on a completely different fabrication technology. Their supposed troubles with their logic process don't apply there.
 

Timmah!

Golden Member
Jul 24, 2010
What is your actual use case? Rendering? Do you actually need that 24 GB of VRAM? Otherwise a 4080 or 7800/7900 might be a better idea. Keep in mind that the rumor mill claims that AMD's next gen is exceptional, so I would definitely wait for that, to see if this is true. There might be great value in their platform if consumers still buy largely based on past performance and AMD makes good on the rumors.

@Ajay

Samsung's memory division might be coming up with great new designs, but if their fabs can't produce those, we still won't get them.

GDDR7 is going to require a shift to a smaller production process, just like DDR5 memory that is actually better than DDR4 presumably does.

Yup, I am an architect and do my own archviz. Those 24 GB are a must. I would certainly not pay like 2x the price over the x80 card just because of like 15 percent more CUDA cores.
AMD is of no consequence for me, sadly, as I use Octane render, which on Windows is strictly Nvidia.
 

Frenetic Pony

Senior member
May 1, 2012
Speaking of the 4070, knowing nVidia the first two products will be from AD103. What I won't say is whether they will actually be called the 4080 and 4070. Basically 82 and 68 SMs, same as the 3090 and 3080... just much (much) higher boost clocks. Oh, and 450 W TBP.

I want higher boost clocks; it'd be really helpful if the rumors pan out to these specs. But I'm not yet certain Ada is really built for that. Hypothetically, larger cache sizes help with that, reducing latency-hiding needs. But there are always other bottlenecks somewhere.

And it seems like Nvidia may still be an "AI first" company, with consumer GPUs being a slightly secondary concern. It'd be nice to see them go for a much more segment-focused division like AMD and Intel are doing. Heck, AMD has entirely separate architectures for HPC and consumer. Still, if the price is right for these cards then the price is right, even without super high clocks.

Yup, I am an architect and do my own archviz. Those 24 GB are a must. I would certainly not pay like 2x the price over the x80 card just because of like 15 percent more CUDA cores.
AMD is of no consequence for me, sadly, as I use Octane render, which on Windows is strictly Nvidia.

Could always look into UE? But otherwise Nvidia definitely has that all-encompassing software support right now. Would be cool to see AMD use their newfound wealth to try and match that...
 

Timmah!

Golden Member
Jul 24, 2010
I want higher boost clocks; it'd be really helpful if the rumors pan out to these specs. But I'm not yet certain Ada is really built for that. Hypothetically, larger cache sizes help with that, reducing latency-hiding needs. But there are always other bottlenecks somewhere.

And it seems like Nvidia may still be an "AI first" company, with consumer GPUs being a slightly secondary concern. It'd be nice to see them go for a much more segment-focused division like AMD and Intel are doing. Heck, AMD has entirely separate architectures for HPC and consumer. Still, if the price is right for these cards then the price is right, even without super high clocks.



Could always look into UE? But otherwise Nvidia definitely has that all-encompassing software support right now. Would be cool to see AMD use their newfound wealth to try and match that...

UE is kind of a dream, to make my stuff interactive. It is, however, rather difficult to get into. I mean, I found quite a nice online course for the more than acceptable price of about 200 euros to teach me how to do it, and I presume I would be able to learn it. But it's 30 hours of videos from the start - preparation of the scene in the modeling software, through all the exporting and importing, light-baking and setting up within UE - not to mention I might need to buy some additional assets... Bottom line, it's too much work to learn something that, while cool, I don't really have a clientele willing to pay for where I live. Not to mention I would have to do it in my free time, and as is, I would rather spend that in a different way than with more work.

On the topic of AMD, while they were always more than decent competition to Nvidia hardware-wise, aside from those few Polaris years, their negligence when it comes to the software side has pretty much locked me to Nvidia since 2009, when I started to use Octane. It's 2022 and this has not changed, and believe me, (as an example) I would happily pay less for a 6900 XT than a 3090 (presuming similar performance, as is the case with games), if only it functioned with Octane. Funny thing, Octane works on Mac with AMD cards, but not because of AMD - because of Apple and their Metal API. Since there is nobody like Apple to sort out AMD drivers or whatever is needed on the Windows platform, and AMD is unwilling or unable to do it themselves, Octane remains Nvidia-locked.
 

pj-

Senior member
May 5, 2015
Those power numbers are bananas. I run my 3090 at 300-330 W and even that seems to be close to the limit of what my 2x360 rad loop is happy with.
 

jpiniero

Lifer
Oct 1, 2010
Those power numbers are bananas. I run my 3090 at 300-330 W and even that seems to be close to the limit of what my 2x360 rad loop is happy with.

I am not totally convinced about AD104 being 400 W, but I do think that 450-600+ W is going to be where AD102 ends up.
 

Mopetar

Diamond Member
Jan 31, 2011
They're going to need to start including a substation in addition to an e-leash at this point.

I don't fault them for giving us the option, but I'm not sure if the cooling tech is going to be able to keep pace with the demands.
 

jpiniero

Lifer
Oct 1, 2010
They're going to need to start including a substation in addition to an e-leash at this point.

I don't fault them for giving us the option, but I'm not sure if the cooling tech is going to be able to keep pace with the demands.

For AD102 I think it's okay given what they are going to charge.