Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation

Page 83 - AnandTech Forums

leoneazzurro

Senior member
Jul 26, 2016
1. That beefed-up AD106 would cost more (an extra 25-30mm² and 4GB of VRAM), true, but performance would also be better. Nvidia could charge more for it, which would naturally result in a higher laptop price; that is bad, but still cheaper than an RTX 4080 12GB and more future-proof than only 8GB of VRAM.

2. As you said, space is a problem. If they used HBM it would be solved, but they keep using GDDR6. With HBM they wouldn't even need a big L2 cache, and the memory PHY would also be smaller. The amount of VRAM would also no longer be a problem.

3. The current AD106 is a "big die" paired with only a 128-bit bus. It is already bottlenecked.

1. They are maximizing their profits, not doing the users any favors. These dies are made for the lower end of the market; their reasoning is that if you need more FPS, you'll buy the higher-end GPUs in bigger notebooks and pay more. Most laptop users don't know even a third of the technical details we are discussing here, and in any case most people do not want to spend $2K or more on a mainstream laptop. The same reasoning explains AMD producing N33 on N6 instead of N5, and even making a smaller chip than the previous generation's: margins, and keeping costs low, especially on lower-end parts.

2. HBM on a mainstream product? Sure. There must be a reason it does not happen.

3. AD106 is not a big die in absolute terms; it is less than 200mm². As such, even finding the space for the pins of a memory bus wider than 128-bit with standard packaging is probably difficult.

In any case, the 4050/4060 and the 4080 mobile (especially the latter) seem to be the best deals: they offer better performance and perf/W than the older generation and will probably be sold at the same price. The 4080 Mobile could be my next GPU; it will depend on what N32 turns out to be.

Edit: it is a bit of a scandal anyway, because looking at the prices in barebone laptop configurators, these mobile cards are quite expensive, on par with their desktop versions, while being considerably less powerful and not even including the massive cooler of the desktop card. But alas, there isn't much choice (maybe when N32 arrives, but I doubt it).
 

Dribble

Platinum Member
Aug 9, 2005
Honestly speaking, I also don't understand the point of releasing AD106 with those specs.
The laptop market isn't like the desktop one, in that you sell mostly to large manufacturers, and what they want isn't what we want. They will have told Nvidia how much the 40x0 chips at each tier must cost. They will have said it must be a performance upgrade, but that cost matters more than the size of the upgrade; that availability is also key (lack of supply affects a manufacturer's sales, so there must be enough supply); and that power usage is a high priority. Nvidia has delivered exactly what they want. Going down a chip size makes it easier to keep up with demand and lowers power requirements, yet it is still a performance upgrade with new features, which is what makes sales. The manufacturers will be happy.
 

TESKATLIPOKA

Golden Member
May 1, 2020
36 is 50% more than 24 SMs. That's a fairly substantial step.
The die size is only 19% bigger, and 50% more SMs yields only 20-25% better performance, so it has a bottleneck, though the lower clock is also at fault here.
With only a 128-bit bus and 8GB, it's not a very appealing product.
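The bandwidth side of that bottleneck argument is easy to sanity-check with the standard formula for GDDR memory: bus width in bits divided by 8, times the per-pin data rate. A minimal Python sketch, assuming 18 Gbps GDDR6 purely for illustration:

```python
def gddr_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed 18 Gbps GDDR6, for illustration only.
print(gddr_bandwidth_gbps(128, 18.0))  # 288.0 GB/s on a 128-bit bus
print(gddr_bandwidth_gbps(192, 18.0))  # 432.0 GB/s on a 192-bit bus (+50%)
```

So a 192-bit bus at the same memory speed would add 50% bandwidth, matching the 50% SM increase that, per the post, only returns 20-25% more performance on the 128-bit configuration.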
 

TESKATLIPOKA

Golden Member
May 1, 2020
1. They are maximizing their profits, not doing the users any favors. These dies are made for the lower end of the market; their reasoning is that if you need more FPS, you'll buy the higher-end GPUs in bigger notebooks and pay more. Most laptop users don't know even a third of the technical details we are discussing here, and in any case most people do not want to spend $2K or more on a mainstream laptop. The same reasoning explains AMD producing N33 on N6 instead of N5, and even making a smaller chip than the previous generation's: margins, and keeping costs low, especially on lower-end parts.

2. HBM on a mainstream product? Sure. There must be a reason it does not happen.

3. AD106 is not a big die in absolute terms; it is less than 200mm². As such, even finding the space for the pins of a memory bus wider than 128-bit with standard packaging is probably difficult.

In any case, the 4050/4060 and the 4080 mobile (especially the latter) seem to be the best deals: they offer better performance and perf/W than the older generation and will probably be sold at the same price. The 4080 Mobile could be my next GPU; it will depend on what N32 turns out to be.

Edit: it is a bit of a scandal anyway, because looking at the prices in barebone laptop configurators, these mobile cards are quite expensive, on par with their desktop versions, while being considerably less powerful and not even including the massive cooler of the desktop card. But alas, there isn't much choice (maybe when N32 arrives, but I doubt it).
1. I just checked prices at the biggest shop in my country: of 14 laptops with an RTX 4070, eleven are over €2000 and only a single one is under €1850; to be precise, it costs €1659 including taxes. I can understand maximizing profit and keeping costs down, but these prices don't look lower-end to me.

2. I'm not sure what the reason is, price or supply? But using HBM would allow more VRAM and also save die space, motherboard space, and power.

3. The GTX 1060 was 200mm² and had a 192-bit bus. If you added an additional 64-bit PHY plus 4 SMs or more ROPs, you would have the size needed for those pins.

4. The RTX 4070 is the worst of them, true. I myself want to buy a new laptop, but paying €2499 just to get an RTX 4080 with 12GB of VRAM is not to my liking, especially when it comes with only 16GB of DDR5 and a 512GB SSD. I would at least wait for N32 to see its performance, but I am pretty skeptical about the price.
 

leoneazzurro

Senior member
Jul 26, 2016
1. I just checked prices at the biggest shop in my country: of 14 laptops with an RTX 4070, eleven are over €2000 and only a single one is under €1850; to be precise, it costs €1659 including taxes. I can understand maximizing profit and keeping costs down, but these prices don't look lower-end to me.

2. I'm not sure what the reason is, price or supply? But using HBM would allow more VRAM and also save die space, motherboard space, and power.

3. The GTX 1060 was 200mm² and had a 192-bit bus. If you added an additional 64-bit PHY plus 4 SMs or more ROPs, you would have the size needed for those pins.

4. The RTX 4070 is the worst of them, true. I myself want to buy a new laptop, but paying €2499 just to get an RTX 4080 with 12GB of VRAM is not to my liking, especially when it comes with only 16GB of DDR5 and a 512GB SSD. I would at least wait for N32 to see its performance, but I am pretty skeptical about the price.

1. Yes, but there is also the novelty premium. Models with lower prices will quite probably appear, but even those will be well over €1K for that class of machine; I think around €1.3K-1.5K (i.e. the TUF series, HP Omens, and so on) is a safe bet. And of course there will be even pricier laptops with this GPU, depending on build quality and configuration.

2. I'd say both: HBM costs are still high, availability has improved but datacenter products are absorbing most of the production, and there is also the need for more expensive packaging.

3. Yes, the GTX 1060 was a little bigger than AD106 and had a 192-bit bus, but that was for GDDR5, and I personally don't know whether GDDR6 needs additional contacts compared with the previous standard, or whether extra pins are needed for other signalling. Not to mention that memory interfaces are among the parts that scale worst with process, so a bigger memory bus would have led to a larger die-size increase than we can estimate at the moment (die shots could help here) on the very expensive 4N process. In any case, the balance of these dies is decided according to the market, and not only to maximize the throughput of the die itself.

4. Same here: I can afford it, and I change laptops every 3-4 years, but saving money can't hurt.
 

guidryp

Diamond Member
Apr 3, 2006
The die size is only 19% bigger, and 50% more SMs yields only 20-25% better performance, so it has a bottleneck, though the lower clock is also at fault here.
With only a 128-bit bus and 8GB, it's not a very appealing product.

Die Size: These are still early reports, and even if the final size does come out that close, it's getting into territory where a bunch of things do not shrink when you remove SMs: 2D sections, media encoders, memory controllers, etc. Their real problem here isn't the AD106 not being bigger; it's the AD107 not being smaller. This is, after all, a game of making your die as small as you can get away with.

Performance: You are looking at early reports from laptops that have MANY limitations. These chips will also appear in discrete cards, where performance will open up more and be consistent. Then we can make a real comparison.

Memory Bus: If you upped it to a 192-bit bus, then you would be complaining about AD104 only having 192 bits... The reality is that the whole lineup above AD107 is cutting it close on memory bus width, because like many things it's an engineering tradeoff: memory controllers don't really scale, and they eat up die space.
 

jpiniero

Lifer
Oct 1, 2010
The fixed-function part of the Ada die is quite large, and the OFA seemingly takes up a decent amount of space too.

It's possible that AD107 was intended to have a 96-bit bus and 16 MB of L2, but they ultimately decided that wasn't good enough, or it didn't work out shape-wise.
 

TESKATLIPOKA

Golden Member
May 1, 2020
Die Size: These are still early reports, and even if the final size does come out that close, it's getting into territory where a bunch of things do not shrink when you remove SMs: 2D sections, media encoders, memory controllers, etc. Their real problem here isn't the AD106 not being bigger; it's the AD107 not being smaller. This is, after all, a game of making your die as small as you can get away with.

Performance: You are looking at early reports from laptops that have MANY limitations. These chips will also appear in discrete cards, where performance will open up more and be consistent. Then we can make a real comparison.

Memory Bus: If you upped it to a 192-bit bus, then you would be complaining about AD104 only having 192 bits... The reality is that the whole lineup above AD107 is cutting it close on memory bus width, because like many things it's an engineering tradeoff: memory controllers don't really scale, and they eat up die space.
1.) You are right that AD107 is pretty big for these specs.
If I removed from AD107 the same die area as the difference between the supposed sizes of AD106 and AD107 (186-156 = 30mm²), I would end up with a 126mm² chip with 12 SMs, 1536 CUDA cores, 48 TMUs, and 128-bit; I'm not sure about the ROPs or L2. That's a very big chip for those specs on N4, and performance would also be bad.

2.) Of course, comparing desktop parts would be by far the best, but we will have to wait a bit for that. I think they could be clocked very high, but bandwidth will be a limitation. I wouldn't be surprised if Nvidia used GDDR6X or at least 20Gbps GDDR6.

3.) Yes, but then I would prefer AD104 with 256-bit and AD103 with 320-bit, because I would change the specs to these:
AD106 -> 48SM, 6144 CUDA, 192 TMU, 64 ROP, 32MB L2, 192-bit, 12GB VRAM
AD104 -> 72SM, 9216 CUDA, 288 TMU, 96 ROP, 48MB L2, 256-bit, 16GB VRAM
AD103 -> 96SM, 12288 CUDA, 384 TMU, 128 ROP, 64MB L2, 320-bit, 20GB VRAM
Yes, I know this won't happen.
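As an aside, the 126mm² figure in point 1 falls out of a simple linear die-area model. The sketch below fits area = fixed overhead + per-SM slope to the two rumored die sizes from the thread (186mm² for the 36-SM AD106, 156mm² for the 24-SM AD107), so the numbers are only as good as those rumors:

```python
# Linear die-area model fit to the two rumored Ada die sizes:
#   area = fixed_overhead + area_per_sm * sm_count
ad106_sms, ad106_mm2 = 36, 186.0  # rumored AD106
ad107_sms, ad107_mm2 = 24, 156.0  # rumored AD107

area_per_sm = (ad106_mm2 - ad107_mm2) / (ad106_sms - ad107_sms)
fixed_overhead = ad106_mm2 - area_per_sm * ad106_sms

print(area_per_sm)                        # 2.5 mm^2 per SM
print(fixed_overhead)                     # 96.0 mm^2 that never shrinks
print(fixed_overhead + area_per_sm * 12)  # 126.0 mm^2 for a hypothetical 12-SM part
```

Under this (very rough) fit, about half of AD106's area does not scale with SM count at all, which is exactly guidryp's point about why AD107 can't get much smaller.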
 

Aapje

Golden Member
Mar 21, 2022
because like many things it's an engineering tradeoff: memory controllers don't really scale, and they eat up die space.

That's all fine and good, but if it makes for an unattractive product, it's a bad tradeoff.

For me, the VRAM levels of the Nvidia cards have rapidly become unacceptable, especially at these price points.

Their real problem here isn't about the AD106 not being bigger, it's more about the AD107 not being smaller.

And the prices they ask for it. Small dies require small prices.
 

guidryp

Diamond Member
Apr 3, 2006
That's all fine and good, but if it makes for an unattractive product, it's a bad tradeoff.

For me, the VRAM levels of the Nvidia cards have rapidly become unacceptable, especially at these price points.

If the forum opinions about how unattractive Nvidia cards are actually mattered, then Nvidia would have gone out of business years ago.

Clearly the opinions of the large bulk of people who buy cards are drastically different from those in this forum.
 

Aapje

Golden Member
Mar 21, 2022
If the forum opinions about how unattractive Nvidia cards are actually mattered, then Nvidia would have gone out of business years ago.

This complaint is actually more relevant for the current gen, especially when the 4060 will get 8 GB. But you don't seem to actually be reading my comment and are apparently just responding from your gut.
 

guidryp

Diamond Member
Apr 3, 2006
This complaint is actually more relevant for the current gen, especially when the 4060 will get 8 GB. But you don't seem to actually be reading my comment and are apparently just responding from your gut.

A 60 series having 8GB is totally fine. You aren't going to run Max settings 4K on a 60 series.

There are 70/80/90 series for higher resolutions and settings.
 

Thunder 57

Platinum Member
Aug 19, 2007
A 60 series having 8GB is totally fine. You aren't going to run Max settings 4K on a 60 series.

There are 70/80/90 series for higher resolutions and settings.

NVIDIA is doing fine because people buy NVIDIA even when their cards cost more and perform worse than AMD cards. Will the 4060 be just fine if the rumor that it is the same as the laptop 4060 turns out to be true? A 128-bit memory bus, fewer CUDA cores, a TDP of 115W or so? The 4000 series, other than the 4090, has been quite the disappointment, and even the 4090 is overpriced. It seems like they were counting on DLSS3 to sell this generation.

Oh and the 4090 is a power hog, and it and the 4080 are massive. You could probably use them as murder weapons in lieu of a baseball bat.
 

coercitiv

Diamond Member
Jan 24, 2014
If the forum opinions about how unattractive Nvidia cards are actually mattered, then Nvidia would have gone out of business years ago.

Clearly the opinions of the large bulk of people who buy cards are drastically different from those in this forum.
Or you're just selectively remembering events in the past few years.

This forum had major issues with the price/performance of the 2000 series. Many here considered it an insult. After "meh" sales, Nvidia decided to launch the SUPER cards, which many said were the products they had expected at the original launch. Apparently the opinions of the large bulk of people who buy cards were drastically close to those in this forum.

For the 3000 series, this forum also complained that some cards had issues with VRAM size, and even with the massive crypto demand eating everything produced, Nvidia decided to add VRAM options.

For the 4000 series we laughed at the 4080 12GB. The market was willing to buy it, but Jensen himself is reading our forums and decided to unlaunch the card. To this day, consumers still apply 4080 stickers over their 4070 Ti cards. Product boxes are also lovingly adjusted with $899 price tags. If only those Anandtech haters hadn't spoken!
 

Aapje

Golden Member
Mar 21, 2022
Indeed. My prediction is that unless they make a significant change, this is going to be a record-setting generation, with lower sales than any generation in the last 20 years.
 

guidryp

Diamond Member
Apr 3, 2006
Or you're just selectively remembering events in the past few years.

Look in the mirror.

For the 3000 series, this forum also complained that some cards had issues with VRAM size, and even with the massive crypto demand eating everything produced, Nvidia decided to add VRAM options.

Like the 1060 3GB/6GB? That isn't just a VRAM option: the 12GB has more cores as well. This didn't come into existence because of forum whining.

Plus, that's also just one card. Where is the more-VRAM option of the 3070/3070 Ti, which were so "unattractive" with "only" 8GB that they sold above MSRP for two years, and are still selling at MSRP or above?

The histrionics over 8GB VRAM around here are just plain silly and out of touch.

8GB would be a problem on high end 4K cards, but isn't on lower tier cards.

The top 5 cards on the Steam HW survey all have 6GB of VRAM or less. There is no danger of 8GB cards suddenly going obsolete.
 

Aapje

Golden Member
Mar 21, 2022
Plus, that's also just one card. Where is the more-VRAM option of the 3070/3070 Ti, which were so "unattractive" with "only" 8GB that they sold above MSRP for two years, and are still selling at MSRP or above?

Those sales were in large part due to miners, and as soon as that market cratered, so did sales. Of course it wasn't unattractive for mining because, AFAIK, 8 GB has always been enough for Ethereum.

So you just proved him right when he complained about you selectively remembering history.

8GB would be a problem on high end 4K cards, but isn't on lower tier cards.

8 GB is fine for 1080p or for very budget 1440p gaming (but the price had better reflect that). I think 10 GB is the absolute minimum for a more serious 1440p card, but 12 GB is more reasonable.

And with the expected performance and prices, I see a 4060 as a fairly serious 1440p card. 8 GB is much more reasonable for the 4050, although Nvidia is probably going to ask way too much again and AMD will again be a way better choice at that tier.
 

guidryp

Diamond Member
Apr 3, 2006
Those sales were in large part due to miners, and as soon as that market cratered, so did sales. Of course it wasn't unattractive for mining because, AFAIK, 8 GB has always been enough for Ethereum.

So you just proved him right when he complained about you selectively remembering history.

GPU mining was obliterated more than 5 months ago by the Ethereum switch to PoS, and the 3070 has stayed at or above (mostly above) MSRP since then. The 3070 is also already a top-10 card among Steam gamers (HW survey), so it looks like large numbers of them ended up with gamers.

8 GB is fine for 1080p or for very budget 1440p gaming (but the price had better reflect that). I think 10 GB is the absolute minimum for a more serious 1440p card, but 12 GB is more reasonable.

That's a meaningless opinion not backed by anything. I checked the "8GB is not Enough" thread, and the main attack on the 3070 seems to be Far Cry 6 running at 4K with HD textures, which just seems to be an anomaly...

That isn't exactly strong evidence that it would be a problem at 1440p.
 

Aapje

Golden Member
Mar 21, 2022
GPU mining was obliterated more than 5 months ago by the Ethereum switch to PoS, and the 3070 has stayed at or above (mostly above) MSRP since then. The 3070 is also already a top-10 card among Steam gamers (HW survey), so it looks like large numbers of them ended up with gamers.

And yet overall sales are very low, so this doesn't prove that the 3070 is attractive. Most of the 3000 lineup had low VRAM anyway, aside from the 3090, which was a very bad deal for gamers due to its lacking performance for the extra cost.

That's a meaningless opinion not backed by anything. I checked the "8GB is not Enough" thread, and the main attack on the 3070 seems to be Far Cry 6 running at 4K with HD textures, which just seems to be an anomaly...

That isn't exactly strong evidence that it would be a problem at 1440p.

I see more and more people reporting issues with 8 GB.
 

guidryp

Diamond Member
Apr 3, 2006
And yet overall sales are very low, so this doesn't prove that the 3070 is attractive. Most of the 3000 lineup had low VRAM anyway, aside from the 3090, which was a very bad deal for gamers due to its lacking performance for the extra cost.

Of the top 10 GPUs on the Steam HW survey, half are 3000 series, so it looks like the 3000 series sold VERY well to gamers.
 

leoneazzurro

Senior member
Jul 26, 2016
Deep Analyze RTX4050 vs RTX 4060 vs RTX4070 of Gaming Laptops! | BIBA Laptops - YouTube

A more comprehensive roundup of the 4050, 4060 and 4070 mobile performance in several games and tests.

EDIT: also, there is an explanation for the lower-than-expected performance of some GPUs: Nvidia set another limit that kicks in before the power limit, namely a voltage limit, which caps the clock at a set value per card even when the power limit is not reached. This means that even if the theoretical power limit is 140W, these GPUs consume around 100-110W and no more. If you want more performance, you have to overclock them manually and possibly find a way past the voltage wall. As for why there is a voltage wall, we can only speculate...

Really btw, this review is something else... Kudos to the authors, it was really complete and informative.
 

TESKATLIPOKA

Golden Member
May 1, 2020
(Benchmark screenshots from Notebookcheck.net)
43% difference in Time Spy at 115W.
39% difference in Witcher 3 at 115W.
40% difference in Cyberpunk at 115W.
You could say RTX 4050 is doing surprisingly well.
On the other hand, the RTX 4070 needs only half that TDP to match a 100W RTX 4050, which is also very nice.
 

Thunder 57

Platinum Member
Aug 19, 2007
