Discussion: Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


leoneazzurro

Senior member
Jul 26, 2016
905
1,430
136
Have we seen die size estimates yet, or can we make assumptions from CUDA core counts?

This feels like another naming masquerade where a small die pretends to be a cut-down big die, like the 3070, only worse since this one is pretending to be the xx80.

The figure of 608 mm² for AD102 seems confirmed; I have not yet seen estimates for AD104.
 
  • Like
Reactions: Tlh97 and Leeea

exquisitechar

Senior member
Apr 18, 2017
655
862
136
You know things are bad when people are fantasizing about how Lisa Su will launch the 7900XT.
Those people are going to be let down. AMD will undercut Nvidia at the top end, but not by much if they are competitive. AD102's raster performance shouldn't be difficult to match, but they'll probably lose in RT and won't have DLSS3 and such, for whatever that's worth. Personally, I don't care about DLSS3, since I expect it to have worse image quality than DLSS2.x. You can already see the temporal artifacts in Digital Foundry's "preview".

Navi33 has the potential to be interesting, and Navi32 should be a strong offering as well. If Navi33 really delivers cut-down Navi21 performance at 1080p and maybe 1440p while being such a small and cheap N6 die, it will make Ada look like a joke. In the end, though, AMD doesn't seem to care about gaining market share in the desktop market, so I wouldn't expect too much from them.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
DLSS 3 looks very impressive tbh - complete interpolation from two key frames. Because it's completely outside the rendering pipeline, it will just double the frame rate no matter what the bottleneck is, which you can see in things like MS Flight Simulator, which is CPU-bound. I'm waiting to see how good the quality is, but assuming it is decent, it's an amazing step forward.
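A quick sketch of the pacing arithmetic behind that "doubles the frame rate regardless of bottleneck" claim; this is not how DLSS 3 is actually implemented, just the "one generated frame per rendered frame" math, and the render rates below are made up:

```python
# Back-of-the-envelope pacing arithmetic for frame generation. NOT how DLSS 3
# is implemented internally; the render rates are invented examples.

def presented_fps(render_fps: float, generated_per_rendered: int = 1) -> float:
    """Each rendered frame is followed by N generated frames on the display."""
    return render_fps * (1 + generated_per_rendered)

def hold_back_ms(render_fps: float) -> float:
    """Interpolating *between* two rendered frames means the newer frame is
    held back roughly half a render interval before it can be shown
    (simplified; ignores the cost of generating the frame itself)."""
    return 0.5 * 1000.0 / render_fps

for scenario, fps in [("GPU-bound", 45.0), ("CPU-bound (e.g. MSFS)", 60.0)]:
    print(f"{scenario}: {fps:.0f} rendered -> {presented_fps(fps):.0f} presented, "
          f"~{hold_back_ms(fps):.1f} ms extra hold-back")
```

The doubling is independent of whether the CPU or the GPU is the limiter, which is why the CPU-bound MSFS case shows the same 2x as the GPU-bound one in this toy model.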
 
  • Like
Reactions: xpea

Tup3x

Senior member
Dec 31, 2016
944
925
136
DLSS 3 looks very impressive tbh - complete interpolation from two key frames. Because it's completely outside the rendering pipeline, it will just double the frame rate no matter what the bottleneck is, which you can see in things like MS Flight Simulator, which is CPU-bound. I'm waiting to see how good the quality is, but assuming it is decent, it's an amazing step forward.
It has the potential to be a really nice feature, but I really want to know how the latency is and whether there are obvious visual artifacts. It's good that they are trying these things. Maybe some day frame interpolation will be part of the graphics API, together with upsampling.
 

marcUK2

Member
Sep 23, 2019
74
39
61
The slides and the DLSS3 video are typical marketing silliness. Who runs with RT on but without DLSS? Someone at 1080p with a 2080 Ti, maybe a 3070 or better? Why not show DLSS 2.0 on a 3080 vs 3.0 on the 40 series card? Why does the lower left side say RTX is off, but under the 21 fps it says RT is on?

I was just interested in a completely fair comparison to determine the raw performance increase of the new chip, considering the transistor count has tripled.
No doubt DLSS3 will be enabled in any gaming scenario with RT.

But I wonder why the raw performance isn't tripled.
Why not just take the 3090 Ti chip architecture and triple it? Wouldn't that give better performance, all things being equal?
 
  • Like
Reactions: Tlh97 and Ranulf

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It has the potential to be a really nice feature, but I really want to know how the latency is and whether there are obvious visual artifacts. It's good that they are trying these things. Maybe some day frame interpolation will be part of the graphics API, together with upsampling.
Latency in general shouldn't be more of an issue than it is today; obviously lots of the things that matter for latency are not dependent on frame rate anyway (e.g. input lag, ping, and game/server tick rate are independent of fps), but where you do get it is with things like buffering several frames ahead for smoothness. I don't see why they would need to do that in particular here, but we will see.
I was just interested in a completely fair comparison to determine the raw performance increase of the new chip, considering the transistor count has tripled.
No doubt DLSS3 will be enabled in any gaming scenario with RT.

But I wonder why the raw performance isn't tripled.
Why not just take the 3090 Ti chip architecture and triple it? Wouldn't that give better performance, all things being equal?
If DLSS 3 essentially always doubles the frame rate using just the small amount of the die given over to tensor cores, it would seem to be a spectacularly good use of die space?
As to why the performance increase of everything else isn't linear, well, you have to look at where the bottlenecks are: if you make a 3x 3090 Ti but don't increase the memory bandwidth, or the CPU performance, or any of those other things, you don't get 3x the performance; you only gain until your worst bottleneck stops things improving further. Hence the chip designers have to use the available transistors as well as possible to balance those bottlenecks (e.g. the 4090 probably using a load more die space for caching to offset the memory bandwidth limitations).
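A toy-model version of that bottleneck argument, with invented stage numbers rather than real 3090 Ti or 4090 figures:

```python
# Toy model of the bottleneck point: a frame only completes as fast as the
# slowest stage allows, so tripling one resource (shading) doesn't triple the
# frame rate if memory bandwidth and CPU stay put. All numbers are invented.

def frame_rate(stage_caps_fps: dict) -> float:
    """Overall frame rate is capped by the slowest stage."""
    return min(stage_caps_fps.values())

baseline = {"cpu_submit": 140.0, "shading": 90.0, "memory_bw": 110.0}
tripled_shading = {**baseline, "shading": baseline["shading"] * 3}

print(frame_rate(baseline))         # 90.0  -> limited by shading
print(frame_rate(tripled_shading))  # 110.0 -> now limited by memory bandwidth
```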
 

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
On the memory bandwidth issue, remember that Samsung promised 27 Gbps GDDR6+ and 32 Gbps GDDR7 and has delivered neither.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
Latency in general shouldn't be more of an issue than it is today; obviously lots of the things that matter for latency are not dependent on frame rate anyway (e.g. input lag, ping, and game/server tick rate are independent of fps), but where you do get it is with things like buffering several frames ahead for smoothness. I don't see why they would need to do that in particular here, but we will see.

That's the problem. If you were just passively watching 60 FPS or 120 FPS, you most likely couldn't tell much difference, and you really wouldn't care about any minor difference you struggled to see.

Where the difference really matters is in the feel and the reactivity to your inputs: when you start flicking your gun around to a target, 120 FPS feels vastly faster and better able to track you.

DLSS 3 frame fakery might superficially look like 120 FPS, if you were just passively watching, which really doesn't even matter, but it will utterly fail to feel like 120 FPS, which is exactly where it does matter. DLSS 3 120 FPS will still feel like 60 FPS.

IMO, it's more a fake FPS boost, for marketing purposes, than an actual benefit for gaming purposes.

If NVidia succeeds in getting reviewers to simply start quoting DLSS 3 frame rates like they were real, it's the (slimy) marketing coup of the decade.
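The arithmetic behind that "still feels like 60 FPS" point, assuming generated frames don't sample new input (a simplification; real latency also depends on buffering and the cost of generating the frame):

```python
# Rough arithmetic for "DLSS 3 120 FPS still feels like 60 FPS": generated
# frames don't incorporate fresh input, so the interval between frames that
# can react to the player tracks the rendered rate, not the presented rate.

def input_reactive_interval_ms(rendered_fps: float) -> float:
    """Time between frames that actually incorporate fresh player input."""
    return 1000.0 / rendered_fps

def displayed_interval_ms(rendered_fps: float, generated_per_rendered: int = 1) -> float:
    """Time between frames shown on screen (rendered + generated)."""
    return 1000.0 / (rendered_fps * (1 + generated_per_rendered))

print(f"{input_reactive_interval_ms(60):.1f} ms between input-reactive frames")  # ~16.7
print(f"{displayed_interval_ms(60):.1f} ms between displayed frames")            # ~8.3
```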
 

Karnak

Senior member
Jan 5, 2017
399
767
136
DLSS 3 frame fakery might superficially look like 120 FPS, if you were just passively watching, which really doesn't even matter, but it will utterly fail to feel like 120 FPS, which is exactly where it does matter. DLSS 3 120 FPS will still feel like 60 FPS.

IMO, it's more a fake FPS boost, for marketing purposes, than an actual benefit for gaming purposes.
Pretty much this and I can't wait for the actual reviews pointing that one out.

Do a blind test between 120 fps DLSS3 and 120 fps native, and I can guarantee you that even people who are not into the hardware business and are more like casual gamers will know/feel that there's something wrong, although both numbers are the same.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
The prices are a gut punch. I literally felt my stomach sink when I saw them. If AMD follows suit, I don't think I can be an enthusiast anymore.

I thought I would be angrier; you know, Nvidia being scumbags is a meme and all, but I just feel depressed lol.
And for $1600, you don't even get DisplayPort 2.0 outputs. Nvidia has seen fit to stick with DP 1.4a.

Complete insanity.
 
  • Like
Reactions: Tlh97 and Leeea

Revolution 11

Senior member
Jun 2, 2011
952
79
91
So it sounds like DLSS 3.0 is just there to falsely jack up frame rates by creating frames. It sounds like it would make 60 fps suddenly look like 120 fps. And of course it's 40 series only, so the 40 series will look way better than the 30 series.
What exactly is the non-RT, non-DLSS 3.0 pure rasterization performance improvement vs Ampere? I see a small mention of 2x in some articles, but there is no emphasis on this number.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Maybe my reading comprehension fell off a truck, but when are we going to get final specs? So to clarify: we are getting a PCIe Gen 4 card that will require a new ATX 3.0 PSU for the required power cable, and it will most likely have beta drivers that need to be tweaked for DLSS.

If AMD brings out a Gen 5 card in November, it will be the better option going forward. If you're building a new system using AM5, do you really want a Gen 4 card, especially if you're a serious hobbyist?

And do you want a "next-gen" card that doesn't even have DP 2.0? I don't know what Nvidia was thinking here; at least pretend to future-proof your new cards.
 
  • Like
Reactions: Tlh97 and Leeea

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
What exactly is the non-RT, non-DLSS 3.0 pure rasterization performance improvement vs Ampere? I see a small mention of 2x in some articles, but there is no emphasis on this number.

I don't think we will know the actual improvements for anything until some reputable reviewers get their hands on them. The numbers they put out with DLSS 3.0 being used can be thrown right out.

And we don't really have much for non-RT games in the stuff they released. The whole presentation was like 4 minutes, as if it was some sort of magic show and they didn't want anybody to watch too long or they might catch on.
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
If NVidia succeeds in getting reviewers to simply start quoting DLSS 3 frame rates like they were real, it's the (slimy) marketing coup of the decade.

What else would you really expect from NV?

After seeing these numbers, are we really surprised EVGA called it quits? The AIBs will have to buy the chips from NV at an outrageous price to be available at launch, and then, probably after the AMD launch, will immediately have to drop prices by several hundred dollars and take the loss themselves.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Nice to know if I buy into an Nvidia feature it'll be replaced in a couple of years.
Well, that and the fact that developers (outside of the handful of AAA titles with direct Nvidia support) rarely target features that only a very small percentage of gamers have hardware support for. By the time DLSS 3.0 is in the hands of a sufficient share of gamers and there are more than a couple of games that support it, it will be 2025 or later.
 

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
What else would you really expect from NV?

After seeing these numbers, are we really surprised EVGA called it quits? The AIBs will have to buy the chips from NV at an outrageous price to be available at launch, and then, probably after the AMD launch, will immediately have to drop prices by several hundred dollars and take the loss themselves.

I still don't get why people think AMD will go cheap.

Besides, according to EVGA, they told nVidia back in April that they were getting out. Which, based upon what they said, would have been before they were told any real details about how nVidia was planning to price the 40 series.

I'd say EVGA's exit is indeed mostly about wanting to get away from nVidia, but also about uncertainty over mining, and especially about when (if?) you will see "mid range" parts from either nVidia or AMD because of mining. I bet you will see EVGA make AMD GPUs, but they will sit this gen out.
 
  • Like
Reactions: SMU_Pony and Leeea

Furious_Styles

Senior member
Jan 17, 2019
492
228
116
If NVidia succeeds in getting reviewers to simply start quoting DLSS 3 frame rates like they were real, it's the (slimy) marketing coup of the decade.

They have been doing this for a while, and to a certain extent all companies try to stack the deck in their favor with this kind of stuff. Intel certainly does. That's why waiting for independent reviews before buying is always the best bet.
 
  • Like
Reactions: Tlh97 and Leeea

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
Well, 24 hours later, internet reaction to the 40 series announcement has been overwhelmingly negative. People are rightly upset about the 4070>4080 renaming along with the 80% price hike. Quite a few are saying they've had enough and are switching to AMD, especially with EVGA gone. It'll be interesting to see how it shakes out. There is a glut of 30 series cards on the market, and they aren't exactly selling even at their current discounted prices now that there is no mining demand. I don't really see the 40 series selling well in the current market conditions. I think they'd sell well if nvidia had only hiked the price by $100, but they did a flat $400-500 price increase, pushing the high mid-range solidly into $1000 territory. The price is just too damn high, and there will be more and more used cards coming onto the market in the coming months, making the 40 series even less appealing.

I know that nvidia doesn't care, since the bulk of their money comes from datacenter CUDA sales, and I guess they're betting on "mindshare" in the gaming segment, but reading all the reactions, their "mindshare" is taking a real hit right now. Gamers have been struggling to find an affordable card for the past two years; they're tired and cranky waiting for a viable upgrade, and hiking the next-gen price by 80% is a real punch in the nuts right now. We'll see what happens to nvidia's mindshare in the coming years. A lot will depend on AMD and what their RDNA3 looks like.

Personally, I'll be waiting for N31/N32 details. I'm certain AMD will undercut nvidia as usual. The question is by how much; as somebody has said, nvidia left a hole big enough to drive a dump truck through with their pricing.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,792
5,752
136
Well, 24 hours later, internet reaction to the 40 series announcement has been overwhelmingly negative. People are rightly upset about the 4070>4080 renaming along with the 80% price hike. Quite a few are saying they've had enough and are switching to AMD, especially with EVGA gone. It'll be interesting to see how it shakes out. There is a glut of 30 series cards on the market, and they aren't exactly selling even at their current discounted prices now that there is no mining demand. I don't really see the 40 series selling well in the current market conditions. I think they'd sell well if nvidia had only hiked the price by $100, but they did a flat $400-500 price increase, pushing the high mid-range solidly into $1000 territory. The price is just too damn high, and there will be more and more used cards coming onto the market in the coming months, making the 40 series even less appealing.

I know that nvidia doesn't care, since the bulk of their money comes from datacenter CUDA sales, and I guess they're betting on "mindshare" in the gaming segment, but reading all the reactions, their "mindshare" is taking a real hit right now. Gamers have been struggling to find an affordable card for the past two years; they're tired and cranky waiting for a viable upgrade, and hiking the next-gen price by 80% is a real punch in the nuts right now. We'll see what happens to nvidia's mindshare in the coming years. A lot will depend on AMD and what their RDNA3 looks like.

Freaking Ethereum. It showed Nvidia they could hit us with huge price hikes in Turing and now Lovelace. I never thought I'd see a worse card than the 2080, but JFC at the 4080 12GB.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
I still don't get why people think AMD will go cheap.

Yeah, AMD under Lisa Su has always priced in line with NVidia, maybe at a small discount. But hope springs eternal; maybe they have a real cost advantage with chiplets and decide to kick Nvidia while they're down (mining crash).


Besides, according to EVGA, they told nVidia back in April that they were getting out. Which, based upon what they said, would have been before they were told any real details about how nVidia was planning to price the 40 series.

Yeah, I watched the videos. I buy that the CEO was just sick of NVidia's business practices across the board; he was way past the last straw. Nothing unique about the upcoming generation, just more of the same old crap, and the time to bail out was before signing more contracts for a new generation.


I bet you will see EVGA make AMD GPUs, but they will sit this gen out.

IMO, they really can't sit out a generation, as they would have to let most of their GPU staff go. They need to be in talks with AMD soon, or I expect they are really out of GPUs for good.

Also, I would think AMD would be quite interested in partnering with them. It would be a nice dig at NVidia if Lisa Su could welcome new board partner EVGA during the Radeon 7000 series release presentation.

A significantly lower priced 4090 competitor, with EVGA signed up to build them: that would be so sweet. :D
 

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
Well, 24 hours later, internet reaction to the 40 series announcement has been overwhelmingly negative. People are rightly upset about the 4070>4080 renaming along with the 80% price hike. Quite a few are saying they've had enough and are switching to AMD, especially with EVGA gone. It'll be interesting to see how it shakes out. There is a glut of 30 series cards on the market, and they aren't exactly selling even at their current discounted prices now that there is no mining demand. I don't really see the 40 series selling well in the current market conditions. I think they'd sell well if nvidia had only hiked the price by $100, but they did a flat $400-500 price increase, pushing the high mid-range solidly into $1000 territory. The price is just too damn high, and there will be more and more used cards coming onto the market in the coming months, making the 40 series even less appealing.

I know that nvidia doesn't care, since the bulk of their money comes from datacenter CUDA sales, and I guess they're betting on "mindshare" in the gaming segment, but reading all the reactions, their "mindshare" is taking a real hit right now. Gamers have been struggling to find an affordable card for the past two years; they're tired and cranky waiting for a viable upgrade, and hiking the next-gen price by 80% is a real punch in the nuts right now. We'll see what happens to nvidia's mindshare in the coming years. A lot will depend on AMD and what their RDNA3 looks like.

Personally, I'll be waiting for N31/N32 details. I'm certain AMD will undercut nvidia as usual. The question is by how much; as somebody has said, nvidia left a hole big enough to drive a dump truck through with their pricing.

The "mindshare" is ridiculous considering a 3050 costs the same or more than an RX 6600 that will beat the snot out of it. That is the problem.