Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation

Page 38

exquisitechar

Senior member
Apr 18, 2017
657
871
136
Looks like the Twitter leakers will have egg on their faces.
 
  • Haha
Reactions: Kaluan

SteveGrabowski

Diamond Member
Oct 20, 2014
6,887
5,824
136
So wait, after all this time we have a 4080 12GB at $899 MSRP that just ties the 3090 Ti with DLSS3.0 off? Am I reading that correctly?
The same 3090 Ti that's only like 10% faster at 4k than a 12GB 3080 Ti, a card currently selling for under $750?

Woof.

LOL glad I didn't watch this garbage. What a joke, I hope Nvidia has to eat all those 30 series chips those vultures still have in stock.
 
  • Like
Reactions: Kaluan

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
With the recent ones, sure. The performance doesn't. >2-2.5x GA102/Navi21 performance seems like a delusion right now.

Yes, but NVidia is actually claiming up to 4X performance in the video. So the leakers are basically confirming what the video says.

Yes, that claim is misleading, but I wouldn't blame the leakers for that. They had both the specs and NVidia's outlandish performance claims correct.

Previous attachment showing the up to 4x with RT and DLSS 3:

View attachment 67830
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
There is no freaking way NV doesn't do a massive price drop when the Ampere inventory sells out or does the mid gen refresh "Ti model 1 tier down is within 5% performance of the non-ti model 1 tier up" shenanigans.
Ampere production on Samsung 8nm is continuing. They aren't going to "sell out" of Ampere anytime soon. Total guess, but I see Ampere production continuing well into 2023, and likely until 4N is the new "value" node, which would be 2024, when 3nm starts production for PC parts.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
So wait, after all this time we have a 4080 12GB at $899 MSRP that just ties the 3090 Ti with DLSS3.0 off? Am I reading that correctly?
The same 3090 Ti that's only like 10% faster at 4k than a 12GB 3080 Ti, a card currently selling for under $750?

Woof.
nvidia priced the 40 series to sell the 30 series. MSRP-wise, the 4090 is the only one that brings a real, meaningful increase in performance/$. All the other cards have little to no increase. As expected from an arrogant monopolistic company that has a glut of 30 series to sell.
 

Saylick

Diamond Member
Sep 10, 2012
3,157
6,378
136
Very unlikely to fit 76B in ~600mm2 which is 126 MTr/mm2 (iPhone SoC is 132 MTr/mm2) , Clocks will get nerfed. Hopper is only 98 MTr/mm2.
Your logic makes sense. We'll just have to wait until we get a teardown to confirm. Speaking of the die shot, I didn't see what should be an obvious 96 MB of L2 cache on the one that Nvidia provided. I know it's a render, but even the render didn't show an obvious bank of cache.
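The density figures being argued over are simple arithmetic. A quick sanity check on the quoted numbers (the 76B transistor count and ~600mm² area are the thread's assumed figures, not official specs):

```python
# Transistor-density sanity check using the figures quoted in the thread
# (76B transistors, ~600 mm^2 die). These are assumptions, not official specs.
ad102_transistors_b = 76.0   # billions of transistors
ad102_area_mm2 = 600.0       # approximate die area in mm^2

density_mtr_mm2 = ad102_transistors_b * 1000 / ad102_area_mm2
print(f"~{density_mtr_mm2:.0f} MTr/mm^2")  # roughly the ~126 MTr/mm^2 figure above
```

That lands close to the ~126 MTr/mm² quoted, which is indeed well above Hopper's ~98 MTr/mm², hence the skepticism about clocks.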
 

Borealis7

Platinum Member
Oct 19, 2006
2,914
205
106
the GTC presentation is VERY impressive. say what you will about nVidia - expensive, power hungry, greedy, whatever - the technology and the tools they have developed are AMAZING. this goes far beyond gaming. nVidia is operating in the upper atmosphere of AI and simulations while AMD is still benching Forza Horizon. i'm really wondering if these cards are aimed at gamers at all.
 

Saylick

Diamond Member
Sep 10, 2012
3,157
6,378
136
Is it just me or are there no IPC gains for Lovelace?

Full AD103 = Full GA102 = 84 SM = 10752 shaders
4080 16 GB = 76 SM = 9728 shaders
Ratio = ~0.91

Rated boost clocks for 4080 16GB = 2.5 GHz (actual clocks likely higher)
Rated boost clocks for 3090 Ti = ~1.86 GHz (actual clocks are higher at ~2 GHz)
Ratio = ~1.34

Final ratio, excluding the reduction in memory bandwidth due to a smaller 256-bit bus = 1.22x
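The back-of-envelope scaling above is just shaders × clocks. As a sketch (using the rated specs quoted in the post, with 128 CUDA cores per SM for both architectures; actual clocks and memory bandwidth will move the real number):

```python
# Raw shader-throughput ratio: 4080 16GB (AD103) vs 3090 Ti (full GA102),
# per the figures quoted above. Rated boost clocks, not measured clocks.
shaders_per_sm = 128                 # CUDA cores per SM, Ampere and Ada alike
ga102_sm, ad103_sm = 84, 76          # full GA102 vs 4080 16GB SM counts
ga102_ghz, ad103_ghz = 1.86, 2.5     # rated boost clocks in GHz

sm_ratio = ad103_sm / ga102_sm                  # ~0.90
clock_ratio = ad103_ghz / ga102_ghz             # ~1.34
combined = sm_ratio * clock_ratio               # ~1.22, ignoring bandwidth

print(f"SM ratio {sm_ratio:.2f} x clock ratio {clock_ratio:.2f} = {combined:.2f}")
```

So if the cards land near a 1.22x uplift, essentially all of it is explained by clocks and unit count, with no per-SM IPC gain showing.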
 
  • Like
Reactions: Kaluan

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I feel strange having to ask this question, but what are GPUs used for these days? Why would someone buy a 4090 or 4080? For gaming? Gaming how? Because they want ray tracing in 4K with high FPS? How many people can be bothered to care about chasing that crap? Honestly.
 

CP5670

Diamond Member
Jun 24, 2004
5,511
588
126
This is not that impressive compared to all the "leaks." Only the 4090 is a real step forward from current cards. It's cheaper than I was expecting though. I might get one but am not sure about the power and heat output.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
I feel strange having to ask this question, but what are GPUs used for these days? Why would someone buy a 4090 or 4080? For gaming? Gaming how? Because they want ray tracing in 4K with high FPS? How many people can be bothered to care about chasing that crap? Honestly.
Filling out PC cases with cool looking parts obviously
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
I feel strange having to ask this question, but what are GPUs used for these days? Why would someone buy a 4090 or 4080? For gaming? Gaming how? Because they want ray tracing in 4K with high FPS? How many people can be bothered to care about chasing that crap? Honestly.
Exactly. For ANYONE playing competitive online games, you don't go higher than 1080p + 240hz. I suppose dudes playing single-player games want to sit there and stare at 4k with maximum effects but yes - we are at the point of diminishing returns. You can scoop up a vanilla 3080 and be set for competitive games, where frame rates actually matter, for the next 4 years.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
I see they finally removed the symbolic NVLink connector, even on the 4090. @AdamK47 that's a real kick in the pants.....well no, not really. It's been dead for years. You grabbing one day one with me?
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
I feel strange having to ask this question, but what are GPUs used for these days? Why would someone buy a 4090 or 4080? For gaming? Gaming how? Because they want ray tracing in 4K with high FPS? How many people can be bothered to care about chasing that crap? Honestly.

Machine learning stuff, broadly, I think. Hence them building some of these chips up so big/expensive. Perhaps AR/VR at some future point?

You can definitely see why they started focusing on RT. Without it, even at 4K, there just wouldn't be a remotely sane reason to have the top end cards. With it, it's just about possible to believe things if you've got the money & don't care about the power draw.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Exactly. For ANYONE playing competitive online games, you don't go higher than 1080p + 240hz. I suppose dudes playing single-player games want to sit there and stare at 4k with maximum effects but yes - we are at the point of diminishing returns. You can scoop up a vanilla 3080 and be set for competitive games, where frame rates actually matter, for the next 4 years.

Yes very much. Outside of niche needs like high resolution VR headsets and flight sims, the whole thing is feeling pretty deflated.
 

jpiniero

Lifer
Oct 1, 2010
14,592
5,214
136

Of the AIB PRs that Videocardz has, this is the only one I saw that mentions pricing. Even then... Zotac has 3 different cards each, one of which says it's the MSRP and the others are presumably more than that.
 
  • Like
Reactions: Kaluan and RnR_au