Discussion: Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation

Page 39 — AnandTech community forums

amenx

Diamond Member
Dec 17, 2004
Jensen is just trolling people with these prices. NV is committed to billions of dollars in wafers from TSMC, and there is no way they are going to move that sort of volume at these prices. And this is coming during a recession, with falling PC and component demand, and none of the mining demand that spurred these prices in the first place.

Jensen is just testing the waters and hoping that gullible buyers will bite at the initial pricing. I think prices will fall over the coming months as they will have to adjust to their supply-demand situation.
 

eek2121

Platinum Member
Aug 2, 2005
Exactly. For ANYONE playing competitive online games, you don't go higher than 1080p + 240 Hz. I suppose dudes playing single-player games want to sit there and stare at 4K with maximum effects, but yes - we are at the point of diminishing returns. You can scoop up a vanilla 3080 and be set for competitive games, where frame rates actually matter, for the next 4 years.

I don't know of a single gamer, competitive or otherwise, who still runs 1080p. Most run 1440p or some form of ultrawide monitor. I even know a pro Dota player, and he runs 1440p. Most "competitive" games have been playable at high framerates at resolutions up to 4K since the 1080 Ti.

Outside of that, this whole launch was “meh”. Sticking with my 3090 and waiting for AMD’s announcement.
 

maddie

Diamond Member
Jul 18, 2010
Your logic makes sense. We'll just have to wait for a teardown to confirm. Speaking of the die shot, I didn't see what should be an obvious 96 MB of L2 cache on the die shot Nvidia provided. I know it's a render, but even the render didn't show an obvious bank of cache.
It's L2, so it's much more distributed throughout the die.
 

maddie

Diamond Member
Jul 18, 2010
I feel strange having to ask this question, but what are GPUs used for these days? Why would someone buy a 4090 or 4080? For gaming? Gaming how? Because they want ray tracing in 4K with high FPS? How many people can be bothered to care about chasing that crap? Honestly.
Consider this proof that you're an old gramps now.
 


Saylick

Diamond Member
Sep 10, 2012
It's L2, so it's much more distributed throughout the die.
Mmmmm, is it though? For H100, the split-L2 is truly split into two banks. I'd expect the 96 MB of L2 for AD102, if it exists, to be clumped together in that middle band of the die.
[attached image: H100 die shot]
 

Heartbreaker

Diamond Member
Apr 3, 2006
Jensen is just trolling people with these prices. NV is committed to billions of dollars in wafers from TSMC, and there is no way they are going to move that sort of volume at these prices. And this is coming during a recession, with falling PC and component demand, and none of the mining demand that spurred these prices in the first place.

I have long since given up believing that people won't pay exorbitant pricing for GPUs. It wasn't just miners buying insanely priced GPUs during the last couple of years. Gamers were paying those prices as well.
 

AtenRa

Lifer
Feb 2, 2009
I have long since given up believing that people won't pay exorbitant pricing for GPUs. It wasn't just miners buying insanely priced GPUs during the last couple of years. Gamers were paying those prices as well.

Because they were expecting to make up the extra cost from mining.
Let's see how many will pay those prices for the new RTX 4000 today without the mining.
 

n0x1ous

Platinum Member
Sep 9, 2010
I have long since given up believing that people won't pay exorbitant pricing for GPUs. It wasn't just miners buying insanely priced GPUs during the last couple of years. Gamers were paying those prices as well.
correct
 


biostud

Lifer
Feb 27, 2003
I have long since given up believing that people won't pay exorbitant pricing for GPUs. It wasn't just miners buying insanely priced GPUs during the last couple of years. Gamers were paying those prices as well.
Because there was a severe supply issue?
 

DiogoDX

Senior member
Oct 11, 2012
If you think the price is high, just look at Brazil's recommended prices:

Today: 1 USD = 5.15 BRL

4090: 15,000 BRL ~ 2,900 USD
4080: 11,000 BRL ~ 2,100 USD
4080 12GB: 8,200 BRL ~ 1,590 USD
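Those conversions check out at the quoted rate. A quick sketch to reproduce them (the 5.15 BRL/USD rate and the BRL prices are the ones quoted above; the post rounds its USD figures more loosely):

```python
# Reproduce the BRL -> USD conversions at the exchange rate quoted above.
BRL_PER_USD = 5.15  # rate given in the post

prices_brl = {
    "4090": 15000,
    "4080": 11000,
    "4080 12GB": 8200,
}

for card, brl in prices_brl.items():
    usd = brl / BRL_PER_USD
    # Exact conversion; the post rounds these to the nearest ~100 USD
    print(f"{card}: {brl} BRL ~ {usd:.0f} USD")
```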
 

HurleyBird

Platinum Member
Apr 22, 2003
My brain is having trouble digesting the 40 series performance numbers. The 4090 numbers look about where they should be given the specs if you ignore DLSS 3, but if you assume DLSS 3 frame insertion is granting about ~1.5x performance (taking the Cyberpunk demo and rounding down a bit to be conservative), then the 4090 looks only around ~33% faster than the 3090 Ti (I think PCWorld's 25% napkin math is probably too pessimistic). The scaling from 4070 -> 4080 -> 4090 feels very underwhelming given the large spec differences. Other slides make it look like the raster improvement could be >50%, but they don't mention which games use frame insertion, only that it is applied "when applicable." It's clear there isn't a 2x improvement here. Probably somewhere in the 1.33x-1.66x range, given the large marketing-BS-induced error bars.

Honestly, minus the memory capacity limitation and plus some new features, to me Ada feels like it could be Nvidia's Fury X moment. Fury X looked around twice as fast as the 290X on paper, but in reality it was closer to 30%. I wonder if Nvidia, like AMD back then, has hit an architectural wall.
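The napkin math above can be made explicit: divide an observed DLSS 3 benchmark figure by the assumed ~1.5x frame-insertion factor to get the implied raster-only uplift. A minimal sketch (the function name and the 2.0x combined figure are mine, for illustration; the ~1.5x frame-gen estimate is the post's):

```python
def implied_raster_uplift(combined_speedup: float, frame_gen_factor: float) -> float:
    """Estimate the raster-only speedup by dividing out the
    assumed DLSS 3 frame-insertion contribution."""
    return combined_speedup / frame_gen_factor

# A hypothetical 2.0x combined result with a ~1.5x frame-gen factor
# implies only a ~1.33x raster improvement — the low end of the
# 1.33x-1.66x range suggested above.
print(f"{implied_raster_uplift(2.0, 1.5):.2f}x")  # 1.33x
```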
 

blckgrffn

Diamond Member
May 1, 2003
Having recently bought a "fun" car (I need to update that thread) honestly... ¯\_(ツ)_/¯ on these prices.

If PC gaming is a top 3/4 hobby of yours then even a $1,500 GPU (especially if you bother to sell your old GPU for anything) is just the price of having the best way to experience your hobby now.

I used to think that was silly to get so little "extra" for so much money. Now I can appreciate the mindset of buying on top and staying there, because why not?

If you've got some disposable income and want to make this a priority, well then by all means. It's still a relatively cheap hobby.

Finally, NVIDIA has to extract value for their shareholders. Having "successfully" moved top-tier MSRP to well above $1,500, I feel like they will only give it back after thoroughly testing it. From a merchandising standpoint it makes total sense, even if that top product is like 1% of their volume. It makes all their "value" plays below it appear to have much better utility for the buck, and therefore more attractive to shoppers who prize value.
 

Heartbreaker

Diamond Member
Apr 3, 2006
Because they were expecting to make up the extra cost from mining.
Let's see how many will pay those prices for the new RTX 4000 today without the mining.

No evidence at all that many gamers were mining. Most just wanted a card to play games with.
 

amenx

Diamond Member
Dec 17, 2004
I have long since given up believing that people won't pay exorbitant pricing for GPUs. It wasn't just miners buying insanely priced GPUs during the last couple of years. Gamers were paying those prices as well.
Of course there are gamers who will pay these sorts of prices, but not enough to move the sort of volume Nvidia is committed to over the coming year.
 

Heartbreaker

Diamond Member
Apr 3, 2006
Of course there are gamers who will pay these sorts of prices, but not enough to move the sort of volume Nvidia is committed to over the coming year.

There is simply no way to know that. I wouldn't be surprised if scalping starts up again and people start paying scalped prices on top, even with GPU mining completely dead.
 

blckgrffn

Diamond Member
May 1, 2003
Of course there are gamers who will pay these sorts of prices, but not enough to move the sort of volume Nvidia is committed to over the coming year.

We don't really know how much allocation is needed for consumer GPUs vs their HPC/cloud efforts though, do we?

And it's not like they couldn't easily find a partner who needs that allocation, if they're allowed to resell or release it.

I can think of several ways Nvidia could be fine with this, so long as they have healthy cash flow.
 

jpiniero

Lifer
Oct 1, 2010
Of course there are gamers who will pay these sorts of prices, but not enough to move the sort of volume Nvidia is committed to over the coming year.

There are going to be more products than these, of course. There's also a "Quadro" announced with 48 GB of memory, available in December.

 

Heartbreaker

Diamond Member
Apr 3, 2006
Other slides make it look like the raster improvement could be >50%, but they don't mention which games use frame insertion, only that it is applied "when applicable." It's clear there isn't a 2x improvement here. Probably somewhere in the 1.33x-1.66x range, given the large marketing-BS-induced error bars.

The first three games on the slide below are NOT on the DLSS 3 game list, so they can't be using frame interpolation to boost FPS:

4090 is about 1.5X 3090 Ti at rasterized performance without frame interpolation.

Almost certainly better than 1.5X with Ray Tracing.

CP 2077 is combining the RT gains with the DLSS 3 frame interpolation "gains".