Discussion: Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
Then drop the mic, not the 7900XT.

Oh what the heck, smash it on the stage and tell us you’ve literally got boatloads of them ready to go, we don’t need that one.

There's precedent for that too actually. See the short video below about "reviewing" the iPad (a/v out of sync unfortunately):

 

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
LOL. C'mon Lisa Su. All you gotta do is just walk up on stage, hold up the 7900XT, and say the magical words, "Nine ninety nine".

You say that jokingly (maybe), but damn wouldn't that be something if she did that? That would win over some minds for sure.
 

Saylick

Diamond Member
Sep 10, 2012
3,162
6,387
136
You say that jokingly (maybe), but damn wouldn't that be something if she did that? That would win over some minds for sure.
Assuming the 7900XT gets within 10% of the raster performance of the 4090 at 350W or less and costs $999, that's a day one purchase for me. Instantly. I don't even care that much about RT performance, which shouldn't even be an issue considering the 7900XT should have noticeably better RT performance than the 3090 Ti, but raster and power come first for me. I will definitely underclock it by 15% to reduce power by 40% so that I can get essentially 1.5x 3090 Ti performance at <250W.
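For the curious, here's the back-of-the-envelope version of that underclock math, as a rough sketch only: it assumes dynamic power scales roughly with V²·f, that voltage can be dropped roughly in step with clock, and that performance scales roughly with clock. None of that will hold exactly on a real card.

Code:
# Rough sketch only: assumes dynamic power ~ V^2 * f, that voltage can be
# lowered roughly in proportion to clock, and that performance scales
# roughly linearly with clock. Real silicon won't follow this exactly.
clock_scale = 0.85                      # 15% underclock
voltage_scale = 0.85                    # assumed to track the clock
power_scale = voltage_scale**2 * clock_scale

print(f"Relative power: {power_scale:.2f}")         # ~0.61, i.e. roughly 40% less power
print(f"350 W board power becomes ~{350 * power_scale:.0f} W")
print(f"Relative performance: ~{clock_scale:.2f}")  # ~85% of stock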
 
  • Like
Reactions: Tlh97 and SMU_Pony

GodisanAtheist

Diamond Member
Nov 16, 2006
6,815
7,171
136
Then drop the mic, not the 7900XT.

Oh what the heck, smash it on the stage and tell us you’ve literally got boatloads of them ready to go, we don’t need that one.

-Go all the way. Pull back a curtain. 7900xt running a game at 1000 Freedoms Per Second, "This for $999!" Crowd goes insane, she pulls the card from the system and holds it up for everyone to see, "In my hands is the next generation of graphics, the 7900xt," crowd screams louder, people at the front are crushed by the waves of humanity, butterfingers Lisa drops the card, fans screech, panicked in their confusion several dive to offer their bodies as cushioning to the falling card, it smashes on the stage, surprise, wood screws, it's not even real, AMD fans begin having convulsions and writhing on the floor as their entire beings are overwhelmed by peak meming from Pantsuit Su.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
God would I love to see another RX 480 kind of release. That was really something at $200.

I remember when those were hard to get. I got one as soon as I could for $239, the 8GB version. What a card. It aged better than the 1060 6GB. And it was faster in Doom and Battlefield, the two most intensive games I was playing at the time.

Every so often we used to get cards like these. Like the 8800GT. Legendary. Unfortunately those days seem to be over.
 
  • Like
Reactions: Tlh97 and ZGR

rommelrommel

Diamond Member
Dec 7, 2002
4,382
3,111
146
Just my thoughts, but the pricing seems really wack.

4080 12G is $899 for 7680 CUDA cores
4080 16G is $1199 for 9728 CUDA cores
4090 is $1599 for 16384 CUDA cores

That makes a 4080 16G 1.27 times the cores of a 4080 12G for 1.33 times the cost.

That makes a 4090 2.13 times the cores of a 4080 12G for 1.78 times the cost.

That makes a 4090 1.68 times the cores of a 4080 16G for 1.33 times the cost.

Plus more memory and a wider bus as you move up, albeit at slightly lower clocks.

How did the flagship card become much better value than the second-tier card? And arguably a reasonable value relative to the entry-level 40 series, for now. Assuming you're playing in that price range, but shit, a flagship AIB 4080 16G will be priced right on the heels of a 4090 FE.

For the performance, the 16G should just be the 4080 and be priced at $899. The 12G should be a 4070/4070 Ti priced at $599, and $1199 should be the eventual 4080 Ti.
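For anyone who wants to check the arithmetic, here's the same comparison as a quick script. It uses the announced core counts and launch prices quoted above and just prints the ratios, nothing more:

Code:
# Announced CUDA core counts and USD launch prices.
cards = {
    "4080 12G": (7680, 899),
    "4080 16G": (9728, 1199),
    "4090": (16384, 1599),
}

def compare(base, other):
    """Print the core-count and price ratios of `other` relative to `base`."""
    base_cores, base_price = cards[base]
    other_cores, other_price = cards[other]
    print(f"{other} vs {base}: {other_cores / base_cores:.2f}x cores "
          f"for {other_price / base_price:.2f}x the cost")

compare("4080 12G", "4080 16G")  # 1.27x cores for 1.33x the cost
compare("4080 12G", "4090")      # 2.13x cores for 1.78x the cost
compare("4080 16G", "4090")      # 1.68x cores for 1.33x the cost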
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,893
5,825
136
There's precedent for that too actually. See the short video below about "reviewing" the iPad (a/v out of sync unfortunately):


If I was rich I'd pay 30 people to wait in line at Best Buy to buy a 4090 at launch and then just smash them into the ground one at a time in front of the store.
 
  • Haha
Reactions: blckgrffn

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
My last two GPUs were legendary: an ATI Radeon 9700 Pro, followed by an 8800GT, which I'm still using!

I had both as well. Well, maybe it was a 9800 Pro; it was some time ago, so I'm not certain. The 8800GT I literally used until it started dying.
 
  • Like
Reactions: Leeea

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Yeah, I was looking over his replies on Twitter to see if he was just giving the Nvidia marketing spiel or if he truly believes what he says, and it seemed to lean towards the former. Too much use of marketing terms for me to think otherwise.

Regardless, "illusion of higher fps" is an apt way to put it. All I know is this, moving forward: Not all fps is created equal. Will there be scenarios where it is preferable to have 40 fps without DLSS 3.0 over 80 fps with DLSS 3.0 at the same input lag? The selling point, if done right, is that there would be zero cost to enabling DLSS 3.0. We'll have to see in-depth testing to find out...
The downside of DLSS is decreased quality. All the testing has been with DLSS Performance mode, which is basically 1080p upscaled. Plus, I don't see how an inserted frame with no CPU-generated game data attached to it could avoid being perceived as lag if it occurs during a button press. I don't think this technology will be desirable in all cases, especially if it interprets the frame incorrectly for some reason.

An Nvidia employee on Reddit said that DLSS 3 does benefit the 3000 series cards, but that they cannot use all of the combined features together. Any benefits to scaling quality or performance in that area will be seen by 3000 series owners, just none of the frame generation stuff.

I for one find it pretty telling, though, that all the testing I've seen has been using DLSS Performance with frame generation, and that the Cyberpunk testing was done with a beta build that uses some special overloaded ray tracing setting. I can totally see myself switching from Psycho to this new ray tracing setting and seeing zero difference except the frame rate hit. From my view, this is all done to brush aside the rasterization performance increase over the previous gen, which I think is probably going to be smaller than they want to admit. Will wait for independent testing, of course.
 
Last edited:
  • Like
Reactions: Saylick

Frenetic Pony

Senior member
May 1, 2012
218
179
116
I can go buy a 3080 Ti, about the same performance as their "4080 12GB" according to the tiny handful of titles they've even published "graphs" for, sans resolution, for $640 on eBay. That's $260 off the 4080 12GB's price.

Damn, what tough choice.
 
  • Like
Reactions: Aapje and NTMBK

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,268
136
Plus, I don't see how an inserted frame with no CPU-generated game data attached to it could avoid being perceived as lag if it occurs during a button press. I don't think this technology will be desirable in all cases, especially if it interprets the frame incorrectly for some reason.

The following is my educated guess about how latency is impacted:

My first assumption is that every other frame is virtual, i.e. it is not possible to have two base frames in a row, or two virtual frames in a row. If this assumption is not correct, then the manner in which virtual frames are inserted would determine the impact on latency.

My second assumption is that the mechanism of adding a virtual frame in and of itself does not increase latency since, unless Nvidia engineers are lying on twitter, virtual frames are not interpolated but are rather forward projected.

Virtual frames do not help with (input) latency of course. They are totally disconnected from the game update loop. If every other frame is virtual, the engine is effectively running at half rate.

Based on the fact that frame insertion does not yield a 100% speedup, it stands to reason that, at least if my first assumption is correct, there is a performance hit to "base framerate," which would in turn be the main driver of added latency.

My question is what happens when you have a capped frame rate. If I run a game with a 60 FPS cap and have enough power to hit 60 FPS without frame insertion, will DLSS 3 insert new frames anyway? In other words, am I stuck with a "base framerate" of only 30 FPS? On a laptop or power constrained environment, this might be preferable behavior, but not on a desktop.
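To put rough numbers on that concern (purely under my assumptions above, with a made-up helper function, not anything from Nvidia's documentation): if input is only consumed on base frames, the gap between input samples is set by the base frame time, not the displayed frame rate.

Code:
# Sketch of the framerate-cap question, under my assumption that every other
# displayed frame is virtual and input is only sampled on base frames.
def input_sample_interval_ms(displayed_fps, frame_generation):
    """Milliseconds between frames that actually consume new input."""
    base_fps = displayed_fps / 2 if frame_generation else displayed_fps
    return 1000.0 / base_fps

# 60 fps cap, card fast enough to hit it natively:
print(f"{input_sample_interval_ms(60, frame_generation=False):.1f} ms")  # 16.7 ms
# Same 60 fps cap, but DLSS 3 fills every other frame:
print(f"{input_sample_interval_ms(60, frame_generation=True):.1f} ms")   # 33.3 ms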

Ideally DLSS 3 would have a "prefer latency" setting and a "prefer efficiency" setting for how to behave when running up against a framerate cap.

All in all, I wouldn't be surprised if DLSS 3 ended up a lot like DLSS 1 in being a great idea that just needed more time in the oven. Of course it's possible DLSS 3 might be great from the get-go, but if it isn't, I'm sure Nvidia will have it figured out for DLSS 4. Virtual frame insertion is the future, obviously, and in time all vendors will have their own solutions.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
Buy the 4090, have to upgrade PSU to 1,200+ watts!

Yeah, not only are these cards way, way overpriced, they also use too much power. I have a 700W PSU, so only the 4070 would work, and by that I mean the 4080 12GB, but yeah, it's a 4070. On top of that, in typical NV fashion, they are again anemic in terms of memory.

And here we all thought the Samsung process was garbage. I expected a huge performance/watt jump for NV, but it doesn't look like it at all. It is also funny that NV quotes the graphics card power but tells you to buy double that in PSU capacity, which suggests there will again be huge power spikes, likely into the 600W range for the 4090. Maybe they can once again be underclocked and undervolted with minimal performance loss?
 

biostud

Lifer
Feb 27, 2003
18,251
4,764
136
The 4090 holds nearly three times as many transistors as the 3090 but delivers a little more than double the performance in normal raster situations. Is that because most of the extra transistors have been used to beef up the RT and tensor cores?
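For reference, using the commonly cited transistor counts (figures as publicly quoted, not something I've verified), the ratio works out like this:

Code:
# Commonly cited transistor counts, in billions.
ad102_transistors = 76.3   # RTX 4090 (AD102)
ga102_transistors = 28.3   # RTX 3090 (GA102)

print(f"Transistor ratio: {ad102_transistors / ga102_transistors:.2f}x")  # ~2.70x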
 
  • Like
Reactions: Leeea

biostud

Lifer
Feb 27, 2003
18,251
4,764
136
R81Z3N1 said:
I agree, I run a PCIe 4 card in a Gen 3 motherboard. But if you're spending that kind of money, you would expect the best. Not to mention new specs like DisplayPort 2.0. Looks to me like the safe money is to wait.

Pick up a 30 series at a super discount, use FSR to upscale if needed, and wait it out.

But we all know that if you spend that kind of money, in two years you are going to buy the 5090 for $2000 as well, with DP 2.0 and PCIe 5.0 :p
 

marcUK2

Member
Sep 23, 2019
74
39
61
The price won't hold. There is no Ethereum demand this time around. There is a horrible recession coming. RDNA3 will hopefully be competitive and on a (possibly) cheaper node. Putin is mobilizing 300k and a huge war is brewing.
The only positive I can see is that some fools will think scalping will be a thing this time around, and they will get scalped.
I'm expecting huge discounts after the 2023 new year when few are buying.
I will buy a 4090 when it's <$1000.
 

marcUK2

Member
Sep 23, 2019
74
39
61
I also think that if you read the small print of Nvidia's slides, the up-to-4x performance claim, and the more general 2x claim, are only achieved when using DLSS 3, which I don't see enabled on the comparison card. It will be interesting to see an apples-to-apples comparison, because it seems to me that tripling the transistor count hasn't brought a huge benefit to performance, relatively speaking.
 
  • Like
Reactions: Ranulf

Ranulf

Platinum Member
Jul 18, 2001
2,350
1,172
136
I also think that if you read the small print of Nvidia's slides, the up-to-4x performance claim, and the more general 2x claim, are only achieved when using DLSS 3, which I don't see enabled on the comparison card. It will be interesting to see an apples-to-apples comparison, because it seems to me that tripling the transistor count hasn't brought a huge benefit to performance, relatively speaking.

The slides and the DLSS 3 video are typical marketing silliness. Who runs RT without DLSS? Someone at 1080p with a 2080 Ti, maybe a 3070 or better? Why not show DLSS 2.0 on a 3080 vs 3.0 on the 40 series card? And why does the lower left side say RTX is off, but under the 21 fps it says RT is on?
 
  • Like
Reactions: Tlh97 and Leeea

gorobei

Diamond Member
Jan 7, 2007
3,669
997
136
Have we seen die size estimates yet, or can we make assumptions from CUDA core counts?

This feels like another naming masquerade where a small die pretends to be a cut-down big die, like the 3070, only worse since it is pretending to be the xx80.
 
  • Like
Reactions: Tlh97 and Leeea