
Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
DLSS 3 looks very impressive tbh - complete interpolation from 2 key frames. Because it's completely out of the rendering pipeline, it will just double the frame rate no matter what the bottleneck is, which you can see from things like MS Flight Simulator, which is CPU-bound. I'm waiting to see how good the quality is, but assuming it's decent, it's an amazing step forward.

IMO, it's more a fake FPS boost, for marketing purposes, than an actual benefit for gaming purposes.

If NVidia succeeds in getting reviewers to simply start quoting DLSS 3 frame rates like they were real, it's the (slimy) marketing coup of the decade.

I can't wait to see the Nvidia Reviewer's Guide and its suggestion that sites benchmark on an i3-10100, because 4C is typical of the average rig according to Steam. Those DLSS3 numbers will really pop.
 

jpiniero

Lifer
Oct 1, 2010
16,840
7,284
136
Gamers have been struggling to find an affordable card for the past two years; they're tired and cranky waiting for a viable upgrade,

They can though. It's called the 30 Series.

Note that I doubt the 4080's pricing is designed to push people toward buying the 30 series in the meantime. But I'm sure they wouldn't mind if it does.
 
  • Like
Reactions: Leeea

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
What are we expecting in real-world, non-DLSS, non-RT performance? I'm guessing 1.4x a 3090 Ti for the 4090? Nvidia's numbers are almost always inflated. The prices, to me, speak to nvidia being in a bad spot right now with their current financials: stuck holding the bag on a ton of 30 series stock still out there and wanting to keep the mining-craze gravy-train prices going, without the mining craze. Not surprising, as it's nvidia.

I'll probably get a 4080 16GB regardless, but I want at least a 40% performance uplift over my 3080 in the large majority of my 4K gaming, which is non-DLSS, non-RT, standard raster gaming.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I'll probably get a 4080 16GB regardless, but I want at least a 40% performance uplift over my 3080 in the large majority of my 4K gaming, which is non-DLSS, non-RT, standard raster gaming.

Don't hold your breath. nVidia is being cagey about rasterization performance. And the 4080 16GB is significantly more expensive than your 3080 was.
 
  • Like
Reactions: Tlh97 and ozzy702

fleshconsumed

Diamond Member
Feb 21, 2002
6,486
2,363
136
Freaking Ethereum. It showed Nvidia they could hit us with huge price hikes in Turing and now Lovelace. I never thought I'd see a worse card than the 2080, but JFC at the 4080 12GB.
Except that's not true. With ETH going PoS, 30 series sales are floundering. Turing didn't exactly sell very well, and Ampere's real-world pricing was largely due to mining. I don't see Ada selling very well either, not in the current market conditions without mining demand.

The "mindshare" is ridiculous considering a 3050 costs the same or more than an RX 6600 that will beat the snot out of it. That is the problem.
True, but frankly I was a bit astonished by the blowback in the past 24 hours; it's pretty much universal. I think nvidia is playing a dangerous game right now. You have to remember that a lot of nvidia gamers have been sitting on the sidelines for the past 6 years waiting for a good replacement for their 10 series. The 10 series provided good value for the money at release. Turing was faster, but it was also proportionately more expensive, so a lot of nvidia gamers decided to wait for Ampere. The 30 series was supposed to bring performance/$ back to sanity, and it did - the 30 series was priced right - but then the mining boom happened and you couldn't buy a 30 series card unless you were willing to pay 2-3 times MSRP.

Again, there are a lot of nvidia gamers who have been waiting on the sidelines for up to 6 years for a good upgrade; they're tired and frustrated, and Ada pricing is yet another blow to them. Clearly nvidia is convinced it has the clout to keep punching people in the nuts. Time will tell, but I think nvidia may be overplaying their hand more than they realize. There are a lot of very upset nvidia gamers right now; at this point there is a long history of anger built up over the past 6 years, and Ada pricing is doing nothing to defuse it.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,347
9,730
136
I think, like all proprietary NV tech, DLSS 3 has its place. It will be absolutely GREAT for any game that isn't "twitchy", like MS Flight Sim and some racing games, and basically any TBS game. Slow, panning scenes will give the AI plenty of data to interpolate frames cleanly, and the end user won't really feel any difference in input because inputs tend to be more gradual.

The tech will struggle the most in FPS games (especially competitive ones), 3rd-person brawlers and hack-n-slash games where timing is everything, and RTS games and the like.

I suspect after some additional refinements the gap will narrow a bit, but will still be there.

My opinion on DLSS/FSR/etc remains unchanged: A great way to wring more life out of an old card, but a damn shame if you have to use them on your brand new $1600 purchase.

Edit: RE AMD: AMD isn't going to save anyone from heinous pricing. I think AMD knows they cannot command the outrageous premium the 3090 commands, but it's not like the 7xxx series doesn't have its own expenses attached to it (complex packaging, for example). Lisa Su has also discovered the time-honored secret to running a successful business: YOU NEED TO MAKE MONEY, MORE SPECIFICALLY PROFITS.

I suspect AMD will price one tier down from NV but beat them at each tier: $1200 AMD will outperform the 4080/16 handily (and possibly scrape with the 4090), $800 AMD will beat the 4080/12, etc. AMD is also in a better position in terms of overstock simply thanks to producing far fewer GPUs than NV (and those GPUs likely having much better margins on account of their smaller die sizes and less exotic memory). If AMD really wants to crap on NV's parade, releasing N33 at any price lower than $900 will potentially complicate things for NV and its mountain of 3xxx cards.
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
16,840
7,284
136
I don't necessarily trust that TPU has the right numbers... but according to them the 4090 has:

2x the FP32 performance of the 3090 Ti
2.3x the pixel fill rate
2.06x the texel fill rate

And it's only 60% faster in raster games? That's what I mean about memory bandwidth being that big of an issue, since the bandwidth is the same as the 3090 Ti's.

Edit: the 4080 16 GB is however only:
23% faster FP32
15% more pixel fill rate
21% more texel fill rate

Again there's no point in going crazy with clock speed if the memory bandwidth isn't there.
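A quick back-of-envelope on that mismatch, as a sketch in Python. It uses only the ratios quoted above (TPU's theoretical numbers and the ~60% raster figure); the "fraction realized" framing is just one way of showing how much of the on-paper gain survives when bandwidth stays flat:

# Sketch: how much of the 4090's on-paper gain over the 3090 Ti (per the
# TPU-sourced ratios quoted above) shows up in the ~60% raster uplift?
# Memory bandwidth is essentially unchanged between the two cards, which
# is the suspected reason the realized fraction falls short of 100%.

theoretical_ratio = {
    "FP32 throughput": 2.00,
    "pixel fill rate": 2.30,
    "texel fill rate": 2.06,
}
observed_raster_ratio = 1.60  # "only 60% faster in raster games"

for metric, ratio in theoretical_ratio.items():
    realized = observed_raster_ratio / ratio
    print(f"{metric:16s}: {realized:.0%} of the theoretical gain realized")

On those numbers, raster games realize only 70-80% of the theoretical gain, which is consistent with a bandwidth wall.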
 
Last edited:

dlerious

Platinum Member
Mar 4, 2004
2,122
934
136
I'm willing to bet right now that they will price-match NVIDIA's Ada, and we will see the lower-end SKUs only by next summer. :p:D
I have a feeling you're probably right. They have a chance to pick up market share and mindshare if they release a 7800 XT that beats the 3090 Ti at $700 - that would also hurt Nvidia's ability to move their oversupply of 30 series cards at current prices. Their AIB partners might not like that, though.
 

Saylick

Diamond Member
Sep 10, 2012
4,060
9,485
136
I don't necessarily trust that TPU has the right numbers... but according to them the 4090 has:

2x the FP32 performance of the 3090 Ti
2.3x the pixel fill rate
2.06x the texel fill rate

And it's only 60% faster in raster games? That's what I mean about memory bandwidth being that big of an issue, since the bandwidth is the same as the 3090 Ti's.

Edit: the 4080 16 GB is however only:
23% faster FP32
15% more pixel fill rate
21% more texel fill rate

Again there's no point in going crazy with clock speed if the memory bandwidth isn't there.
Yeah, it's really strange that the scaling isn't there, which debunks all of the Ada Lovelace Twitter leakers. FWIW, those leaks were based on TSE scores, which in theory should reflect gaming workloads, but perhaps there's a disconnect this time around. We'll just have to wait for proper reviews to know at this point. I won't discount Lovelace just yet...

Edit: Speaking of memory bandwidth, I've been trying to see if there's any evidence that Nvidia implemented the large 96 MB L2 cache, and there's little to no hard evidence that it exists besides what popped up in the Nvidia leak earlier this year. There's a virtual debrief that Nvidia hosted for Lovelace, and even during that meeting they would not confirm the existence of a much larger L2. It's strange. You'd think that if it exists, they'd brag about it.
 
Last edited:

dlerious

Platinum Member
Mar 4, 2004
2,122
934
136
Yes very much. Outside of niche needs like high resolution VR headsets and flight sims, the whole thing is feeling pretty deflated.
I play mostly single-player Strategy/Adventure/RPG games, so I lean more toward resolution than FPS. Maybe I'm in a niche of a niche.
 

maddie

Diamond Member
Jul 18, 2010
5,157
5,545
136
That's the problem. If you were just passively watching 60 FPS or 120 FPS, you most likely couldn't tell much difference, and you really wouldn't care about any minor difference you struggled to see.

Where the difference really matters is in the feel and the reactivity to your inputs. When you start flicking your gun around to a target, 120 FPS feels vastly faster and better able to track your movements.

DLSS 3 frame fakery might superficially look like 120 FPS if you were just passively watching, which is where it doesn't even matter, but it will utterly fail to feel like 120 FPS, which is exactly where it does matter. DLSS 3 120 FPS will still feel like 60 FPS.

IMO, it's more a fake FPS boost, for marketing purposes, than an actual benefit for gaming purposes.

If NVidia succeeds in getting reviewers to simply start quoting DLSS 3 frame rates like they were real, it's the (slimy) marketing coup of the decade.
A little more thinking about the implications of a projected frame, if correct, predicts what we will see. After all, the proof of a good explanation is in its ability to, well, predict.

This might be akin to speculative execution, where you take a path that pays off spectacularly when correct but adds a penalty when wrong. In this case, every time you change your view or position in a manner that breaks from your immediately previous movement, it will cause a glitch. In fact, I think it will amplify the change: you will have a projected artificial frame further along the path you were moving before your change in position/view, and when this corrects to your new values, you should perceive a bigger-than-normal shift. Shooters should be very interesting to watch.
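A toy model of that projection idea, sketched in Python. It assumes the generated frame linearly extrapolates the last observed motion; note that Nvidia describes DLSS 3 as interpolating between two already-rendered frames, so this models the post's speculation, not a confirmed implementation detail:

# Toy model of the "projected frame" hypothesis: each generated frame
# continues the last observed motion half a frame forward. On a sudden
# reversal the projection overshoots, and the next real frame snaps back,
# producing the bigger-than-normal shift described above.

real_positions = [0, 10, 20, 30, 20, 10]  # camera x per rendered frame; reversal after 30

displayed = []
for i, pos in enumerate(real_positions):
    displayed.append(("real", pos))
    if 1 <= i < len(real_positions) - 1:
        velocity = pos - real_positions[i - 1]               # last observed motion
        displayed.append(("generated", pos + velocity / 2))  # project forward

for kind, pos in displayed:
    print(f"{kind:9s} x = {pos:5.1f}")
# Around the reversal the viewer sees ...30.0, 35.0 (overshoot), 20.0 --
# a 15-unit snap where the real motion only changed by 10.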
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,958
7,667
136
Except that's not true. With ETH going PoS, 30 series sales are floundering. Turing didn't exactly sell very well, and Ampere's real-world pricing was largely due to mining. I don't see Ada selling very well either, not in the current market conditions without mining demand.

Sure it is. Nvidia found out gamers would still pay exorbitant miner-inflated prices after the Ether boom of 2018, and that's how we got the $1200 2080 Ti and the $800 2080 that was no better than the 1080 Ti it was supposed to replace.
 

exquisitechar

Senior member
Apr 18, 2017
723
1,022
136
Sometimes pictures speak louder than words...
[three attached screenshots]

*edit*
Disclaimer: don't know if real or not... some people say it's fake
*edit2*
Not fake, found the Digital Foundry shill video here
Currently, I see little value in DLSS 3. It's an interesting piece of tech, but I can only imagine it making sense on the low-end Ada cards. The 4090 already has excellent performance in all but the most performance-intensive games with RT maxed out, and you can use DLSS 2.x in that case. DLSS 3 gives you more frames, but does anyone want to make such compromises in image quality with a $1600 GPU? Maybe I'm wrong, but it looks like a clear step down from DLSS 2.x's image quality, and that's not exactly surprising.
 
  • Like
Reactions: scineram

Revolution 11

Senior member
Jun 2, 2011
952
79
91
The "mindshare" is ridiculous considering a 3050 costs the same or more than an RX 6600 that will beat the snot out of it. That is the problem.
Why are you complaining? That just means the RX 6600's price is lower than it should be; it is a good deal for those in the know.
 

beginner99

Diamond Member
Jun 2, 2009
5,318
1,763
136
I still don't get why people think AMD will go cheap.

I didn't say cheap, just cheaper, and as others have said there is huge room to be cheaper than NV while not being cheap.
The 4090 is $1600. If the 7900 XT matches it and AMD sells it for $1200, it is not cheap but still several hundred dollars cheaper, which means NV's AIBs would have to take a loss to match it.

I agree. AMD has no reason to go too cheap, as they probably could not satisfy the demand if they undercut NV, especially at the 4070, er, 4080 12GB level. AMD makes more profit selling CPUs, which use the same wafers.

But again, with NV's pricing, AMD can undercut them significantly and still be "not cheap".
 
  • Like
Reactions: Tlh97

maddie

Diamond Member
Jul 18, 2010
5,157
5,545
136
Sure it is. Nvidia found out gamers would still pay exorbitant miner-inflated prices after the Ether boom of 2018, and that's how we got the $1200 2080 Ti and the $800 2080 that was no better than the 1080 Ti it was supposed to replace.
This I find misleading. Sure, some gamers paid high prices, but I suggest even more didn't. Why do we have so many waiting for prices to drop? If Nvidia or AMD think they can sell the normal volume of cards at inflated prices, they are delusional.

This happened even before we entered this period of economic distress.

As regards the claim that AMD won't charge much less even if they can, the simple answer is TAM. They have ~20% market share, which is resistant to expansion in a closely priced market. How do you change mindshare? Excessive value for money, which is exactly what they did with the original Zen. In this case it's even harder, as CPUs don't have as many unique features as GPUs do. What held them back was limited production capacity, which has now been resolved. We'll see by H2 2023.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
How many frames can DLSS 3.0 insert from old data? If it can just keep generating frames without waiting for the CPU at all, it will be funny seeing 1080p benchmarks with DLSS 3.0 on something that's easy to render but strongly CPU-bottlenecked; you could see some stupidly high frame rates from tons of inserted frames.
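The arithmetic on that scenario, as a sketch. DLSS 3 as announced inserts one generated frame per rendered frame (k = 1); the higher k values are hypothetical, just to show how inflated a fully CPU-bound benchmark could look if more frames were inserted:

# If the CPU caps the game at `rendered_fps` and the GPU inserts `k`
# generated frames per real frame without touching the CPU, the displayed
# rate scales as (1 + k) regardless of the CPU bottleneck.

def displayed_fps(rendered_fps: float, inserted_per_real: int) -> float:
    return rendered_fps * (1 + inserted_per_real)

for k in (1, 2, 3):
    print(f"CPU-bound 60 FPS, k={k}: {displayed_fps(60, k):.0f} FPS displayed")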
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
Hopefully, with the new packaging for RDNA3, they will disrupt the market going forward in a similar way to Ryzen, and we can see some positive change for users.
 

Thunder 57

Diamond Member
Aug 19, 2007
4,080
6,807
136
Why are you complaining? That just means the RX 6600's price is lower than it should be; it is a good deal for those in the know.

The problem is that with NVIDIA "mindshare", people say ATI/AMD sucks and you must buy NVIDIA. Even the RTX 2060 is a better value than the 3050, yet the 3050 still sells at higher prices.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Surely you mean that the 3050 is more expensive than it should be.
Well yes, it is more expensive than it should be, but the converse is that the 6600 is cheaper than it should be based strictly on performance level in the current market.

If the fan mindshare favors one company just because of the brand, I will gladly forgo paying the brand premium and get more hardware for less money. Zen 1, Zen+, and Zen 2 were great deals because the fanbase had not fully realized the power/value of AMD CPUs and was putting more of a brand premium on Intel, from its past achievements, than Intel had actually earned at the time.


The problem is that with NVIDIA "mindshare", people say ATI/AMD sucks and you must buy NVIDIA. Even the RTX 2060 is a better value than the 3050, yet the 3050 still sells at higher prices.
Mindshare, to me, can either rest on a solid foundation of fact or be hollow and based on past reputation. If Nvidia's mindshare is justified, then there is no issue for anyone, right? Everyone is paying more for a superior product.

If Nvidia mindshare is not justified or is too positive for the actual reality, just buy AMD products then and benefit from other people's foolish fanboyism.

I don't get the problems around mindshare as an actual consumer. Unless people are complaining as AMD or Nvidia shareholders....
 
Last edited:

Thunder 57

Diamond Member
Aug 19, 2007
4,080
6,807
136
Well yes, it is more expensive than it should be, but the converse is that the 6600 is cheaper than it should be based strictly on performance level in the current market.

If the fan mindshare favors one company just because of the brand, I will gladly forgo paying the brand premium and get more hardware for less money. Zen 1, Zen+, and Zen 2 were great deals because the fanbase had not fully realized the power/value of AMD CPUs and was putting more of a brand premium on Intel, from its past achievements, than Intel had actually earned at the time.

One of the simplest things AMD could have done is require that the Xbox/PS carry a sticker that said something like "Powered by AMD" along with their logo.
 

biostud

Lifer
Feb 27, 2003
19,934
7,039
136
I think, like all proprietary NV tech, DLSS 3 has its place. It will be absolutely GREAT for any game that isn't "twitchy", like MS Flight Sim and some racing games, and basically any TBS game. Slow, panning scenes will give the AI plenty of data to interpolate frames cleanly, and the end user won't really feel any difference in input because inputs tend to be more gradual.

Because slow games really benefit from those high fps, while fast-paced games don't really need them :p
 

Mopetar

Diamond Member
Jan 31, 2011
8,496
7,753
136
The problem is that with NVIDIA "mindshare", people say ATI/AMD sucks and you must buy NVIDIA. Even the RTX 2060 is a better value than the 3050, yet the 3050 still sells at higher prices.

Hard to argue that when AMD performance is basically identical or better in some cases. If AMD "sucks" but still comes out on top, what does that say about NVidia? There aren't too many people who can handle that much cognitive dissonance.

At the end of the day it's their money, though, and if their ignoring a brand means more supply or cheaper cards for me, I'm not going to argue, especially after the last few years.