Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


maddie

Diamond Member
Jul 18, 2010
4,722
4,624
136
My apologies if this was already posted (it's new product season, so AT Forums is more active), but looks like Nvidia gave some more info regarding DLSS 3.0:

View attachment 67963

Latency sees a reduction going from what Nvidia calls Native to DLSS 3.0, but who knows if that's just from not enabling Reflex for Native and turning it on for DLSS 3.0 to make it look like you get a reduction. Also, where is DLSS 2 in this comparison? Those of us with Ampere won't be able to take advantage of DLSS 3, so I'd like to know how much of the benefit comes from DLSS 3.0 alone, specifically from the Frame Generation feature. If the DLSS 2 results look exactly like the DLSS 3 results except that the frame rates are simply half, then we all know what's going on: the higher frame rate comes from inserting those additional "fake" frames.

Obviously DLSS is rendering at a lower resolution, so of course latency will be lower.
If you do the numbers, the native 1440p frame rate works out to ~70% of the real rendered rate under DLSS 3 (170/2).

59.9 fps (native 1440p) ≈ 0.7 x 170/2 fps (rendered at 1080p), and that matches the reduction in latency (75.4 ms vs 53.5 ms).
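
Quick Python check of that arithmetic, using the numbers as I read them off the slide (so treat the exact figures as my assumptions):

```python
# Rough sanity check of the slide numbers as I read them (assumptions, not gospel):
# native 1440p = 59.9 fps @ 75.4 ms latency, DLSS 3 = 170 fps @ 53.5 ms latency.
native_fps = 59.9
dlss3_fps = 170.0
rendered_fps = dlss3_fps / 2   # half the displayed frames are generated -> ~85 real fps

fps_ratio = native_fps / rendered_fps
print(f"native fps / real rendered fps: {fps_ratio:.2f}")       # ~0.70

native_latency_ms = 75.4
dlss3_latency_ms = 53.5
latency_ratio = dlss3_latency_ms / native_latency_ms
print(f"DLSS 3 latency / native latency: {latency_ratio:.2f}")  # ~0.71
```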

Some devious marketing here. You will get the same latency with DLSS 2, without the motion errors induced by sudden changes.

It's the perfect way to never lose the crown. Losing? Then let's show two frames for every real one, an automatic doubling of the fps number.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
My apologies if this was already posted (it's new product season, so AT Forums is more active), but looks like Nvidia gave some more info regarding DLSS 3.0:

View attachment 67963

Latency sees a reduction going from what Nvidia calls Native to DLSS 3.0, but who knows if that's just from not enabling Reflex for Native and turning it on for DLSS 3.0 to make it look like you get a reduction. Also, where is DLSS 2 in this comparison? Those of us with Ampere won't be able to take advantage of DLSS 3, so I'd like to know how much of the benefit comes from DLSS 3.0 alone, specifically from the Frame Generation feature. If the DLSS 2 results look exactly like the DLSS 3 results except that the frame rates are simply half, then we all know what's going on: the higher frame rate comes from inserting those additional "fake" frames.

What a garbage comparison from NVidia. The DLSS 2 upscaling (the same tech that's in DLSS 3) will be responsible for the latency reduction, since it increases the actual frame rate by rendering at a lower res.

The DLSS 3 fake frame insertion can't improve latency, it can only make it worse. Bundling them together hides that.

I really hope reviewers universally condemn fake frame insertion, so there is less chance NVidia can mislead people with it.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,792
5,749
136
I know LTT typically tries to keep companies happy (for obvious reasons), but this really sounds like an apologist's video. Sure, he mentions how the 4080 12GB should probably have been a 4070, but then in several places he goes on about how we should be happy that the cards are as cheap as they are?!

That was really pathetic. Saying Nvidia isn't being greedy, and that EVGA leaving the business is proof margins are tight. Yeah, margins for AIBs sound tight. Not margins for Nvidia, with the ridiculous price inflation of their cards ever since Turing.
 

dlerious

Golden Member
Mar 4, 2004
1,769
717
136
That was really pathetic. Saying Nvidia isn't being greedy, and that EVGA leaving the business is proof margins are tight. Yeah, margins for AIBs sound tight. Not margins for Nvidia, with the ridiculous price inflation of their cards ever since Turing.
I wouldn't mind a little increase here and there, but it's like they're working their way down the stack, bumping each card to the tier above's pricing. First it was the 80 Ti models jumping to Titan prices, then the 80 models jumping to 80 Ti, and now the renamed 4070 jumping two tiers in price. What's next, a 4060 next year for $550?
 
  • Wow
  • Like
Reactions: Stuka87 and Leeea

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
I wouldn't mind a little increase here and there, but it's like they're working their way down the stack, bumping each card to the tier above's pricing. First it was the 80 Ti models jumping to Titan prices, then the 80 models jumping to 80 Ti, and now the renamed 4070 jumping two tiers in price. What's next, a 4060 next year for $550?

I think there's a good chance AD106 and AD107 are mobile only. For sure you won't see them for some time unless crypto skyrockets.
 
  • Like
Reactions: Leeea

HurleyBird

Platinum Member
Apr 22, 2003
2,670
1,250
136
I can see it now, "DLSS4.0, we take an 8k native path traced image and with the magic of AI, turn it into a blurry artifact ridden mess and render it at 1000fps!"

DLSS 4 will probably be DLSS 3 done right, just as DLSS 2 was DLSS 1 done right. Regardless of current deficiencies, virtual frame generation is the future and will be perfected eventually. It's an open question whether Ada even gets DLSS 4 support, though, given the precedent set by DLSS 3.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,743
734
136
Yes.
At the current prices it seems like the 4070 will be around $700-750, so a $550-600 4060 would be just about right on.

Given the huge gap between the two 4080s, they could get away with only minor rip-off prices: $600-650 for the 4070, $400-500 for a 4060.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
DLSS 4 will probably be DLSS 3 done right, just as DLSS 2 was DLSS 1 done right. Regardless of current deficiencies, virtual frame generation is the future and will be perfected eventually. It's an open question whether Ada even gets DLSS 4 support, though, given the precedent set by DLSS 3.
At some point virtual frame generation will still be limited by the underlying base frame rate, no? Even if DLSS 4.0 were super duper good at frame generation and could project 9 virtual frames for every 1 key frame, your base frame rate still needs to be sufficiently high or your inputs won't get polled frequently enough. It's not as straightforward as cranking up all the graphics settings, ending up with a base frame rate of 10 fps, and then justifying it by using frame generation to boost it to 100 fps. That will be a bad gameplay experience.

The only way I can see them fixing this is if DLSS also takes in user input between rendered frames so that the motion vectors are constantly adjusted on the fly. This is basically what is already done in VR with timewarp techniques... Ah crap, knowing that this technique already exists but is not mass-marketed leads me to think this is the direction they will take, and then everyone will think Nvidia invented it, like RT...
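
A toy model of the point, with made-up numbers (and it even ignores the extra delay frame generation adds by holding back a frame):

```python
# Toy model: inputs only take effect on real (rendered) frames, so input-to-photon
# latency tracks the real frame interval, not the displayed fps. All numbers invented,
# and this ignores the extra delay frame generation adds by holding back a frame.

def avg_input_latency_ms(real_fps, generated_per_real=0):
    real_interval = 1000.0 / real_fps
    wait_for_next_frame = real_interval / 2   # input lands, on average, mid-interval
    render_time = real_interval               # next real frame that reflects the input
    displayed_fps = real_fps * (1 + generated_per_real)
    return wait_for_next_frame + render_time, displayed_fps

for real_fps, gen in [(10, 0), (10, 9), (60, 0), (60, 1)]:
    latency, shown = avg_input_latency_ms(real_fps, gen)
    print(f"real {real_fps:>2} fps, displayed {shown:>3.0f} fps -> ~{latency:.0f} ms to react")
```

Displayed fps goes from 10 to 100 in the second row, but the time for your click to show up on screen doesn't budge.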
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
I found some extra Nvidia slides regarding DLSS 3... Looks like there is only a small reduction in latency going from DLSS 2 to DLSS 3, even with frame generation enabled, and solely because Reflex is integrated into DLSS 3. If it weren't, I suspect the latency of DLSS 2 and DLSS 3 would be identical:
View attachment 67973
View attachment 67974
View attachment 67975
View attachment 67976
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
I found some extra Nvidia slides regarding DLSS 3... Looks like there is only a small reduction in latency going from DLSS 2 to DLSS 3, even with frame generation enabled, and solely because Reflex is integrated into DLSS 3. If it weren't, I suspect the latency of DLSS 2 and DLSS 3 would be identical:
View attachment 67973
View attachment 67974
View attachment 67975
View attachment 67976

I actually expect latency to be lower with DLSS 2 + Reflex.

I won't be surprised if DLSS 3 games don't support DLSS 2 + Reflex, since that would result in lower latency: inserting fake frames has to have some cost, and they are just hiding that cost behind Reflex.

Everything about DLSS 3 seems like a misleading scam.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
You will actually get better latency with DLSS 2, since DLSS 3 does not grant a 100% speedup outside of certain fringe 100% CPU limited scenarios.

It doesn't really speed anything up, because the added frames are just made-up frames inserted between two actual frames. It will make things look less choppy if the frame rate is bad enough, but it won't improve the feel, because the inserted frames aren't responsive to input.

Frankly, they could generate any number of these frames to give the illusion of whatever frame rate they wanted, as long as the overall cost doesn't shift the bottleneck back to the GPU. All they would need to do is create two (or more) frames at even intervals between the two real frames instead of one.
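
Just to show how trivial the scheduling side is, here's a little sketch (my own illustration, nothing to do with how Nvidia actually generates or paces frames):

```python
# Sketch of the pacing only: N generated frames land at even intervals between two
# real frames. Nothing here touches game state or input; it's purely presentation.

def generated_frame_times(t0_ms, t1_ms, n_generated):
    step = (t1_ms - t0_ms) / (n_generated + 1)
    return [round(t0_ms + step * i, 2) for i in range(1, n_generated + 1)]

# Real frames 33.3 ms apart (30 fps "base"):
print(generated_frame_times(0.0, 33.3, 1))   # [16.65]        -> shown as "60 fps"
print(generated_frame_times(0.0, 33.3, 2))   # [11.1, 22.2]   -> shown as "90 fps"
print(generated_frame_times(0.0, 33.3, 9))   # nine of them   -> shown as "300 fps"
```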
 

HurleyBird

Platinum Member
Apr 22, 2003
2,670
1,250
136
At some point virtual frame generation will still be limited by the underlying base frame rate, no?

No. It would need much tighter integration, but there's no reason you can't switch off just the rendering loop for virtual frames and leave everything else intact. Or, depending on your engine, you might run a subset of the game loop or store inputs in a buffer and reverse predict. And there are definitely other integration approaches I haven't thought of. Heck, there's no fundamental reason you can't completely decouple rendering, except that game engines don't tend to be built that way today. Might make more sense in a future where virtual frame insertion is the norm though.

When I think of "perfected" virtual frame insertion, the first thing that comes to mind is a major reduction in artifacting, and the second is performance improving to the point where the cost to the base framerate is light to negligible. It should be possible to insert an arbitrary number of virtual frames, and also to shut off frame generation entirely based on context. It should be possible to integrate loosely, where the game loop is shut down during virtual frames, or tightly, where the engine can still run during virtual frames but won't render.
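
To be concrete about the tight case, this is roughly the kind of decoupled loop I have in mind (a sketch only; every function passed in here is hypothetical, not any real engine or driver API):

```python
import time

# Rough sketch of decoupling: game logic (and input sampling) ticks at a fixed rate
# regardless of what the GPU is doing, while presentation shows a real frame when
# one is ready and a generated frame otherwise. All callbacks are hypothetical.

LOGIC_HZ = 120
LOGIC_DT = 1.0 / LOGIC_HZ

def run(poll_input, update_game, gpu_ready, render_real_frame,
        present_generated_frame, should_quit):
    previous = time.perf_counter()
    accumulator = 0.0
    while not should_quit():
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # The game loop never waits on rendering; it consumes whole fixed timesteps.
        while accumulator >= LOGIC_DT:
            update_game(poll_input(), LOGIC_DT)
            accumulator -= LOGIC_DT

        # Presentation is decoupled: virtual frames fill in when a real one isn't ready.
        if gpu_ready():
            render_real_frame()
        else:
            present_generated_frame()
```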

This is the future.
 
Last edited:
  • Like
Reactions: Vattila

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
No. It would need much tighter integration, but there's no reason you can't switch off just the rendering loop for virtual frames and leave everything else intact. Or, depending on your engine, you might run a subset of the game loop or store inputs in a buffer and reverse predict. And there are definitely other integration approaches I haven't thought of. Heck, there's no fundamental reason you can't completely decouple rendering, except that game engines don't tend to be built that way today. Might make more sense in a future where virtual frame insertion is the norm though.

When I think of "perfected" virtual frame insertion, the first thing that comes to mind is a major reduction in artifacting, and the second is performance improving to the point where the cost to the base framerate is light to negligible. It should be possible to insert an arbitrary number of virtual frames, and also to shut off frame generation entirely based on context. It should be possible to integrate loosely, where the game loop is shut down during virtual frames, or tightly, where the engine can still run during virtual frames but won't render.

This is the future.


It's not the future. It sounds like you are drinking the NVidia Kool-aid.

This is glorified "Motion Smoothing" or "Frame Creation" like on practically all modern TVs.

An almost universally derided feature.

But even the sketchy TV makers don't pretend that you now get a higher framerate; they just call it motion smoothing or frame creation, which is what this is.

Even if you reduce the artifacts and the cost, it's still mostly useless.

The reactivity can NEVER be better than the real base framerate.

I have no issue with them including a feature like this, but it should be called "Motion Smoothing" or something like that.

It's very disingenuous to sell this as a real frame rate like NVidia is doing.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,670
1,250
136
It's not the future. It sounds like you are drinking the NVidia Kool-aid.

Hardly. My expectations for DLSS 3 are very low, and I expect that it will mostly make sense only in extreme edge cases. Subsequent and better implementations of virtual frame insertion are where the future is, and I expect that eventually it will be ubiquitous across vendors, the same way image reconstruction upscaling is becoming today.

The reactivity can NEVER be better than the real base framerate.

By thinking in terms of a "real base framerate" you're missing the picture altogether. Games today mostly run logic and rendering 1-to-1, but that's not a fundamental constraint. The moment you decouple them, or run one at a multiple of the other, the term loses its meaning. For example, if my game loop runs at 60 FPS while my render loop runs at 30 FPS, then what is the base framerate? Again, there's no reason you cannot run game logic during virtual frames with proper integration. No, DLSS 3 can't do it, but some future implementation of virtual frame insertion will.

It's very disingenuous to sell this as a real frame rate like NVidia is doing.

I'm not, and I agree. DLSS 3 in and of itself is probably not the future. But virtual frame insertion definitely is. Those two statements are not contradictory.
 
  • Like
Reactions: Vattila

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
By thinking in terms of a "real base framerate" you're missing the picture altogether. Games today mostly run logic and rendering 1-to-1, but that's not a fundamental constraint.

It is a fundamental constraint. The whole point of what these systems are doing is adding FAKE frames that have nothing to do with the game logic.

Fake frames don't respond to user input.

They aren't adding anything, they just smooth things out at best.

You could literally add this functionality to the monitor, just like it's in most TVs, which should make it clear that the real base frame rate is the only thing that matters.