Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Sometimes a picture speaks louder than words...



*edit*
Disclaimer: don't know if it's real or not... some people say it's fake
*edit2*
Not fake; found the Digital Foundry shill video here
If you compare native to DLSS 2 Performance you get similar results (blurring). If you compare native to DLSS 2 Quality, the image quality is about the same (no blurring). Hence a better comparison against native would be DLSS 3 using Quality, not Performance.

Someone on guru3d nicely added this:
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I know LTT typically tries to keep companies happy (for obvious reasons), but this video really sounds like an apologist's video. Sure, he mentions how the 4080 12GB should probably have been a 4070, but he goes on in several places about how we should be happy that the cards are as cheap as they are?!
 

Saylick

Diamond Member
Sep 10, 2012
3,172
6,407
136
Gamers Nexus has a rather scary and amusing video on how each of the Nvidia partners is marketing Ada Lovelace. Scary because all of the 4090 designs are just ludicrously chonky, and amusing because some brands have used wildly exaggerated marketing language, such as saying the user will have "a new level of brilliance and absolute dark power", and all of the GPU brackets are claimed to have an "anti-gravity design". LOL. I don't know about you guys, but it seems the only "dark power" I'll get is when my home's lights shut off because the computer tripped the breaker. It'll definitely be dark then.

 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Saylick said:
Gamers Nexus has a rather scary and amusing video on how each of the Nvidia partners is marketing Ada Lovelace. Scary because all of the 4090 designs are just ludicrously chonky, and amusing because some brands have used wildly exaggerated marketing language, such as saying the user will have "a new level of brilliance and absolute dark power", and all of the GPU brackets are claimed to have an "anti-gravity design". LOL. I don't know about you guys, but it seems the only "dark power" I'll get is when my home's lights shut off because the computer tripped the breaker. It'll definitely be dark then.


Yeah, it was already posted earlier in this thread: https://forums.anandtech.com/thread...rchitecture-speculation.2588950/post-40849877

But yes, it has some great engrish in it.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
Dribble said:
Sometimes a picture speaks louder than words...
*edit*
Disclaimer: don't know if it's real or not... some people say it's fake
*edit2*
Not fake; found the Digital Foundry shill video here

Download the video and step through it frame by frame, and you can see that if there is any challenge at all, the inserted frames have many glitches. Full-res frame I captured below.

IMO, this is just like what the "Motion Smoothing" setting on some TVs does. When it's firing here, just about every second frame is fake, and if there is anything challenging, you can clearly see the glitches on the fake frames.

In no way should these inserted fake frames be equated with a real frame rate.

It won't react the way a real high-FPS frame rate would, and the inserted frames are clearly interpolated and full of interpolation artifacts.

That's not to say it's of ZERO value; there may be some smoothing benefit in some games, but for the most part I think the value is negligible, and NV is being very misleading in reporting these frame rates as if they were real.

IMO, it's little better than playing a 60 FPS game on a TV with motion smoothing and claiming that it's 120 FPS.

Coming Soon.jpg
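For anyone curious why the fake frames fall apart exactly where there's motion, here's a toy interpolator (pure frame blending; real interpolators, including DLSS 3, warp with motion vectors / optical flow instead, but the failure modes around fast motion and disocclusion are the same in kind). This is a sketch of the general idea, not Nvidia's method:

```python
import numpy as np

def naive_interpolated_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Toy 'motion smoothing': just average the two real frames.

    Anywhere an object moved between the frames, the average shows a ghost
    at both the old and new positions instead of the object in between --
    the same class of glitch you see when stepping frame by frame through
    generated output."""
    return ((prev_frame.astype(np.uint16) + next_frame.astype(np.uint16)) // 2).astype(np.uint8)

# A bright pixel jumps 8 columns between two 16x16 greyscale frames.
prev_f = np.zeros((16, 16), dtype=np.uint8); prev_f[8, 2] = 255
next_f = np.zeros((16, 16), dtype=np.uint8); next_f[8, 10] = 255
mid = naive_interpolated_frame(prev_f, next_f)
print(mid[8, 2], mid[8, 10], mid[8, 6])  # 127 127 0 -> ghosts at both ends, nothing where the object should be
```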
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Saylick said:
Gamers Nexus has a rather scary and amusing video on how each of the Nvidia partners is marketing Ada Lovelace. Scary because all of the 4090 designs are just ludicrously chonky, and amusing because some brands have used wildly exaggerated marketing language, such as saying the user will have "a new level of brilliance and absolute dark power", and all of the GPU brackets are claimed to have an "anti-gravity design". LOL. I don't know about you guys, but it seems the only "dark power" I'll get is when my home's lights shut off because the computer tripped the breaker. It'll definitely be dark then.

Someone tell CERN about these new anti-gravity brackets! I am sure the next market for GPUs will be particle beam steering in the Future Circular Collider (FCC aka LHC 2.0).
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
Nvidia partners offer their RTX 4000 series cards, with some absurd examples.

Damn, we're up to 4-slot coolers now... Screw that. Seems like we're going in cycles between efficient designs and balls-to-the-wall bloated ones. Personally, I'd gladly give up 10% performance if it meant a 30% reduction in heat and noise.
 

Tup3x

Senior member
Dec 31, 2016
965
951
136
Not sure what you are talking about. You can force anisotropic filtering in AMD drivers.
You should probably read what it actually does (or doesn't do).

It literally says that it only works in <=DX9 games. Maybe in OpenGL too.
 

Tigerick

Senior member
Apr 1, 2022
663
540
106
You'll just have to believe me. I've been reading reviews since the late 90's and with big reviews I'd do analysis on them to see scaling and everything.

Approach it logically. A balanced system and game would take advantage of each of the major features equally. 20-30% gains can be had by doubling memory bandwidth. Same with fillrate, and same with shader firepower.

If one generation changes the balance too much, that means there's optimization to be had. Imagine a game/video card where doubling memory bandwidth increased performance by 60-80%. Then you know somewhere in the game and/or GPU, there's a serious bottleneck when it comes to memory. This also means you are wasting resources by having too much shader and fillrate, because you could have had much better perf/$, perf/mm and perf/W ratio.

That's why it's called a Rule of Thumb though. All sorts of real-world analysis has to be done to get the actual value. Game development changes, and quality of the management for the GPU team changes.



There's no such thing as free. To get more performant shader units, generally you need more resources. Optimization is what you need to do better than that, but that requires innovation, ideas and time which doesn't always happen.
Just wondering how much effect PCIe Gen5 has on graphics cards. RX 7000 seems to support Gen5, which would double the bandwidth compared to Gen4 - will it improve overall frame rates?
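A rough back-of-envelope (my own numbers, assuming x16 links and ignoring protocol overhead beyond the 128b/130b line code) puts the raw bandwidth in perspective:

```python
# Per-direction bandwidth of an x16 link, in GB/s.
lanes = 16
for gen, gt_per_s in (("PCIe 4.0", 16), ("PCIe 5.0", 32)):
    gb_per_s = gt_per_s * lanes * (128 / 130) / 8   # 128b/130b encoding, 8 bits per byte
    print(f"{gen} x16: ~{gb_per_s:.0f} GB/s per direction")
# -> PCIe 4.0 x16: ~32 GB/s, PCIe 5.0 x16: ~63 GB/s. Both are a small
# fraction of the hundreds of GB/s a card's own VRAM provides, which is
# why past PCIe scaling tests have generally shown only small average
# frame rate differences; the extra link bandwidth matters most for asset
# streaming and for cards that ship with narrow x8/x4 interfaces.
```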
 

Tup3x

Senior member
Dec 31, 2016
965
951
136
The same is true for Nvidia.
Not true (of course, there can be games where it doesn't work, but in general it does). It can even work in some DX12 games (I used to use it in Horizon Zero Dawn when the in-game AF was broken; it had small issues but was well worth it). It probably doesn't work in Vulkan games.

What I'd like to know: how does Intel handle this, and do they have NVIDIA-like v-sync settings?
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
Tup3x said:
Not true (of course, there can be games where it doesn't work, but in general it does). It can even work in some DX12 games (I used to use it in Horizon Zero Dawn when the in-game AF was broken; it had small issues but was well worth it). It probably doesn't work in Vulkan games.

What I'd like to know: how does Intel handle this, and do they have NVIDIA-like v-sync settings?
If the global setting works on Nvidia, it will work on AMD, and vice versa.

It has to do with how the DirectX flag is handled, and Nvidia does not have any special magic sauce to change that.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
NVidia responding to price complaints:

Falling GPU prices are thing of "the past", Nvidia says


Many had hoped for lower prices - but, according to Nvidia boss Jensen Huang, these costs reflect a world where Moore's law - where performance is doubled for half the price every two years - is now "dead".
 

Tup3x

Senior member
Dec 31, 2016
965
951
136
Leeea said:
If the global setting works on Nvidia, it will work on AMD, and vice versa.

It has to do with how the DirectX flag is handled, and Nvidia does not have any special magic sauce to change that.

Well, show me the proof then; otherwise I'm inclined to believe what the driver tells me. I don't have an AMD card, unlike you (well, I do have a Ryzen APU in my work laptop, but I'm not going to try running games on it).
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,015
136
I will strongly disagree. There are a lot of people who can handle a lot of cognitive dissonance by creating alternative facts and selectively accepting information that preserves their preexisting beliefs. Just look at religion, politics, etc, etc. It is all over the place and it always has been.

Politics, religion, etc. are nebulous or abstract enough, or exist in a world of promises about a far-off future, that you never have to confront reality.

If one card has an extra 10 FPS across a series of 30 games, it's really hard to argue about which is better. At that point it just devolves into a debate on value per dollar, power use, or something else.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,015
136
It will be good for their 4050 or even 4030 part, especially the 1000+ laptop models released with mobile 4050 :D

I don't think that's the case. It sounds like DLSS 3 needs to have the next frame finished in order to generate a fake frame to insert between the two real ones. That works fine if you're CPU-bound and the GPU is being held back, which is what will happen on something like a 4090.

On a 4030 or a 4050, it's the GPU that's more likely to be the bottleneck.
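A toy pacing model (my own simplification, not Nvidia's actual pipeline, with made-up frame times) illustrates why the headroom matters:

```python
def presented_fps(cpu_ms: float, render_ms: float, gen_ms: float = 0.0,
                  frame_gen: bool = False) -> float:
    """Toy model of presented frames per second.

    Without frame generation, output is paced by the slower of the CPU
    (sim + submission) and the GPU render time. With it, every real frame
    is paired with one generated frame, but the GPU now spends
    render_ms + gen_ms per real frame, and real frames still can't arrive
    faster than the CPU produces them."""
    if not frame_gen:
        return 1000.0 / max(cpu_ms, render_ms)
    return 2 * 1000.0 / max(cpu_ms, render_ms + gen_ms)

# CPU-bound flagship case: CPU caps real frames at 80 fps, GPU has headroom.
print(presented_fps(cpu_ms=12.5, render_ms=6.0))                              # 80.0
print(presented_fps(cpu_ms=12.5, render_ms=6.0, gen_ms=3.0, frame_gen=True))  # 160.0
# GPU-bound low-end case: the generation cost comes out of an already
# maxed-out GPU, and half the output is synthetic on top of a low real rate.
print(presented_fps(cpu_ms=5.0, render_ms=25.0))                              # 40.0
print(presented_fps(cpu_ms=5.0, render_ms=25.0, gen_ms=4.0, frame_gen=True))  # ~69
```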
 

Saylick

Diamond Member
Sep 10, 2012
3,172
6,407
136
My apologies if this was already posted (it's new product season, so AT Forums is more active), but it looks like Nvidia gave some more info regarding DLSS 3.0:

1663868892987.png

Latency sees a reduction going from what Nvidia calls Native to DLSS 3.0, but who knows if that just comes from leaving Reflex off for Native and turning it on for DLSS 3.0 to make it look like there's an improvement. Also, where is DLSS 2 in this comparison? Those of us with Ampere won't be able to take advantage of DLSS 3, so I'd like to know how much of the benefit comes from DLSS 3.0 alone, specifically from the Frame Generation feature. If the DLSS 2 results look exactly like the DLSS 3 results except the frame rates are simply half, then we all know what's going on: the higher frame rate comes from inserting those additional "fake" frames.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,899
5,834
136
NVidia responding to price complaints:

Falling GPU prices are thing of "the past", Nvidia says


If that's the case, I can just stick with my 1660 Super until it dies. If GPU prices are going to grow linearly with performance, there's no reason for studios to chase better graphics in games, since people aren't going to buy $1,000 consoles for AAA gaming.
 

Ranulf

Platinum Member
Jul 18, 2001
2,356
1,177
136
Stuka87 said:
I know LTT typically tries to keep companies happy (for obvious reasons), but this video really sounds like an apologist's video. Sure, he mentions how the 4080 12GB should probably have been a 4070, but he goes on in several places about how we should be happy that the cards are as cheap as they are?!

Heh, at 1:38 he says "it's still a lot of power but not an ***-inine amount of power". Yeah, sure, you betcha. This guy below must've gone to the LTT school of hardware handling.

4090itxMOBOfall.gif
 

JustViewing

Member
Aug 17, 2022
135
232
76
Saylick said:
My apologies if this was already posted (it's new product season, so AT Forums is more active), but it looks like Nvidia gave some more info regarding DLSS 3.0:

View attachment 67963

Latency sees a reduction going from what Nvidia calls Native to DLSS 3.0, but who knows if that just comes from leaving Reflex off for Native and turning it on for DLSS 3.0 to make it look like there's an improvement. Also, where is DLSS 2 in this comparison? Those of us with Ampere won't be able to take advantage of DLSS 3, so I'd like to know how much of the benefit comes from DLSS 3.0 alone, specifically from the Frame Generation feature. If the DLSS 2 results look exactly like the DLSS 3 results except the frame rates are simply half, then we all know what's going on: the higher frame rate comes from inserting those additional "fake" frames.
Obviously DLSS is running at a lower resolution, therefore latency will be lower.
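A crude way to see both effects at once (my own illustrative numbers, nothing measured): treat latency as a couple of frames of render time queued in the pipeline; upscaling shrinks the render time, while frame generation has to hold the newest real frame until the interpolated one is shown, which is why Nvidia pairs it with Reflex.

```python
def pipeline_latency_ms(render_ms: float, frames_in_flight: int = 2,
                        holds_frame_for_generation: bool = False) -> float:
    """Crude model: latency ~ (frames in flight) x (frame time)."""
    frames = frames_in_flight + (1 if holds_frame_for_generation else 0)
    return frames * render_ms

print(pipeline_latency_ms(render_ms=25.0))                                   # native-res render: 50 ms
print(pipeline_latency_ms(render_ms=12.0))                                   # upscaled from lower res: 24 ms
print(pipeline_latency_ms(render_ms=12.0, holds_frame_for_generation=True))  # + frame generation: 36 ms
```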
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
Tup3x said:
Well, show me the proof then; otherwise I'm inclined to believe what the driver tells me. I don't have an AMD card, unlike you (well, I do have a Ryzen APU in my work laptop, but I'm not going to try running games on it).
. . .

The driver happily lies to you:
Now that we’ve covered all of the AMD and Nvidia anti-aliasing driver options, we have some bad news: the driver-set options often don’t work.


Your driver settings are just suggestions, nothing more.
 

Saylick

Diamond Member
Sep 10, 2012
3,172
6,407
136
JustViewing said:
Obviously DLSS is running at a lower resolution, therefore latency will be lower.
Precisely why I want the comparison: there's a handful of features all being used simultaneously to make DLSS 3 look super good. Those features are:
- Nvidia Reflex: Should have minor improvements to latency, but still better than nothing
- DLSS Upscaling
- DLSS Frame Generation (exclusive to DLSS 3)

We need to understand the individual contributions of each using the same GPU to fully understand the benefits. Testing everything thoroughly is going to be a reviewer's nightmare.
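Something like the run matrix below would do it on a single 40-series card (a sketch only: the configuration labels are mine, and the latency tool is just an example):

```python
# Each row changes one thing relative to the row above it, so the deltas in
# average fps, 1% lows, and end-to-end latency attribute the gain (or the
# added delay) to a single feature.
ABLATION_MATRIX = [
    {"config": "Native",                           "reflex": False, "upscaling": False, "frame_gen": False},
    {"config": "Native + Reflex",                  "reflex": True,  "upscaling": False, "frame_gen": False},
    {"config": "DLSS 2 Quality + Reflex",          "reflex": True,  "upscaling": True,  "frame_gen": False},
    {"config": "DLSS 3 (upscaling + FG) + Reflex", "reflex": True,  "upscaling": True,  "frame_gen": True},
]

for row in ABLATION_MATRIX:
    print(f'{row["config"]:<32} -> record avg fps, 1% lows, end-to-end latency (e.g. with LDAT)')
```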