
Discussion: Ada/'Lovelace'? Next-gen Nvidia gaming architecture speculation


Timorous

Golden Member
Oct 27, 2008
1,977
3,861
136
You are grasping at straws here. The high textures are the same, and they take the same amount of memory on the same settings.

The difference for High textures is the tunable setting for "Texture Streaming Rate". When the game was released this wasn't tunable, and the rate seemed to be locked at what is now called the "Fastest" setting, which made it stutter on an 8GB card, but the new "Fast" setting seems to be tuned very well for 8GB GPUs, enabling high textures without stutter or pop-in. The new "Normal" setting seems to be the default, which causes pop-in.

Not grasping at any straws. The video shows various texture settings and their VRAM usage, but it does not say what resolution that is at. It also states a texture setting without specifying whether it is environment textures or one of the other sub-settings. The assumption would be that High means all of those sub-settings are set to High, but without that being explicitly stated there is room for ambiguity that does not need to be there.

The PS5 runs the game at 1440p High at around 70-80 FPS unlocked, or it can run at 4K High at around 30-45 FPS. That is around RX 6800-tier performance, and the 3070/3070 Ti simply falls away at those same settings; the same will happen to the 4060 Ti 8GB and 4060. In this case the PS5 GPU is punching above its weight, because in raw specs it is roughly an RX 6700 with a bit more VRAM.

DF is as good as many people claim for pixel peeping and console comparisons, but when it comes to PC they seem to make a lot of rudimentary mistakes when presenting information.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,696
3,260
136
For the 4060, it's actually the positive standout of this generation, because it gets a 20% performance bump and a price cut.

I'm pretty happy with the 4060 and will likely get one if the AIB pricing holds to MSRP.
You forgot that it has 4GB less memory.
If the memory size hadn't changed, then it would likely cost $349 (+$50).
So basically an increased price, but still better perf/$, at least at Full HD, and much better power consumption.
The question is how it will perform at higher resolutions vs the RTX 3060 12GB.
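For concreteness, the perf/$ claim works out roughly like this. A back-of-the-envelope sketch, assuming the $329/$299 launch MSRPs and the ~20% uplift quoted in this thread (not measured numbers):

```python
# Rough perf/$ arithmetic for the 4060 vs 3060 claim above.
# Assumed figures: $329/$299 launch MSRPs and a ~20% uplift at 1080p.
msrp_3060, msrp_4060 = 329, 299   # USD launch MSRPs
perf_3060 = 1.00                  # normalize the 3060 to 1.0
perf_4060 = perf_3060 * 1.20      # ~20% bump at 1080p

ppd_3060 = perf_3060 / msrp_3060
ppd_4060 = perf_4060 / msrp_4060
print(f"perf/$ gain: {ppd_4060 / ppd_3060 - 1:.0%}")              # ~32% at Full HD
print(f"at a $349 price: {(perf_4060 / 349) / ppd_3060 - 1:.0%}")  # ~13%
```

So the perf/$ gain is around 32% at the actual $299, and would shrink to roughly 13% at the hypothetical $349 price mentioned above.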

If we compare the current generation, then I think this GPU will be better than N33:
Features: AD107
VRAM: draw
Raster performance: either a draw, or N33 better by a few %
OC: I think AD107, so raster would end up in a draw
RT performance: AD107
Power consumption: AD107
Price: N33
Even if the price difference was $50, that's a pretty small amount considering I would have to build a new PC.
It's not that AD107 is so great; N33 just looks that bad to me.
Next week we will see.

@Aapje : I edited it a bit; it should make more sense now.
 
Last edited:

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
Even if the price difference was $50, that's a pretty small amount considering I would have to build a new PC.
Can you explain what you mean by this? I don't understand why you would have to build a new PC for the 7600, but not the 4060.
 
Mar 11, 2004
23,444
5,852
146
Could the issue at hand be that games are being designed with consoles in mind, where consoles have both more available VRAM than a LOT of dGPUs and dedicated decompression hardware, plus a leg up on asset streaming? The expectation being that DirectStorage and faster PC CPUs will make it a non-issue in the future, with studios being forced to get ports out ASAP now.
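As a rough illustration of why that dedicated decompression path matters, here are some toy streaming numbers. All bandwidths below are assumptions for illustration, not measurements:

```python
# Toy numbers: time to stream a scene's compressed assets with
# CPU-side vs dedicated/GPU decompression. Bandwidths are assumed.
asset_gb = 0.5            # compressed assets needed for a scene
nvme_gbps = 7.0           # raw NVMe read bandwidth
cpu_decomp_gbps = 1.5     # software decompression on a few CPU cores
gpu_decomp_gbps = 20.0    # dedicated hardware / GPU decompression

def stream_ms(decomp_gbps):
    read_s = asset_gb / nvme_gbps       # pull data off the SSD
    decomp_s = asset_gb / decomp_gbps   # then decompress it
    return (read_s + decomp_s) * 1000

print(f"CPU path: {stream_ms(cpu_decomp_gbps):.0f} ms")  # ~405 ms -> stutter/pop-in
print(f"GPU path: {stream_ms(gpu_decomp_gbps):.0f} ms")  # ~96 ms -> hidden by streaming
```

With those assumed figures, the CPU path takes roughly 4x longer to get assets ready, which is the kind of gap that shows up as stutter or pop-in on PC but not on console.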

I don't think there are many real repercussions for pushing out poor ports, since gamers still buy the games, and a poor release state is basically expected at this point.
 
Jul 27, 2020
28,173
19,203
146
What's annoying is that these are major game studios. Game development is their bread and butter, and they still haven't figured out how to make better cross-platform games? Though admittedly, most of these games are from Sony, who probably had no idea they were going to release these games on the PC in the future. It's like one day someone showed them PC sales projections in a PowerPoint slide and the top dog there went, woof! All our AAA titles must get ported to PC. Woof!
 
Jul 27, 2020
28,173
19,203
146
I don't think there are many real repercussions for pushing out poor ports, since gamers still buy the games, and a poor release state is basically expected at this point.
I bet it's almost a fun activity for some, as they try different tricks to make the game run better. It's like tinkering with the engine of a car, without the accidents :D
 

Heartbreaker

Diamond Member
Apr 3, 2006
5,165
6,786
136
Not grasping at any straws. The video shows various texture settings and their VRAM usage, but it does not say what resolution that is at. It also states a texture setting without specifying whether it is environment textures or one of the other sub-settings. The assumption would be that High means all of those sub-settings are set to High, but without that being explicitly stated there is room for ambiguity that does not need to be there.

It was grasping at straws to claim the high-quality textures changed when they were identical. Likely because you didn't pay attention and saw less memory being used.

Initially, he did label them as "Environment Textures", but then he just started using Low, Medium, High as shorthand.

This is a follow-up video, so almost certainly all the settings, other than the textures changed here, are the same as in the original video. Go watch that one. IIRC he was running 1440p with DLSS Quality and High everything except textures, which is the one thing he changed here.

The PS5 runs the game at 1440p High at around 70-80 FPS unlocked, or it can run at 4K High at around 30-45 FPS. That is around RX 6800-tier performance, and the 3070/3070 Ti simply falls away at those same settings; the same will happen to the 4060 Ti 8GB and 4060. In this case the PS5 GPU is punching above its weight, because in raw specs it is roughly an RX 6700 with a bit more VRAM.

DF is as good as many people claim for pixel peeping and console comparisons, but when it comes to PC they seem to make a lot of rudimentary mistakes when presenting information.

But it's NOT falling away because of VRAM now, because that was corrected with the new settings. It's just lower performance because of excellent optimization on the PS5 and poor optimization on the PC. That poor optimization would make it slow whether you had 8GB or 16GB of VRAM.

As I mentioned before, in light of poor PC ports, the only way to guarantee a 100% console experience is to buy a console. More VRAM won't change that.
 

Heartbreaker

Diamond Member
Apr 3, 2006
5,165
6,786
136
You forgot that it has 4GB less memory.
If the memory size hadn't changed, then it would likely cost $349 (+$50).
So basically an increased price, but still better perf/$, at least at Full HD, and much better power consumption.
The question is how it will perform at higher resolutions vs the RTX 3060 12GB.

I didn't forget. I just don't think it matters that much for a card at this price point, and the only reason the 3060 had 12GB was that the bus size forced it. I'd much rather have a 4060 than a 3060. Though I have a couple of 3060s in my Amazon wishlist, and I might buy one of those if a really good sale popped up (under $400 CAD) while I'm waiting.

Given the poor state of PC game launches in general, I wouldn't buy a game until it's been bug-fixed and tested, and I don't obsessively insist on having every game (if I did, I'd need a PC/XB/PS/Switch), so if it's never adequately fixed, I would never buy it. This would apply regardless of how much VRAM I had. I think it's stupid to reward bad ports with an early full-price purchase.

Even if the price difference was $50, that's a pretty small amount considering I would have to build a new PC.
It's not that AD107 is so great; N33 just looks that bad to me.
Next week we will see.

I'm just looking for a card at this point, and if $50 is the difference between otherwise equal AMD/Nvidia cards, then $50 isn't much for the extra Nvidia features. I'm betting RDNA 4 cards will be closer to feature parity, using tech from the Xilinx acquisition.
 

Timorous

Golden Member
Oct 27, 2008
1,977
3,861
136
It was grasping at straws to claim the high-quality textures changed when they were identical. Likely because you didn't pay attention and saw less memory being used.

Initially, he did label them as "Environment Textures", but then he just started using Low, Medium, High as shorthand.

This is a follow-up video, so almost certainly all the settings, other than the textures changed here, are the same as in the original video. Go watch that one. IIRC he was running 1440p with DLSS Quality and High everything except textures, which is the one thing he changed here.



But it's NOT falling away because of VRAM now, because that was corrected with the new settings. It's just lower performance because of excellent optimization on the PS5 and poor optimization on the PC. That poor optimization would make it slow whether you had 8GB or 16GB of VRAM.

As I mentioned before, in light of poor PC ports, the only way to guarantee a 100% console experience is to buy a console. More VRAM won't change that.

5:23 into the video, High vs High. The backpack is so much better in the launch version. You can see the weave of the canvas, whereas in the new version that has gone. The very fine details are more blurred in the new version as well.

[Image: f7jcRAk.jpg]

Tell me honestly that the backpack looks the same to you. Everything about it looks fuzzier in the new version, and there is simply less detail.

5:29: look at the curb; it is very subtle, but it is not as crisp. Or the white brick above the puddle; it has marginally less definition.

5:43: same thing. The divots in the concrete have more banding in the shadow, and the curb again is a little fuzzier. Same with the crack along the bottom; it is less defined. As is the wood texture above, where in the launch version you can see more of the grain vs the new version.

5:47: the concrete on the right of the scene just looks a little worse in the new version, again with slightly less definition.

So yes, they have 100% downgraded the 'High' textures very slightly. Nothing you would notice in gameplay, but when you stop and zoom in there is a difference. It is almost as though they have gone through all of them and saved them at 95% quality to save a small amount of space on each one. On some you can't tell a difference, and on others it is very, very subtle.

EDIT: Got a response on the DF video. This is running with DLSS, and the new version has a different default sharpening level from the launch version. Not sure why they would use DLSS for a texture comparison and introduce yet another variable to muddy up the comparison, but they do some really odd things at times.
 
Last edited:

Heartbreaker

Diamond Member
Apr 3, 2006
5,165
6,786
136
5:23 into the video, High vs High. The backpack is so much better in the launch version. You can see the weave of the canvas, whereas in the new version that has gone. The very fine details are more blurred in the new version as well.

[Image: f7jcRAk.jpg]

Tell me honestly that the backpack looks the same to you. Everything about it looks fuzzier in the new version, and there is simply less detail.

5:29: look at the curb; it is very subtle, but it is not as crisp. Or the white brick above the puddle; it has marginally less definition.

5:43: same thing. The divots in the concrete have more banding in the shadow, and the curb again is a little fuzzier. Same with the crack along the bottom; it is less defined. As is the wood texture above, where in the launch version you can see more of the grain vs the new version.

5:47: the concrete on the right of the scene just looks a little worse in the new version, again with slightly less definition.

So yes, they have 100% downgraded the 'High' textures very slightly. Nothing you would notice in gameplay, but when you stop and zoom in there is a difference. It is almost as though they have gone through all of them and saved them at 95% quality to save a small amount of space on each one. On some you can't tell a difference, and on others it is very, very subtle.

EDIT: Got a response on the DF video. This is running with DLSS, and the new version has a different default sharpening level from the launch version. Not sure why they would use DLSS for a texture comparison and introduce yet another variable to muddy up the comparison, but they do some really odd things at times.

Yes, I can see the difference in the backpack. The game uses DOF, and the brick wall is the focus, not Joel, so Joel gets softened by DOF; a small position difference can affect the DOF blur on areas NOT in focus.

5:29/5:43. You are dreaming. They are obviously the same textures. Placebo and confirmation bias.

It's ludicrous to think they took the textures, degraded them slightly, then saved them at exactly the same size. Why on earth would they do that?
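For what it's worth, a thin-lens circle-of-confusion sketch shows how sensitive out-of-focus blur is to a small position shift. The distances and lens values below are assumptions for illustration, not the game's actual DOF parameters:

```python
# Thin-lens circle of confusion: the wall is the focus target, the
# backpack sits in front of it, and a small camera shift changes the
# backpack's blur visibly. A toy model, not the game's DOF shader.
def coc_mm(focus_mm, subject_mm, focal_len_mm=50.0, f_number=2.8):
    aperture_mm = focal_len_mm / f_number
    return (aperture_mm
            * abs(subject_mm - focus_mm) / subject_mm
            * focal_len_mm / (focus_mm - focal_len_mm))

# wall in focus at 3 m; backpack at 2.0 m vs 2.2 m after a small shift
print(f"{coc_mm(3000, 2000):.2f} mm blur disc")  # ~0.15 mm
print(f"{coc_mm(3000, 2200):.2f} mm blur disc")  # ~0.11 mm, noticeably sharper
```

The point is just that an object off the focal plane blurs more or less depending on exactly where it sits, while the in-focus wall stays sharp in both shots.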
 
Last edited:

Timorous

Golden Member
Oct 27, 2008
1,977
3,861
136
Yes, I can see the difference in the backpack. The game uses DOF, and the brick wall is the focus, not Joel, so Joel gets softened by DOF; a small position difference can affect the DOF blur on areas NOT in focus.

5:29/5:43. You are dreaming. They are obviously the same textures. Placebo and confirmation bias.

It's ludicrous to think they took the textures, degraded them slightly, then saved them at exactly the same size. Why on earth would they do that?

As DF said, they were running DLSS, which has different default sharpening between versions. No idea why they would introduce a variable when they could have easily avoided it, but I guess it means I was seeing an actual difference rather than placebo.
 

Heartbreaker

Diamond Member
Apr 3, 2006
5,165
6,786
136
As DF said, they were running DLSS, which has different default sharpening between versions. No idea why they would introduce a variable when they could have easily avoided it, but I guess it means I was seeing an actual difference rather than placebo.

They are using DLSS because the game requires it to get playable frame rates on a 2070 Super.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,696
3,260
136
I didn't forget. I just don't think it matters that much for a card at this price point, and the only reason the 3060 had 12GB was that the bus size forced it. I'd much rather have a 4060 than a 3060. Though I have a couple of 3060s in my Amazon wishlist, and I might buy one of those if a really good sale popped up (under $400 CAD) while I'm waiting.
It's still a disadvantage compared to the previous generation.
Removing 64 bits of bus + 4GB of VRAM helped keep the cost to $299, but the die would have been too small to fit a 192-bit bus anyway.
The RTX 3060 was not limited to only 12GB; Nvidia could have used 6x 1GB chips for 6GB like they did in laptops. Surprised they didn't do that; was it because of crypto?
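For reference, the capacity options fall straight out of the bus width, since each GDDR6 chip occupies a 32-bit slice of the bus. A minimal sketch, assuming uniform chip density and ignoring clamshell designs that double the chip count:

```python
# GDDR6 capacity arithmetic behind the bus-width argument.
# Standard GDDR6 parts come in 1GB and 2GB densities, 32 bits per chip.
BITS_PER_CHIP = 32

def vram_options(bus_width_bits, densities_gb=(1, 2)):
    """Possible capacities for a bus at standard GDDR6 densities."""
    chips = bus_width_bits // BITS_PER_CHIP
    return {f"{d}GB x {chips} chips": d * chips for d in densities_gb}

print(vram_options(192))  # 3060: 6 chips -> 6GB or 12GB, no 8GB option
print(vram_options(128))  # 4060: 4 chips -> 4GB or 8GB, no 12GB option
```

So a 192-bit card can only be 6GB or 12GB without mixing densities, and a 128-bit card can only be 4GB or 8GB, which is the crux of the back-and-forth below.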
 

Heartbreaker

Diamond Member
Apr 3, 2006
5,165
6,786
136
It's still a disadvantage compared to the previous generation.
Removing 64 bits of bus + 4GB of VRAM helped keep the cost to $299.
The RTX 3060 was not limited to only 12GB; Nvidia could have used 6x 1GB chips for 6GB like they did in laptops. Surprised they didn't do that; was it because of crypto?

It's pretty obvious they didn't think 6GB was enough, so they were essentially forced to 12GB.

8GB they did think was enough, because the 3060 Ti, 3070, and 3070 Ti all have 8GB. Even the 3080 had only 10GB vs 12GB on the 3060. So 12GB was clearly just forced on them by the bus size, as the only reasonable way to get at least the 8GB they considered necessary.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,696
3,260
136
It's pretty obvious they didn't think 6GB was enough, so they were essentially forced to 12GB.

8GB they did think was enough, because the 3060 Ti, 3070, and 3070 Ti all have 8GB. Even the 3080 had only 10GB vs 12GB on the 3060. So 12GB was clearly just forced on them by the bus size, as the only reasonable way to get at least the 8GB they considered necessary.
This doesn't make much sense.
If 6GB was not enough, why did they use it in laptops? It's not like they used only half the memory chips. To keep the 192-bit bus they had to use 6 memory chips, and for 12GB the number would still be the same, only with bigger-capacity chips.
 

jpiniero

Lifer
Oct 1, 2010
16,838
7,284
136
This doesn't make much sense.
If 6GB was not enough, why did they use it in laptops? It's not like they used only half the memory chips. To keep the 192-bit bus they had to use 6 memory chips, and for 12GB the number would still be the same, only with bigger-capacity chips.

It saves money for the OEM.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,049
32,561
146
The RX 6700 seems like one of those oddball cards that hardly anyone even knows exist, and when I check for 6700s shipped by Newegg, there are only 3 models, and one of them is 51RISC, which appears to be refurb mining cards or something.

So I don't see these really being relevant in any major way.
You're not wrong. While our gang is familiar with it, if it isn't on the shelf in stores or their friends don't have one, I doubt most average gamers have any idea the 6700 exists. And indeed, that 51RISC is nothing more than a reconditioned PowerColor Fighter; that is why they blacked out the name in GPU-Z, as the lone review mentions. You can get the 6700 at Amazon as well; the XFX version is $290, $299 at Newegg.

I only used the 6700 for debating purposes because it points out what a wet blanket the 7600 is. It brings nothing worthwhile for budget gamers, IMO. More VRAM was the only feature that could have made it an attractive purchase in mid-2023. The 4060 will crush it worse than the 3050 has done to the 6600.

I am at a loss as to why it seems compelling to anyone at $249. Some models of the 6650 XT have dropped below $225 multiple times and they don't even sell out. As someone else said, it's like AMD isn't even trying anymore.
 
Jul 27, 2020
28,173
19,203
146
I am at a loss as to why it seems compelling to anyone at $249.
Compelling to those who can't stand GeForce or are not too demanding of their cards. A lot of people don't spend time tinkering with settings for the best fps. As long as it works, they are fine with it, and if their previous card was an AMD, they would have little reason to go with a GeForce, unless the AMD card fails them spectacularly.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,049
32,561
146
Compelling to those who can't stand GeForce or are not too demanding of their cards. A lot of people don't spend time tinkering with settings for the best fps. As long as it works, they are fine with it, and if their previous card was an AMD, they would have little reason to go with a GeForce, unless the AMD card fails them spectacularly.
That's the vanishingly small loyal customer base I referred to previously. Beyond that, I should think anyone who prefers AMD already has RDNA2. What's the 7600 got that is going to make me pull the trigger? The 4060 has a long list of features, including exclusive ones. That sells cards. It's why OEMs demand things like calling the B450 a B550A ;)