Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


Timorous

Golden Member
Oct 27, 2008
1,605
2,746
136
Outside of a vanishingly small and loyal AMD customer base, I just don't see it. If I am settling for an 8GB card, I'd pay the extra $50 or so and get the 4060 with the superior Nvidia software suite. The only way AMD can sell me a card is to offer me more VRAM for less money, as turning down textures is the most unacceptable of all visual compromises IMO. That's how you win my biz, and the reason I bought a 6800: there was nothing close in price with as much performance and VRAM. The Acer A770 almost got my money, but I didn't want to tweak my main gamer that much. I will still grab one when it hits my $300 target.

EDIT: The 6800 XT was only $50 more, but the extra power use was more than I was willing to deal with.

At that price point it is an extra 20%, and that could buy you a bigger SSD, a higher core count CPU, or more system RAM. Also, the 7600 will probably be a bit faster as well.

DLSS at 1080p is a bit rubbish, frame gen works better with higher frame rates, and it uses more VRAM anyway, so I don't see it as a huge advantage.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,446
20,440
146
At that price point it is an extra 20%, and that could buy you a bigger SSD, a higher core count CPU, or more system RAM. Also, the 7600 will probably be a bit faster as well.

DLSS at 1080p is a bit rubbish, frame gen works better with higher frame rates, and it uses more VRAM anyway, so I don't see it as a huge advantage.
Your scale is extremely well balanced for evaluating your purchasing power. 👍 I don't think that will be reflected by the mainstream. DLSS and frame generation are pushed hard = Sega Blast Processing! 😜 Add superior RT, NVENC, Broadcast, DLAA, and the rest of the software suite, and I think it will be similar to the 6600 vs 3050: significantly faster raster, costs less, and is still outsold by team green.
 

Timorous

Golden Member
Oct 27, 2008
1,605
2,746
136
Your scale is extremely well balanced for evaluating your purchasing power. 👍 I don't think that will be reflected by the mainstream. DLSS and frame generation are pushed hard = Sega Blast Processing! 😜 Add superior RT, NVENC, Broadcast, DLAA, and the rest of the software suite, and I think it will be similar to the 6600 vs 3050: significantly faster raster, costs less, and is still outsold by team green.

It is obvious the 4060 will outsell the 7600. AMD could offer the 7600 for $100 and some would still choose the 4060. I just think that at $249 the 7600 will be compelling enough to make it a good product.

Even better would be a 16GB variant at $299.
 
  • Like
Reactions: Lodix

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
Either way, whatever portion of people used to buy every X generations is likely going to wait longer, since Ada doesn't deliver the same perf/$ increase as previous generations. How much of an impact it has on Nvidia's bottom line won't be known until their next two earnings reports.

You can't attribute a change in financial results to some specific theory you would like to confirm in the best of times, let alone during a period of economic malaise where the whole sector is down.

In AMD's most recent report they went into the red with a GAAP net loss of $139 million. What pet theory would you like to confirm using that?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,114
136
4060 is gonna be like red meat to all the 1060 users out there.

$299 equivalent launch price and a ~100% (?) performance bump are nothing to sneeze at for that crowd.
 
  • Like
Reactions: maddie

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,446
20,440
146
I just think that at $249 the 7600 will be compelling enough to make it a good product.
If we play it out, can you sell it to me over a 6700? I am curious what features or performance can be successfully promoted. I can't think of any. For a little more spend I get either more VRAM or the Nvidia software suite.
 
  • Like
Reactions: TESKATLIPOKA

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
It is obvious the 4060 will outsell the 7600. AMD could offer the 7600 for $100 and some would still choose the 4060. I just think that at $249 the 7600 will be compelling enough to make it a good product.

Even better would be a 16GB variant at $299.

16 GB is practically wasted on a 7600. It's really not anything other than a 1080p card, and the types of games where it will have high frame rates are the e-sport titles that don't need 8 GB anyways. The games that won't run well because a card doesn't have more than 8 GB aren't going to run well on this card for other reasons.

The 4060 probably doesn't need 16 GB either, for much the same reason. 12 GB is where both should be to ensure there's adequate room for future titles, as that's what console games are likely to target over the next 4 years.

The 4060 should just have been the 4060 Ti; that unfortunate card should not exist, particularly as an 8 GB card, because it's just going to tarnish titanium of all things. Even the bigger L2 cache won't make up for halving the bus width, especially when it's using slower memory than the GDDR6X 3060 Ti refresh. It can crunch more, but it actually has a lower pixel fill rate than the 3060 Ti at boost speeds. They could have just done a repeat of the 1060 and made a better 16 GB 4060 option instead.
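Back-of-the-envelope, using commonly cited spec-sheet numbers (approximate boost clocks and ROP counts, so treat the outputs as ballpark figures):

```python
# Rough pixel fill rate and bandwidth math, RTX 4060 vs RTX 3060 Ti.
# Numbers are commonly cited reference specs, not official measurements.

def pixel_fill_gpixels(rops, boost_ghz):
    # Peak pixel fill rate = ROPs * boost clock
    return rops * boost_ghz

def bandwidth_gbs(bus_bits, gbps_per_pin):
    # Memory bandwidth = (bus width in bytes) * per-pin data rate
    return bus_bits / 8 * gbps_per_pin

# RTX 4060: 48 ROPs at ~2.46 GHz, 128-bit GDDR6 at 17 Gbps
print(pixel_fill_gpixels(48, 2.46), bandwidth_gbs(128, 17))   # ~118 GPixel/s, 272 GB/s

# RTX 3060 Ti: 80 ROPs at ~1.665 GHz, 256-bit GDDR6 at 14 Gbps
print(pixel_fill_gpixels(80, 1.665), bandwidth_gbs(256, 14))  # ~133 GPixel/s, 448 GB/s
# The GDDR6X refresh runs 19 Gbps: bandwidth_gbs(256, 19) = 608 GB/s
```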
 
  • Like
Reactions: TESKATLIPOKA

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
The 4060 probably doesn't need 16 GB either, for much the same reason. 12 GB is where both should be to ensure there's adequate room for future titles, as that's what console games are likely to target over the next 4 years.

It is true that 12 GB is likely to be the target... but given that it's not clear if they will bother properly implementing DirectStorage, there's certainly the possibility that VRAM requirements are going to blow up. Especially when the CPU demands of doing the decompression are going to hurt worse on older CPUs.
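To illustrate the decompression bottleneck, here's a toy model; every number in it is an assumption for illustration, not a benchmark:

```python
# Toy streaming model: the uncompressed data rate you can actually feed the
# GPU is capped by the slower of the NVMe drive and the decompressor.
# All throughput numbers are illustrative assumptions, not measurements.

def effective_stream_gbs(nvme_gbs, decomp_gbs, compression_ratio=2.0):
    # nvme_gbs / decomp_gbs: compressed-data throughput of the drive
    # and the decompressor; a 2:1 compression ratio is assumed.
    return compression_ratio * min(nvme_gbs, decomp_gbs)

# Older CPU spending one core on decompression (~0.5 GB/s assumed):
print(effective_stream_gbs(5.0, 0.5))  # 1.0 GB/s uncompressed, CPU-bound
# GPU decompression (DirectStorage 1.1 style) keeping up with the drive:
print(effective_stream_gbs(5.0, 5.0))  # 10.0 GB/s uncompressed, drive-bound
```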
 

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
Outside of a vanishingly small and loyal AMD customer base, I just don't see it. If I am settling for an 8GB card, I'd pay the extra $50 or so and get the 4060 with the superior Nvidia software suite.
But you'll probably have to turn on more DLSS 2/3 to get to the same level of performance. I'd rather have more real frames.

And I think that right now, the logical markets for an 8 GB card are people getting a placeholder while waiting for better prices and those with very little money. In both cases it's really valuable to pay less. A placeholder card should be cheap so you can easily move on from it, and if you have very little money, then you shouldn't pay that much more for things like DLSS 3 that work poorly on a lower-tier 1080p card.
 
  • Like
Reactions: coercitiv

Timorous

Golden Member
Oct 27, 2008
1,605
2,746
136
If we play it out, can you sell it to me over a 6700? I am curious what features or performance can be successfully promoted. I can't think of any. For a little more spend I get either more VRAM or the Nvidia software suite.

If both are available, then power and size could swing it for a few people, but the extra 2GB of VRAM on the 6700 is a killer feature.

The reality is the 6700 will go EOL, so they won't be available at the same time.

Vs. the 4060, slightly better raster performance would swing it my way, but I am a native-or-bust kind of person (or DLAA, which I don't see being that viable on the 4060). I don't stream or do any productivity work on my computer, so no advantages there. I do use multi-monitor though, and that does just work better on AMD.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
It is true that 12 GB is likely to be the target... but given that it's not clear if they will bother properly implementing DirectStorage, there's certainly the possibility that VRAM requirements are going to blow up. Especially when the CPU demands of doing the decompression are going to hurt worse on older CPUs.

I'm not sure how DirectStorage matters when it still has to be loaded into VRAM in the first place. Extra VRAM does mean more storage space, so that can be done opportunistically, but even with a limited VRAM pool, you can still replace existing contents faster.

If you have a game that needs more than 8 GB it's going to crop up one way or another. Either texture quality gets sacrificed and visuals degrade or the memory has to swap textures in and there's stutters.

The mid-generation "pro" consoles will probably have the muscle to use 12 GB of textures, so more games are going to treat that as the minimum spec.
 

Timorous

Golden Member
Oct 27, 2008
1,605
2,746
136
16 GB is practically wasted on a 7600. It's really not anything other than a 1080p card, and the types of games where it will have high frame rates are the e-sport titles that don't need 8 GB anyways. The games that won't run well because a card doesn't have more than 8 GB aren't going to run well on this card for other reasons.

16GB is overkill, but you only need to exceed 8GB for it to be of use, and that is happening more and more even at 1080p, so you can get a far smoother or far better IQ experience.

This is just the 4GB vs. 8GB RX 480 again, and the 8GB version had far longer legs.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
If we play it out, can you sell it to me over a 6700? I am curious what features or performance can be successfully promoted. I can't think of any. For a little more spend I get either more VRAM or the Nvidia software suite.

The RX 6700 seems like one of those oddball cards that hardly anyone even knows exists, and when I check for 6700s shipped by Newegg, there are only 3 models, and one of them is 51RISC, which appears to be refurb mining cards or something.

So I don't see these really being relevant in any major way.
 
  • Like
Reactions: DAPUNISHER

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
It is true that 12 GB is likely to be the target... but given that it's not clear if they will bother properly implementing DirectStorage, there's certainly the possibility that VRAM requirements are going to blow up. Especially when the CPU demands of doing the decompression are going to hurt worse on older CPUs.

The vast majority of the PC market still has 8GB of VRAM or less and will for a long time. If a game can't do a decent job in 8GB, it's going to get panned and suffer poor sales.

TLOU's horrible shipping condition was immediately held up as the poster child for VRAM usage increasing, when really it should have been held up as the poster child for horrible shipping conditions.

Plus, its VRAM usage is already in a dramatically better state. Though a lot of the other bugs remain and some new ones were introduced (initial shader compilation is shorter, but now there are shader compilation stutters in game).

 

Timorous

Golden Member
Oct 27, 2008
1,605
2,746
136
The vast majority of the PC market still has 8GB of VRAM or less and will for a long time. If a game can't do a decent job in 8GB, it's going to get panned and suffer poor sales.

TLOU's horrible shipping condition was immediately held up as the poster child for VRAM usage increasing, when really it should have been held up as the poster child for horrible shipping conditions.

Plus, its VRAM usage is already in a dramatically better state. Though a lot of the other bugs remain and some new ones were introduced (initial shader compilation is shorter, but now there are shader compilation stutters in game).


PC ports of console games are just extra gravy. They won't care that much.

If you match the PS5 settings, the 8GB cards still can't handle it, and why would you spend $399 on a GPU that can't even match PS5 settings when the entire console is just $499, or $399 for the digital-only version?
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
PC ports of console games are just extra gravy. They won't care that much.

If they care enough to spend the money porting, they are going to care that it sells well enough to cover the porting cost with profit on top. They cared enough to apologize for TLOU's condition and release multiple patches so far, with likely many more to come.

If you match the PS5 settings, the 8GB cards still can't handle it, and why would you spend $399 on a GPU that can't even match PS5 settings when the entire console is just $499, or $399 for the digital-only version?

You can now use the same High textures as PS5 on an 8GB card, and medium is no longer a big step down.
 
  • Like
Reactions: psolord

Timorous

Golden Member
Oct 27, 2008
1,605
2,746
136
If they care enough to spend the money porting, they are going to care that it sells well enough to cover the porting cost with profit on top. They cared enough to apologize for TLOU's condition and release multiple patches so far, with likely many more to come.



You can now use the same High textures as PS5 on an 8GB card, and medium is no longer a big step down.

You need to use a different texture streaming setting that increases the amount of pop-in.
 

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
If you have a game that needs more than 8 GB it's going to crop up one way or another. Either texture quality gets sacrificed and visuals degrade or the memory has to swap textures in and there's stutters.

Theoretically it would be able to refill the VRAM without a noticeable stutter if it were fast enough.
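"Fast enough" is the catch, though. A rough frame-budget check, assuming an optimistic sustained transfer rate:

```python
# How much texture data can be swapped into VRAM within one 60 fps frame?
# The 12 GB/s sustained host-to-VRAM rate is an optimistic assumption.

frame_s = 1 / 60       # ~16.7 ms frame budget
transfer_gbs = 12.0    # assumed sustained transfer rate, GB/s

per_frame_mb = transfer_gbs * frame_s * 1024
print(f"{per_frame_mb:.0f} MB per frame")           # ~205 MB

# Replacing, say, 2 GB of resident textures all at once at that rate:
print(f"{2.0 / transfer_gbs * 1000:.0f} ms stall")  # ~167 ms, ~10 dropped frames
```

So refills have to happen incrementally and ahead of time, or you feel them.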
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
PC ports of console games are just extra gravy. They won't care that much.

If you match the PS5 settings the 8GB cards still can't handle it and why would you spend $399on a GPU that can't even match PS5 settings when the entire console is just $499 or $399 for the digital only version.

Also, VRAM isn't the only issue. A lot of the recent ports were a mess even on an RTX 4090. So why buy a $1600 GPU that can't even match a PS5?

Given the quality of ports, the only thing that can assure you a 100% console experience is buying a console.
 
  • Like
Reactions: TESKATLIPOKA

Timorous

Golden Member
Oct 27, 2008
1,605
2,746
136
"Normal" showed pop-in.

The "Fast" setting showed no pop-in. He used High + Fast and didn't have any issues on an 8GB card.

He doesn't properly label which scene is using which settings, so the video is useless. Are they running at 1080p, 1440p, or DLSS to 1440p? Some scenes say; in others there is no info.

Also, the patched vs. launch comparison on high looks to me like the high texture settings have seen a slight downgrade. Very subtle, but it looks a little more blurred, especially on that concrete.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
I never said it was ZERO.

But you can't equate GPU fanatics, who spend their days on GPU forums excited for the next GPU, with normal people: people with cards that are several generations old, as has always been the case.

Whether the sub-1% that used to buy cards every generation keeps doing that or not is kind of irrelevant, since they are such a tiny minority of buyers that appeasing them has NEVER been a priority, and has never moved the needle on sales enough to matter.

Correct, "sub 1%" to be precise. World of a difference, obviously.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
16 GB is practically wasted on a 7600. It's really not anything other than a 1080p card, and the types of games where it will have high frame rates are the e-sport titles that don't need 8 GB anyways. The games that won't run well because a card doesn't have more than 8 GB aren't going to run well on this card for other reasons.

The 4060 probably doesn't need 16 GB either, for much the same reason. 12 GB is where both should be to ensure there's adequate room for future titles, as that's what console games are likely to target over the next 4 years.

The 4060 should just have been the 4060 Ti; that unfortunate card should not exist, particularly as an 8 GB card, because it's just going to tarnish titanium of all things. Even the bigger L2 cache won't make up for halving the bus width, especially when it's using slower memory than the GDDR6X 3060 Ti refresh. It can crunch more, but it actually has a lower pixel fill rate than the 3060 Ti at boost speeds. They could have just done a repeat of the 1060 and made a better 16 GB 4060 option instead.
We need to throw out the old rule book. The old VRAM/shader power balance is shifting. It appears that the new techniques being used will change the ratio in favor of more VRAM needed relative to computing power. All the dev interviews I've seen seem to be saying this. I'm talking 1080p here.
 

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
We need to throw out the old rule book. The old VRAM/shader power balance is shifting. It appears that the new techniques being used will change the ratio in favor of more VRAM needed relative to computing power. All the dev interviews I've seen seem to be saying this. I'm talking 1080p here.
I think that's overly dramatic. It's always been true that some things scale with resolution, but many things, like greater viewing distances, do not. Developers are now simply using the available RAM on the PS5/Xbox.
 
  • Like
Reactions: KompuKare

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
He doesn't properly label which scene is using which settings, so the video is useless. Are they running at 1080p, 1440p, or DLSS to 1440p? Some scenes say; in others there is no info.

Also, the patched vs. launch comparison on high looks to me like the high texture settings have seen a slight downgrade. Very subtle, but it looks a little more blurred, especially on that concrete.

You are grasping at straws here. The high textures are the same, and they take the same amount of memory on the same settings.

The difference for High textures is the tunable "Texture Streaming Rate" setting. When the game was released this wasn't tunable, and the rate seemed to be locked at what is now called the "Fastest" setting, which made it stutter on an 8GB card. The new "Fast" setting seems to be tuned very well for 8GB GPUs, enabling high textures without stutter or pop-in, while the new "Normal" setting seems to be the new default, which causes pop-in.
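In other words, the streaming rate acts like a per-frame budget knob: too aggressive and a full 8GB card has to evict mid-frame (stutter), too lazy and low-res mips linger on screen (pop-in). A minimal sketch of the idea; the setting names mirror the menu, but the logic and numbers are hypothetical, not the game's actual code:

```python
# Hypothetical sketch of a per-frame texture streaming budget. Budgets and
# mip sizes are invented for illustration.

STREAM_BUDGET_MB = {"Normal": 16, "Fast": 48, "Fastest": 160}  # per frame

def stream_one_frame(pending_mip_sizes_mb, setting):
    # Upload mips (highest priority first) until the frame budget is spent.
    # Deferred mips show up as temporary pop-in until a later frame takes them.
    budget = STREAM_BUDGET_MB[setting]
    uploaded, deferred, spent = [], [], 0
    for size in pending_mip_sizes_mb:
        if spent + size <= budget:
            uploaded.append(size)
            spent += size
        else:
            deferred.append(size)
    return uploaded, deferred

# "Fast" spreads an 88 MB burst over several frames instead of forcing it all
# through at once (which, on a full 8GB card, means evictions and stutter):
print(stream_one_frame([40, 24, 16, 8], "Fast"))  # ([40, 8], [24, 16])
```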