Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


jpiniero

Lifer
Oct 1, 2010
14,585
5,208
136

Alleged benchmarks of the 4080 12 GB and 16 GB, apparently provided by Nvidia. The 12 GB is close to the 3090 Ti, but not quite there; this is at 4K, so memory bandwidth might be part of it. The 16 GB is 25-30% faster than the 12 GB.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136

Alleged benchmarks of the 4080 12 GB and 16 GB, apparently provided by Nvidia. The 12 GB is close to the 3090 Ti, but not quite there; this is at 4K, so memory bandwidth might be part of it. The 16 GB is 25-30% faster than the 12 GB.

They are definitely provided by Nvidia, and they line up with the first release numbers. Those first release numbers also line up with the 4090 reviews, so it does seem like this is about where things will land.

Videocardz just put them all together in one table. It really highlights that the 4080 12GB should be a 4070 or 4070 Ti. It's only 10% to 20% faster than a 3080; against the 3070 it's more of a generational upgrade, about 30-40%, but you can't sell a 070-class card for $900...

[Image: RTX4080-PERF-GRAPH-1.png]
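Stacking the quoted deltas gives a rough idea of where the 16 GB lands against the 3080 (a back-of-the-envelope sketch using midpoints of the rumored ranges above, not measured data):

```python
# Back-of-the-envelope stacking of the quoted deltas. The percentages
# are the rumored figures quoted above (midpoints), not measurements.
perf_3080 = 1.00                         # baseline
perf_4080_12gb = perf_3080 * 1.15        # "10% to 20% faster than a 3080"
perf_4080_16gb = perf_4080_12gb * 1.275  # "25-30% faster than the 12 GB"

print(f"4080 16GB vs 3080: {perf_4080_16gb / perf_3080 - 1:.0%} faster")
# -> roughly 47% faster, if the quoted ranges hold
```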
 

jpiniero

Lifer
Oct 1, 2010
14,585
5,208
136
Videocardz just put them all together in one table. It really highlights that the 4080 12GB should be a 4070 or 4070 Ti. It's only 10% to 20% faster than a 3080; against the 3070 it's more of a generational upgrade, about 30-40%, but you can't sell a 070-class card for $900...

The 12 GB probably does do better at lower resolutions.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
They are definitely provided by Nvidia, and they line up with the first release numbers. Those first release numbers also line up with the 4090 reviews, so it does seem like this is about where things will land.

Videocardz just put them all together in one table. It really highlights that the 4080 12GB should be a 4070 or 4070 Ti. It's only 10% to 20% faster than a 3080; against the 3070 it's more of a generational upgrade, about 30-40%, but you can't sell a 070-class card for $900...

[Image: RTX4080-PERF-GRAPH-1.png]
Yeah, the 4090 looks good relative to the 3090/3090 Ti. The cards down the stack, on the other hand, look absolutely terrible, and the 4080 pricing is atrocious. Hoping AMD can provide sane pricing for the mid/budget range.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
DF: DLSS 3 frame generation analysis.

I was very critical of the DF preview of DLSS 3, as it was clear to me they were hamstrung by Nvidia's NDA in what they were allowed to say. That one felt like Nvidia marketing.

This one is much more critical, as the NDA handcuffs are off:

 

gdansk

Platinum Member
Feb 8, 2011
2,078
2,559
136
Nvidia left a large performance gap between the 4080 (16GB) and the 4090, which I bet AMD will fill. So maybe we'll see an early 4080 Ti this time.
 

Yosar

Member
Mar 28, 2019
28
136
76
I don't know if we're just looking at different reviews, but in practice, it doesn't seem to be drawing 100 W more compared to the 3090.

Here we are.

CP2077, RT on, 4K, max settings: 477 W, with spikes to almost 500 W.

Maximum consumption is 468 W against 363 W for the RTX 3090. Over 100 W more, the same as that freak of nature, the 3090 Ti.
With 52% (!) better performance at 4K than the 3090. And Ampere wasn't even good on efficiency.

And I don't care if it only draws 50 W in Minesweeper; I can play Minesweeper on a smartphone. My power supply must be ready for at least 500 W, and mine is not (750 W).
So yeah, despite all the propaganda, I don't see much better efficiency here than in Ampere, even with a much improved process. Something went wrong.

Maybe all these tensor and RT cores are now a bigger burden than a contribution. Who knows.
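Plugging the quoted figures into a quick perf-per-watt check shows why the max-draw numbers look underwhelming (a sketch using only the wattages and the 52% figure above):

```python
# Perf-per-watt sanity check using the figures quoted above.
power_4090 = 468.0   # W, quoted maximum consumption
power_3090 = 363.0   # W, quoted RTX 3090 maximum
perf_ratio = 1.52    # "52% better performance at 4K than the 3090"

power_ratio = power_4090 / power_3090     # ~1.29x the power
perf_per_watt = perf_ratio / power_ratio  # ~1.18x

print(f"Power: +{power_ratio - 1:.0%}, Perf/W: +{perf_per_watt - 1:.0%}")
# At maximum draw this works out to only ~18% better perf/W -- though
# the V-Sync figures quoted in a later reply tell a different story.
```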
 
  • Like
Reactions: Tlh97

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Here we are.

CP2077, RT on, 4K, max settings: 477 W, with spikes to almost 500 W.

Maximum consumption is 468 W against 363 W for the RTX 3090. Over 100 W more, the same as that freak of nature, the 3090 Ti.
With 52% (!) better performance at 4K than the 3090. And Ampere wasn't even good on efficiency.

And I don't care if it only draws 50 W in Minesweeper; I can play Minesweeper on a smartphone. My power supply must be ready for at least 500 W, and mine is not (750 W).
So yeah, despite all the propaganda, I don't see much better efficiency here than in Ampere, even with a much improved process. Something went wrong.

Maybe all these tensor and RT cores are now a bigger burden than a contribution. Who knows.

Also worth noting that the transients on the 4090 are very different from those on the 3090: smaller, but longer-lasting, where the 3090/Ti had larger but much shorter spikes. This is per the testing GN did.
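Those transient shapes matter for PSU sizing. A toy illustration (the multipliers here are hypothetical placeholders, not GN's measured values):

```python
# Toy PSU-headroom estimate. The transient multipliers below are
# hypothetical placeholders for illustration, NOT measured values.
def psu_needed(gpu_sustained_w: float, transient_mult: float,
               rest_of_system_w: float = 200.0) -> float:
    """Rough PSU wattage needed to ride out GPU transient spikes."""
    return gpu_sustained_w * transient_mult + rest_of_system_w

print(f"{psu_needed(450, 1.2):.0f} W")  # smaller, longer transients
print(f"{psu_needed(450, 1.6):.0f} W")  # taller, shorter spikes
```

The takeaway is that the spike shape, not just the peak number, determines how much headroom a given supply actually needs.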
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
Here we are.

CP2077, RT on, 4K, max settings: 477 W, with spikes to almost 500 W.

Maximum consumption is 468 W against 363 W for the RTX 3090. Over 100 W more, the same as that freak of nature, the 3090 Ti.
With 52% (!) better performance at 4K than the 3090. And Ampere wasn't even good on efficiency.

And I don't care if it only draws 50 W in Minesweeper; I can play Minesweeper on a smartphone. My power supply must be ready for at least 500 W, and mine is not (750 W).
So yeah, despite all the propaganda, I don't see much better efficiency here than in Ampere, even with a much improved process. Something went wrong.

Maybe all these tensor and RT cores are now a bigger burden than a contribution. Who knows.

Pretty amazing if you turn on V-Sync with a 60 Hz monitor.

It consumes just 76 W, 3 W less than a 6500 XT.
[Image: power-consumption.png]


And it is the most efficient card they have tested; you can see it's a fairly big jump:


[Image: energy-efficiency.png]
 
  • Like
Reactions: Vattila

blckgrffn

Diamond Member
May 1, 2003
9,123
3,057
136
www.teamjuchems.com
Does Nvidia have an equivalent of Radeon Chill? I cap many non-demanding games at ~90 fps via the drivers, because there isn't really any benefit to running them at max speed once my personal threshold for response time is met.

This should be a reasonably good way of tamping down the transients too. It's tough, though, because it's a title-by-title thing to tweak.
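For what it's worth, the core of any fps cap is trivial; something like this minimal sketch (driver-level limiters like Chill pace more intelligently, e.g. ramping the cap with input activity):

```python
import time

def run_capped(render_frame, fps_cap=90, frames=1000):
    """Minimal frame limiter: render, then sleep off the rest of the
    frame budget so the GPU idles instead of racing ahead."""
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                  # your game's frame goes here
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)        # GPU/CPU idle -> lower draw

run_capped(lambda: None, fps_cap=90, frames=3)  # trivial smoke test
```

Capping also bounds how hard the GPU ramps between frames, which is presumably why it helps with transients too.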
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,451
20,462
146
Does Nvidia have an equivalent of Radeon Chill? I cap many non-demanding games at ~90 fps via the drivers, because there isn't really any benefit to running them at max speed once my personal threshold for response time is met.

This should be a reasonably good way of tamping down the transients too. It's tough, though, because it's a title-by-title thing to tweak.
I use Chill, cool feature. On my 3060 Ti you do it through the control panel, where it's just called a frame rate limiter.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,057
136
www.teamjuchems.com
I use Chill, cool feature. On my 3060 Ti you do it through the control panel, where it's just called a frame rate limiter.

I appreciate that Chill is easily and obviously enabled when you start a game via the Radeon Overlay, and that you can access it through the Overlay at any time. If Nvidia doesn't have it that easy yet, I hope they get there soon. The Radeon Overlay stuff has been a really great feature for the last couple of years, IMO.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
They are all over eBay for $2500-$4000. I mean lol, come on. It seems like basically everyone who bought one of these did so just to scalp it, haha. eBay is flooded with them. I think next time Nvidia needs to just charge $3500 from the start.
 

repoman0

Diamond Member
Jun 17, 2010
4,471
3,311
136
They are all over eBay for $2500-$4000. I mean lol, come on. It seems like basically everyone who bought one of these did so just to scalp it, haha. eBay is flooded with them. I think next time Nvidia needs to just charge $3500 from the start.

You should buy them all up, wait 29 days, and then return them opened and used with an “item not as described” claim.
 
  • Like
Reactions: Lodix

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
They are all over eBay for $2500-$4000. I mean lol, come on. It seems like basically everyone who bought one of these did so just to scalp it, haha. eBay is flooded with them. I think next time Nvidia needs to just charge $3500 from the start.
Eh, when I looked online this morning there were 4090s available. They're gone now, but it's not like a year ago, when your only realistic option was Best Buy and you had to camp Discord channels, have all your payment details preloaded into the Best Buy site just to have a chance at a card, and be on the site within 15 seconds of the notification hitting. If you were lucky enough to live near a Micro Center, you could also camp out at 4 AM on the off chance it was one of the video card delivery days.

I don't see that happening now. Sure, they're sold out, but they were actually available this morning and you didn't even need to camp Discord. I suspect that after the initial "I must have it" wave is over there will be plenty of availability. I do think there may be availability issues for the midrange/budget stuff, though, as there will be a lot more demand for those.
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
They are all over eBay for $2500-$4000. I mean lol, come on. It seems like basically everyone who bought one of these did so just to scalp it, haha. eBay is flooded with them. I think next time Nvidia needs to just charge $3500 from the start.

I want one but have zero desire to fight with these scalpers. I'll wait until they are readily available.
 
  • Like
Reactions: Tlh97 and Leeea

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
HWUB analyzes DLSS 3 frame interpolation. Tim has a similar skeptical view to my own.

Tim notes that between the visual artifacts and the increased latency, it's really hard to come up with a case where you would use this.
You need to run at a very high base framerate (~120 FPS) before you remove much of DLSS 3's penalties, at which point you need it less; and if your frame rate then exceeds your monitor's refresh rate, you will get tearing.
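The latency side is easy to put rough numbers on (a simplification: interpolation has to hold back the newest real frame, so it adds roughly one native frame time):

```python
# Rough extra latency from frame interpolation: the newest real frame
# is held back so an intermediate frame can be generated between it
# and the previous one (a simplification; real pipelines differ).
for base_fps in (40, 60, 120):
    extra_ms = 1000 / base_fps  # ~one native frame time held back
    print(f"{base_fps:>3} fps base -> ~{extra_ms:.1f} ms added, "
          f"~{2 * base_fps} fps displayed")
```

At a 120 FPS base the added latency is small, but the ~240 fps output then overruns most monitors, hence the tearing point above.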

He also says we really shouldn't consider this "real performance".

IMO, it's a mediocre "solution" in search of a problem:
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Both iterations of the 4080 look abysmal for the performance at their prices. They delivered huge on the 4090 and priced it accordingly. The 4080 12GB is for sure going to be slower than the 3090 Ti/3090 in quite a few games at 4K; not sure what they're thinking there price-wise. I get the 4090, and will probably wind up getting one once they are in stock and not scalped, even at that price, it is so fast. But the 4080 looks awful.

The 4090 also has quite a few disabled units, leaving lots of room for a 4090 Ti. We'll have to see if they are putting aside chips with further disabled units for a 4080 Ti of some sort. This launch feels to me like one of those where Nvidia ends up adjusting prices downward in a few months. If those scalping POS end up holding the bag on 4090s, that will be a good indicator that prices may come down. The 4090 scalpers are testing the waters; there is no mining market this time around. How many gamers are up for spending $3K on a scalped gaming GPU, I don't know. Hopefully they get screwed over hard and there is no market for those rip-off scalped prices.
 