'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much of a gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 60 fps at 4K, ideally at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for; just interested in the forum members' thoughts.
 

Glo.

Diamond Member
Apr 25, 2015
GA104 is 220W at 95% of the performance of a 2080 Ti? The TDP of a 2080 Ti is 260W, so that doesn't seem like all that big of a perf/W improvement...
Again: KittyCorgi claimed it's a 3072-ALU GPU with 95% of RTX 2080 Ti performance.

According to Kopite, that 3072-ALU GPU with GDDR6X is the RTX 3070 Ti, and it has a 250W TDP, not 220W. For comparison, the RTX 2080 has 2944 ALUs and a 215W TDP. The RTX 3070 has plain GDDR6, also according to Kopite.
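A quick back-of-the-envelope check, treating TDP as a rough proxy for real power draw (which it isn't exactly, and the 95% performance figure is itself a rumor), shows why the perf/W story hinges on which TDP rumor is right:

```python
# Rough perf/W comparison for the rumored 3072-ALU part vs. the RTX 2080 Ti.
# Illustrative only: TDP is a proxy for actual power draw, and the 95%
# performance figure is itself a rumor.
perf_vs_2080ti = 0.95   # rumored performance relative to the 2080 Ti
tdp_2080ti = 260        # W, RTX 2080 Ti reference TDP

for tdp in (220, 250):  # the two rumored TDPs discussed above
    gain = (perf_vs_2080ti / tdp) / (1.0 / tdp_2080ti)
    print(f"at {tdp} W: {gain:.2f}x the 2080 Ti's perf/W")

# at 220 W: 1.12x -> a modest ~12% improvement
# at 250 W: 0.99x -> essentially no improvement at all
```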
 

Glo.

Diamond Member
Apr 25, 2015
fresh today:



The 3090 is ~56% faster than the 2080Ti.

The 3090 is ~18% faster than the 3080.

The 3080 is ~33% faster than the 2080Ti.

The 3080 is ~70% faster than the 2080.


But hey, someone will insist it's only 35% because reasons ..
Looking at how Time Spy has historically scaled on GPUs, expect more like a 35-40% increase in games from the RTX 2080 Ti to the 3090.

The 3080 therefore could be around 10-15% (20% in the best-case scenario) faster than the RTX 2080 Ti in games.
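For what it's worth, the quoted deltas are at least internally consistent; a quick check, taking the leak's numbers as given rather than as verified results:

```python
# Sanity check on the quoted Time Spy ratios (leak figures, not measurements).
r_3090_vs_2080ti = 1.56
r_3090_vs_3080 = 1.18
r_3080_vs_2080ti = 1.33
r_3080_vs_2080 = 1.70

# The 2080 Ti-relative figures should agree with each other:
print(r_3090_vs_3080 * r_3080_vs_2080ti)  # ~1.57, matches the quoted 1.56

# And they imply a 2080 Ti vs. 2080 gap of:
print(r_3080_vs_2080 / r_3080_vs_2080ti)  # ~1.28, in line with Turing reviews
```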
 

Krteq

Golden Member
May 22, 2015
Irregularly shaped PCB seems to be confirmed.

[Image: NVIDIA GeForce RTX 30-series PCB]



VCZ - NVIDIA confirms 12-pin power connector and V-shaped PCB for GeForce RTX 30 series
 

A///

Diamond Member
Feb 24, 2017
Am I missing an inside joke? It's an interesting video. What surprised me the most was how the fans work, because it wasn't what most people thought it would be. It's a great marketing video for future-proofing high-energy-use cards, of course, but I'm more curious how it'll affect case temperatures relative to CPU temps.
 

DXDiag

Member
Nov 12, 2017
Looking at how Time Spy has historically scaled on GPUs, expect more like a 35-40% increase in games from the RTX 2080 Ti to the 3090.
Nope. Time Spy Extreme is a decent indicator for general scaling. Expect even more than 57% in games.
 

Stuka87

Diamond Member
Dec 10, 2010
nVidia video about how amazing nVidia is and how amazing the Ampere cooler is.

Did you watch the video? The Ampere cooler is not shown anywhere. We do see the power connector and the PCB, but that's it. This is basically just a general "cooling video cards is hard" video. It has some mildly interesting bits, but that's all, really.

EDIT: Or perhaps you forgot your /s at the end ;)
 

CastleBravo

Member
Dec 6, 2019
Did you watch the video? The Ampere cooler is not shown anywhere. We do see the power connector and the PCB, but that's it. This is basically just a general "cooling video cards is hard" video. It has some mildly interesting bits, but that's all, really.

EDIT: Or perhaps you forgot your /s at the end ;)

This looks like Ampere to me.

[Image: purported Ampere PCB]
 

Saylick

Diamond Member
Sep 10, 2012
Nvidia actually releasing a video on how designing a thermal solution is hard, especially now of all possible times, really sounds like they are trying to do expectation management before people realize how enormous the TDPs of their new cards are. Total damage control moment right here.
 

A///

Diamond Member
Feb 24, 2017
Nvidia actually releasing a video on how designing a thermal solution is hard, especially now of all possible times, really sounds like they are trying to do expectation management before people realize how enormous the TDPs of their new cards are. Total damage control moment right here.
Which is really funny considering they were bragging about efficiency a few generations ago.
 

Kenmitch

Diamond Member
Oct 10, 1999
Which is really funny considering they were bragging about efficiency a few generations ago.

Maybe they're going for a clean sweep this go-around: price, power usage, temps, and performance... Oops, forgot size. Size matters!
 

raghu78

Diamond Member
Aug 23, 2012
Nvidia actually releasing a video on how designing a thermal solution is hard, especially now of all possible times, really sounds like they are trying to do expectation management before people realize how enormous the TDPs of their new cards are. Total damage control moment right here.

Obviously, when they are about to deliver the hottest single-GPU card ever, beating the GTX 480 by a fair margin in terms of TDP, this kind of marketing and PR exercise is bound to happen. This GPU generation will change the perception that Nvidia designs the most efficient GPU architectures and that AMD cannot compete against them. I am left wondering what performance improvement Nvidia will deliver in Max-Q notebook GPUs if their perf/W improvement is going to be small. That Samsung 8nm process might have a far worse V/f curve than recent TSMC processes, which have been simply fantastic in the lower portions of the V/f curve.
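To put the V/f-curve point in concrete terms: dynamic power scales roughly as C·V²·f, so a process that needs more voltage to hit the same clock pays a quadratic power penalty. A minimal sketch, with voltages invented purely for illustration (not real Samsung 8nm or TSMC N7P figures):

```python
# Illustrative sketch of why the V/f curve matters for perf/W.
# Dynamic power scales roughly as P ~ C * V^2 * f; the voltages below
# are made up for illustration, not real Samsung 8nm / TSMC N7P data.
def relative_dynamic_power(voltage, freq_ghz, ref_voltage=0.80, ref_freq=1.8):
    """Dynamic power relative to a (ref_voltage, ref_freq) baseline."""
    return (voltage / ref_voltage) ** 2 * (freq_ghz / ref_freq)

# Same 1.9 GHz target, but one process needs 0.90 V where another needs 0.80 V:
print(relative_dynamic_power(0.80, 1.9))  # ~1.06x baseline power
print(relative_dynamic_power(0.90, 1.9))  # ~1.34x -> ~27% more power, same perf
```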
 

Mopetar

Diamond Member
Jan 31, 2011
3080 is too expensive. 3060/3070 need 10GB/11GB. 1070 had 8GB four years ago for $379.

I know we've had a lot of speculation and even a few leaks of varying degrees of believability, but we don't have final prices yet.

It's hard to say the 3080 is too expensive when we're guessing on both price and performance. I suppose Nvidia has been able to surprise with higher numbers for a while now, so there's a certain amount of expectation.


Which is really funny considering they were bragging about efficiency a few generations ago.

Every company talks up the importance of whatever their strong suit happens to be. If they have raw performance then expect them to push that.

Last year's marketing has no bearing on this year's marketing. Trying to bring it up to them is pointless. The people in marketing are slippery as eels.
 

Saylick

Diamond Member
Sep 10, 2012
Obviously, when they are about to deliver the hottest single-GPU card ever, beating the GTX 480 by a fair margin in terms of TDP, this kind of marketing and PR exercise is bound to happen. This GPU generation will change the perception that Nvidia designs the most efficient GPU architectures and that AMD cannot compete against them. I am left wondering what performance improvement Nvidia will deliver in Max-Q notebook GPUs if their perf/W improvement is going to be small. That Samsung 8nm process might have a far worse V/f curve than recent TSMC processes, which have been simply fantastic in the lower portions of the V/f curve.
Yeah, I'm not surprised that Nvidia even made that video, because they are the kings of marketing after all. They are selling the concept as "Our goal was to push as much power, and thus performance, out of our cards as possible, so we had to jump through extra hoops to make it work." The average consumer probably sees that as, "Oh cool, Nvidia wants to give us more performance no matter the cost and had to design a new cooler to enable that," while others like myself see it as, "Oh, we couldn't improve perf/W enough for our new generation of cards to have a big enough performance delta over the last generation, so we had to crank up the clocks and power to get there." I have no doubt in my mind that if Nvidia had sourced TSMC N7P for its consumer Ampere GPUs, they wouldn't be in this dilemma. In my mind, there is easily a 20% perf/W improvement between Samsung 8nm and TSMC N7P, or enough that the RTX 3090 could have been a 300W card, not a 350W one.
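Working that last claim through (the 20% figure is the post's assumption, not a confirmed number):

```python
# Assumption-based sketch: if TSMC N7P delivered ~20% better perf/W than
# Samsung 8nm at the same performance target, the same card's power budget
# would shrink proportionally.
tdp_samsung_8nm = 350      # W, rumored RTX 3090 TDP
perf_per_watt_gain = 1.20  # assumed N7P advantage, per the post above

tdp_on_n7p = tdp_samsung_8nm / perf_per_watt_gain
print(f"{tdp_on_n7p:.0f} W")  # ~292 W, i.e. roughly the 300 W class claimed
```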
 