Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 134

Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

Tup3x

Golden Member
Dec 31, 2016
1,086
1,084
136
Although it's not the doubling from 2080 to 3080 that Jensen claimed in the presentation. He should have said: double the performance in 2 or 3 titles. On average it seems to fluctuate between 65% and 80%.

That is still VERY impressive, please don't misunderstand me :)
Well, "performance" is a rather vague term. It depends entirely on what you are measuring. Nevertheless, the actual real-world performance gains are impressive, so they did a rather nice job.
 
  • Like
Reactions: lobz

eek2121

Diamond Member
Aug 2, 2005
3,100
4,398
136
AMD will have RTX, but I am not sure about tech similar to Tensor cores and DLSS. They will have to price it accordingly or they will have a problem selling their cards. We will see. I am pretty happy with Ampere, except that it was not built on a 7nm process.

It won’t matter if AMD has a tensor core equivalent or not as long as they price their GPUs appropriately.

The RTX 2070S is faster than the RX 5700 XT, and 80 CUs doesn't mean 2x the performance of Navi 10, because scaling is not linear. It all depends on clocks and IPC gains, if any, compared to RDNA1.

Except the PS5 GPU with 36 CUs at "up to" 2.23 GHz and 135 W power consumption beats the 2070S quite easily.

People are going to be shocked when AMD does the reveal.

Where it does matter is perf/watt, which Nvidia pushed on people for years as being extremely important. It's clear now that they are trying to move people away from thinking that. But it doesn't change the fact that having a GPU dump 350-400 W of heat into your case can cause significant cooling issues for a lot of people.

Most of the hot air is dumped out the back of the GPU. Their solution makes complete sense to me. Who knows better? Engineers at NVIDIA or armchair engineers on AT forums?
 
  • Love
Reactions: spursindonesia

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
The performance gains in RT seem to be quite muted. Maybe in Cyberpunk 2077 it'll be different, but as it stands it's a difference between 70% and maybe 85%.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
The performance gains in RT seem to be quite muted. Maybe in Cyberpunk 2077 it'll be different, but as it stands it's a difference between 70% and maybe 85%.


That is the most surprising thing for me. If anything, I was expecting some serious fraud (read: number massaging) with "up to 666% more performance" claims in one specific scene with RTX + DLSS 3.1 enabled. Yet the opposite has happened -> the raw performance is there, but the RTX/DLSS gains are rather muted?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
@JoeRambo @exquisitechar Damn, it's those speculators talking about a 4x improvement that misled everyone.

I'm assuming maybe it's because current ray tracing implementations are pretty light, which is why I'm thinking Cyberpunk 2077 will gain more. If not, then Nvidia decided balanced gains are better than going "hur dur RT cores!" or something. :p
 
  • Like
Reactions: exquisitechar

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Most of the hot air is dumped out the back of the GPU. Their solution makes complete sense to me. Who knows better? Engineers at NVIDIA or armchair engineers on AT forums?

For the record, I have worked on actual engineering projects. And I'm pretty sure I just sat on a stool at my bench.

And while the FE version may dump a percentage of the heat out the back, the AIB cards do not, as we have already seen from photos of them. And AIB cards typically consume more power than FE cards, while at the same time making up the majority of card sales.
 

Tup3x

Golden Member
Dec 31, 2016
1,086
1,084
136
For the record, I have worked on actual engineering projects. And I'm pretty sure I just sat on a stool at my bench.

And while the FE version may dump a percentage of the heat out the back, the AIB cards do not, as we have already seen from photos of them. And AIB cards typically consume more power than FE cards, while at the same time making up the majority of card sales.
Most AIB cards have, quite frankly, brain-dead designs. I wouldn't be surprised if NVIDIA's cooler ends up being better in real-world use cases (closed case, not a test bench).
 
  • Like
Reactions: Stuka87

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
The RT cores only improve by 70% according to Nvidia. Yup, all those speculators were wrong. 4x was bandied around so often that we took it as if it came straight from Nvidia.

Very, very impressive. I don't remember such a large increase in GPU power gen-on-gen from any GPU company ever. But maybe I'm wrong.

Indeed you are, but you're not to be faulted, since the last time they got that was in 2006 with the GeForce 8800: https://www.anandtech.com/show/2116
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
I honestly don't think AMD will be able to compete with the 3070 and 3080. You guys are being a bit too optimistic considering AMD's track record in graphics cards. I first need to see a price cut on the 5700 XT to $300 to start things, and $200 on the 5600 XT.
Call me optimistic, but I don't see how AMD won't be able to compete with the 3070 and 3080. The 3090 is clearly out of reach, but if the 2080 Ti is currently 50% faster than the 5700 XT at 4K, when the 5700 XT is 251 mm², I don't see how a newer architecture at double the size won't be able to beat the 2080 Ti by 20-30%.
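That scaling argument can be sanity-checked with back-of-envelope arithmetic. The numbers below are the post's own assumptions (5700 XT as a 1.0x baseline, 2080 Ti 50% faster at 4K, a hypothetical part with double the die area), plus some made-up scaling-efficiency factors, since how much of a doubled die turns into performance is unknown:

```python
# Back-of-envelope check of the "double the die" argument (illustrative only).
perf_5700xt = 1.0   # baseline
perf_2080ti = 1.5   # "2080 Ti is 50% faster at 4K" (from the post)

# Try a few hypothetical scaling efficiencies for the doubled die area.
for efficiency in (0.75, 0.85, 0.95):
    big_navi = perf_5700xt * 2 * efficiency
    gap = big_navi / perf_2080ti - 1
    print(f"{efficiency:.0%} scaling: {gap:+.0%} vs 2080 Ti")
```

Under these assumptions, the 20-30% lead predicted above needs roughly 90%+ scaling efficiency from the doubled die; at 75% efficiency the bigger chip would only tie the 2080 Ti.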
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
They are claiming 1.9x perf/watt, so they just chose to push performance at the top end. I wouldn't be surprised if we see some nice high-end mobile GPUs out of Ampere.
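As a rough illustration of why a 1.9x perf/watt claim and a ~70% real-world gain can both be true: the board-power figures below are assumptions (roughly 2080- and 3080-class numbers), not vendor specs:

```python
# If the claimed 1.9x perf/W held at the new card's full power budget,
# the implied speedup would far exceed what the thread is reporting.
old_power = 250.0  # W, assumed previous-gen board power
new_power = 320.0  # W, assumed new-gen board power
claimed_perf_per_watt = 1.9

implied_speedup = claimed_perf_per_watt * (new_power / old_power)
print(f"implied speedup at full power: {implied_speedup:.2f}x")
```

Since the thread pegs the actual gain at 65-80%, the 1.9x figure is presumably measured at a much lower point on the power-efficiency curve, which fits the point that Nvidia chose to push performance at the top end rather than efficiency.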
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
Without the power efficiency though

I always buy performance. OEMs might care a little bit about efficiency.

I had two 7990s in CrossFire. Not the most efficient, would you say? Performance.

I had three 290s in Tri-Fire as well. Performance.

With that said I'm just gonna grab a 3080. I just don't game enough anymore to go ultra high-end this time.
 

blckgrffn

Diamond Member
May 1, 2003
9,300
3,441
136
www.teamjuchems.com
And it wowed everyone at the time. Pascal versus Maxwell was a nice increase. Looks like Nvidia did it again.

Haha, right.

GTX 980 TDP ~ 165W
GTX 1080 TDP ~ 180W

If we had the perf uplift at equivalent watts it would be as impressive. 15 watts is a wash - 125+? That's like bolting an entire extra GTX 980 on.

Adding to the power budget massively to hit the same ratio? Less impressive.
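A crude way to make that concrete is to normalize the gen-on-gen speedup by the change in board power. This pretends performance scales linearly with power (it doesn't), so it only shows the direction and rough size of the correction; the TDPs are the ones quoted above, and the ~60% Pascal speedup is an assumed illustrative figure:

```python
# Power-normalized speedup, Maxwell -> Pascal (TDPs from the post).
tdp_980 = 165.0    # W, GTX 980
tdp_1080 = 180.0   # W, GTX 1080
raw_speedup = 1.6  # assumed ~60% gen-on-gen gain (illustrative)

iso_power = raw_speedup * (tdp_980 / tdp_1080)
print(f"raw: {raw_speedup:.2f}x, power-normalized: {iso_power:.2f}x")
```

By the same first-order logic, a 65-80% Ampere gain at 320 W over an assumed 250 W predecessor shrinks to roughly 1.3-1.4x once the bigger power budget is factored out.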

(which is not to say that I am not still pleased with the performance/price point of the $500 3070)


I was just re-reading that for power numbers and wow, the GTX 980 was pretty much a beast: quiet, cool, fast. Lowest power consumption, highest performance.