Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much of a gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed-up RTX' options at the top?)
Will the top card be capable of more than 4K60, at least 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, just interested in forum members' thoughts.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
That perf/watt figure can be very misleading, considering the Series X GPU is running right at the perf/watt sweet spot while the 5700 XT is pushed a bit out of it in order to be competitive against its Nvidia counterparts.
Who knows whether adding just 100 MHz to the Series X GPU would push the power figure above 200 W?

Even if you take mobile GPUs like the RX 5500M (22 CUs, 1408 SPs, game clock 1448 MHz, 4.08 TF, 85 W), the Xbox Series X is 1.7x the perf/watt of mobile RDNA. So that's a fair comparison, as you can consider the Series X to be a mobile GPU which is optimized for the sweet spot on the V/F curve.
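As a rough cross-check, here is a minimal sketch of that 1.7x arithmetic, assuming the announced 12.15 TF Series X GPU spec (52 CUs at 1825 MHz) and the ~150 W GPU-plus-GDDR6 estimate that comes up later in this thread; the power figure is an estimate, not an official number:

```python
# Rough cross-check of the perf/W comparison above. The 12.15 TF Series X
# figure is the announced spec (52 CUs @ 1825 MHz); the ~150 W figure is this
# thread's own estimate for GPU + GDDR6, not an official power number.

def tflops_per_watt(tflops: float, watts: float) -> float:
    return tflops / watts

rx5500m  = tflops_per_watt(4.08, 85)     # mobile RDNA1 data point from the post
series_x = tflops_per_watt(12.15, 150)   # assumed ~150 W for GPU + 16 GB GDDR6

print(round(series_x / rx5500m, 2))      # ~1.69, i.e. roughly the quoted 1.7x
```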

When asked whether the PS5 GPU will actually be able to run at 2.23 GHz most of the time, Sony's Mark Cerny stated that reducing the GPU frequency by 10% cuts power by roughly 27%.


Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
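To make the non-linear relationship concrete, here is a minimal sketch assuming dynamic power scales as P ~ f*V^2 with voltage scaling roughly in proportion to frequency near the top of the curve; that cubic approximation is a textbook first-order model, not something Cerny has confirmed:

```python
# First-order model of power vs. clock near the top of the V/F curve.
# Assumes P ~ f * V^2 and V ~ f (an approximation, not a Cerny/AMD figure).

def relative_power(freq_scale: float) -> float:
    """Power relative to baseline when the clock is scaled by freq_scale."""
    voltage_scale = freq_scale            # assumed to track frequency
    return freq_scale * voltage_scale ** 2

print(relative_power(0.90))    # ~0.729 -> a 10% clock drop saves ~27% power
print(relative_power(0.965))   # ~0.899 -> a ~3.5% clock drop saves ~10% power
```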
 
  • Like
Reactions: Saylick

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Why exactly? I thought a big reason for going with Samsung was that there would be no lack of wafer supply. I actually expect the opposite this time. If AMD has a true competitor this round, then it's in Nvidia's best interest to flood the market with cards.
Corona + high demand. I think these cards will be hard to get for at least a few months.
 

JasonLD

Senior member
Aug 22, 2017
487
447
136
Even if you take mobile GPUs like the RX 5500M (22 CUs, 1408 SPs, game clock 1448 MHz, 4.08 TF, 85 W), the Xbox Series X is 1.7x the perf/watt of mobile RDNA. So that's a fair comparison, as you can consider the Series X to be a mobile GPU which is optimized for the sweet spot on the V/F curve.

Also, I wouldn't solely rely on TF numbers and perf/watt to estimate possible performance figures for top-of-the-line Navi. All the TFLOPS numbers from RDNA2 might not translate to actual performance figures in terms of traditional rasterization performance.
 
  • Like
Reactions: NTMBK

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Also, I wouldn't solely rely on TF numbers and perf/watt to estimate possible performance figures for top-of-the-line Navi. All the TFLOPS numbers from RDNA2 might not translate to actual performance figures in terms of traditional rasterization performance.

[Image: AMD RDNA2 efficiency improvements slide]


This is what AMD has said about RDNA2. We will see how RDNA2 vs Ampere plays out. Till then I disagree with you.
 

Attachments

  • AMD RDNA2 Efficiency improvements.png (646.7 KB)
  • Like
Reactions: Saylick

DiogoDX

Senior member
Oct 11, 2012
747
279
136
We don't know the XBOX power and people are already extrapolating the Big Navi power from it. Only thing we know is that it has a 315W PSU. Actual power consumption could be as high as 250W. Just because the ONE X peaks at 175-180W doesn't mean that the Series X will be the same.
 
  • Like
Reactions: FaaR

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Can't wait for the 3000 series to come out so I can replace the temporary AMD 580 I'm using (I sold my 2080 Ti), which keeps black-screening and freezing up my system. Hopefully there's good competition and more reasonable pricing; we'll see.

Seems like every AMD GPU launch is hype, hype, hype and then poor execution. Hopefully more $$$ is being thrown into that division and AMD regains some of its former glory in the space.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
We don't know the XBOX power and people are already extrapolating the Big Navi power from it. Only thing we know is that it has a 315W PSU. Actual power consumption could be as high as 250W. Just because the ONE X peaks at 175-180W doesn't mean that the Series X will be the same.


[Image: Xbox Series X power supply label]

First, if you look at the Series X teardown and do some basic analysis, you come to a similar conclusion. Dual-board design. Dual PSU rails: +12V 21.5A and +12V 5A. The mainboard, with the SoC, 16 GB GDDR6 memory, onboard 1 TB NVMe, external NVMe connector and HDMI 2.1, connects to the +12V 21.5A rail. The daughter I/O board connects to the other +12V 5A rail.

90% PSU efficiency and assuming a 90% load (worst case) gives 206 W.
8C/16T at 3.66 GHz draws 54 W, going by Renoir measurements.
4 W for the onboard 1 TB NVMe + 4 W for the external NVMe + 2 W for HDMI.

Roughly 150 W for the GPU + 16 GB GDDR6 (asymmetric setup, with 10 GB at 560 GB/s and 6 GB at 336 GB/s). The GPU on its own is drawing roughly 120 W.
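For anyone wanting to follow the arithmetic, here is the same budget written out as a minimal sketch; it simply reuses the figures above (the 90% PSU efficiency step is questioned a few posts further down):

```python
# Power-budget arithmetic as laid out in the post above (the post rounds the
# budget to ~206 W and the remainder to ~150 W).

rail_watts = 12 * 21.5             # +12V 21.5A rail -> 258 W rated DC output
budget = rail_watts * 0.90 * 0.90  # the post's 90% efficiency x 90% load factors

cpu_w  = 54                        # 8C/16T @ 3.66 GHz, from Renoir measurements
nvme_w = 4 + 4                     # onboard + external NVMe
hdmi_w = 2

gpu_plus_gddr6 = budget - cpu_w - nvme_w - hdmi_w
print(round(budget), round(gpu_plus_gddr6))   # ~209 W budget, ~145 W for GPU + 16 GB GDDR6
```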
 
Last edited:

MrTeal

Diamond Member
Dec 7, 2003
3,614
1,816
136
First, if you look at the Series X teardown and do some basic analysis, you come to a similar conclusion. Dual-board design. Dual PSU rails: +12V 21.5A and +12V 5A. The mainboard, with the SoC, 16 GB GDDR6 memory, onboard 1 TB NVMe, external NVMe connector and HDMI 2.1, connects to the +12V 21.5A rail. The daughter I/O board connects to the other +12V 5A rail.

90% PSU efficiency and assuming a 90% load (worst case) gives 206 W.
8C/16T at 3.66 GHz draws 54 W, going by Renoir measurements.
4 W for the onboard 1 TB NVMe + 4 W for the external NVMe + 2 W for HDMI.

Roughly 150 W for the GPU + 16 GB GDDR6 (asymmetric setup, with 10 GB at 560 GB/s and 6 GB at 336 GB/s). The GPU on its own is drawing roughly 120 W.
If the output of the PSU is rated at 21.5A on that rail, it doesn't matter what the PSU efficiency is; there's a rated 258 W available to the motherboard.
 
  • Like
Reactions: Campy

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Here we go again... After all the hype, the NV-killer guerrilla marketing, Glo's raving (I do miss RussianSensation), Nvidia are going to announce and launch their cards before AMD has even gotten a finalized design down (for all we know).

Execution machines, those guys. It will be even sadder if they actually are 100% on the Samsung 8/10nm train for their consumer parts. Working with a brand new partner for large dies and still getting the stack out the door before we even hear a peep from AMD.
If you have an inferior product, it is very important to release it faster than your competition.
 
  • Haha
  • Like
Reactions: Mopetar and raghu78

Konan

Senior member
Jul 28, 2017
360
291
106
If you have an inferior product, it is very important to release it faster than your competition.

First movers attract eager buyers; second movers usually go when demand begins to grow, and they have an inferior product trying to build off the success of the first.
In this case, with Nvidia we're already seeing smart marketing with the '21 days for 21 years' approach. A 3-week build-up to an announcement followed by a launch has all been planned out. Product segmentation over 30/60/90-day plans. No fear. The market is most anxious for a product right now, and that is when you launch. Funny how it's usually this time of year that it happens: back to school leads to Thanksgiving leads to Black Friday, and so on, all the way to the holiday season. This year we have consoles to deal with, so better to get out in front of all that.

Bet you that AMD wishes they were launching right now. But no, their board design was late, drivers are late, the launch is late, and there are no AIB partners. That smells of poor planning and execution. That said, the advantage they have is pricing, but the later they leave it, the more damage is done.
 
  • Like
Reactions: DXDiag

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
If the output of the PSU is rated at 21.5A on that rail, it doesn't matter what the PSU efficiency is; there's a rated 258 W available to the motherboard.

Even if we go with that 258 W number, the max power draw on the Series X SoC is expected to be around 70-75% of the maximum available power. These PSUs are never loaded up to their max rated limit, to account for degradation in rated wattage over the expected lifespan; these consoles are expected to last 7-10 years.


The Xbox One X drew a max of 180 W (172 W in Gears of War) with a 245 W rated PSU. I would say 220 W is a worst-case scenario for the Series X mainboard, more realistically 200-210 W.
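A minimal sketch of the loading argument, using only the numbers quoted above (the 70-75% window and the One X figures); it is an illustration of the reasoning, not a measurement:

```python
# Loading argument from the post above, using its own numbers.

rail_w = 12 * 21.5                       # 258 W rated on the +12V 21.5A rail

soc_low, soc_high = 0.70 * rail_w, 0.75 * rail_w
print(round(soc_low), round(soc_high))   # ~181-194 W expected SoC draw window

one_x_ratio = 180 / 245                  # One X peak draw vs. its rated PSU
print(round(one_x_ratio, 2))             # ~0.73 -> the One X never loaded its PSU past ~73%
```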
 
Last edited:

FaaR

Golden Member
Dec 28, 2007
1,056
412
136
Why not??? You know something we don't??
Sony is clocking the hell out of their GPU, right up to the bleeding edge of reliability, according to Cerny. That's going to bump power draw significantly; I'm really curious how much it'll end up being. The PS5 is a very large console from what we've seen; it could be they needed to engineer a big, sophisticated cooling system to keep the noise level down while still dissipating a significant wattage. Sort of like the launch PS3 units.

90% PSU efficiency and assuming a 90% load (worst case) gives 206 W.
PSU (in)efficiency goes on top of rated output. It doesn't subtract from it... :)
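In other words, the efficiency loss is paid at the wall rather than taken out of the rated rail. A minimal sketch with an assumed 90% efficiency (illustrative only):

```python
# Efficiency applies to wall draw, not to the rated DC output of the rail.

dc_output_w = 12 * 21.5        # 258 W rated on the +12V 21.5A rail
efficiency  = 0.90             # assumed PSU efficiency, purely illustrative

wall_draw_w = dc_output_w / efficiency
print(round(wall_draw_w))      # ~287 W drawn from the wall at full rail load
```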
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Bet you that AMD wishes they were launching right now. But no, their board design was late, drivers are late, the launch is late, and there are no AIB partners. That smells of poor planning and execution. That said, the advantage they have is pricing, but the later they leave it, the more damage is done.
Are you sure about that?

Then why is Nvidia RUSHING the launch of next-gen GPUs? There is ONE reason why we are seeing those GPUs so early.

Ask yourself, why is Nvidia rushing this launch, if AMD has inferior products? Do they have inferior products?
 

Konan

Senior member
Jul 28, 2017
360
291
106
Are you sure about that?

Then why is Nvidia RUSHING the launch of next-gen GPUs? There is ONE reason why we are seeing those GPUs so early.

Ask yourself, why is Nvidia rushing this launch, if AMD has inferior products? Do they have inferior products?

I don’t think that they are worried about AMD as much as you think they are. You’re making some massive assumptions to fit your conscious narrative.

I believe that they will not have inferior products. I think that they will be competitive and have a great looking stack. It will be progress over their last iteration, but it won’t be perfect and that’s ok.

I think that they will have inferior software. I also think that if they could choose to launch with AIB partners they would.

Both companies are dealing with global issues across their lines of business. The sooner the better.

I don't believe for a second that the '21 days, 21st year' timing is a coincidence; that stuff is planned out months prior (once the product is cooked).
 

Konan

Senior member
Jul 28, 2017
360
291
106
There's been a record 2 years without a new top performing card. I haven't had to wait this long for 10+ years. That's certainly not a rush. I'm sure they would have loved to earn more money selling an upgrade to 2080Ti 6 or 12 months ago.
Exactly, totally agree.

There are way more logical reasons than "rushing to launch because they're scared of AMD".
 

Karnak

Senior member
Jan 5, 2017
399
767
136
I don’t think that they are worried about AMD as much as you think they are. You’re making some massive assumptions to fit your conscious narrative.
There's a reason why you will see SKUs with double the amount of VRAM that was originally planned, and why there will be a 24 GiB SKU which is not a Titan like the Titan RTX but rather the new 2080 Ti, with 13 GiB more VRAM. Hint: it's not because Nvidia loves their customers so much.
 

Konan

Senior member
Jul 28, 2017
360
291
106
There's a reason why you will see SKUs with double the amount of VRAM that was originally planned, and why there will be a 24 GiB SKU which is not a Titan like the Titan RTX but rather the new 2080 Ti, with 13 GiB more VRAM. Hint: it's not because Nvidia loves their customers so much.

How do you know what was originally planned? Someone’s opinion? Somebody’s leak? Do you work for Nvidia?

If what you are saying is true, that is down to competition, not timing. The only timing factor is the ability to be nimble and adapt quickly to market environments.

As far as I'm aware, there was a 24 GB card at the high end that was always leaked as planned. That was the 3090.
The only source for a 20 GB SKU was wccftech, with kopite7kimi saying it was possible as a new SKU.

Edit: seems the 20 GB card rumour has legs today via VideoCardz and Chiphell.
 
Last edited:

JasonLD

Senior member
Aug 22, 2017
487
447
136
Are you sure about that?

Then why is Nvidia RUSHING the launch of next-gen GPUs? There is ONE reason why we are seeing those GPUs so early.

Ask yourself, why is Nvidia rushing this launch, if AMD has inferior products? Do they have inferior products?

Almost 2 years since the first Turing GPUs came out, and the Turing refresh came out in July of last year. I don't see anything more than Nvidia sticking to their usual launch schedule.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Hint: It's not because Nvidia loves their customers so much.

"NVidia killer" or bust. 20% faster across the board for AMD, is the only option, all of their credibility is tied up on this point. Every bit of hype, all of the viral efforts, everything comes down to this. The credibility of all their hard-working .... fans.... is all tied up on this point.

Besides, we know nVidia doesn't have any consumer cards coming this year, they don't have any consumer Ampere coming at all, and if they did it would need a six hundred watt power connector.

The time is close, we'll see how well everyone speculated.
 

Bouowmx

Golden Member
Nov 13, 2016
1,147
551
146
So much edging and denial around the next generation of graphics cards: a very high level of hype for NVIDIA to move past 16/12 nm, and for AMD (Big Navi) to properly compete with NVIDIA's best. Hype since 2018.