'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

Saylick

Diamond Member
Sep 10, 2012
3,532
7,859
136

Because there might be AIB/OEM models with trash tier bin silicon.
Seems like a whole lotta variability to me if there's trash-tier silicon that barely musters the advertised boost of ~1700 MHz, alongside the possibility of silicon that boosts above 2000 MHz out of the box without an OC. I now see why AMD started advertising a Game Clock: it's easier to understand what clocks the consumer can typically expect.
 
  • Like
Reactions: Gideon

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Gainward specs have leaked...

RTX 3090 Phoenix
  • CUDA cores: 5248
  • Clock speed: 1695 MHz (boost)
  • Memory: 24GB GDDR6X
  • Memory clock: 9750 MHz
  • Bandwidth: 936 GB/s
  • PCIe: Gen 4
  • Max power consumption: 350W
  • Output: HDMI 2.1, DisplayPort 1.4a
RTX 3080 Phoenix
  • CUDA cores: 4352
  • Clock speed: 1710 MHz (boost)
  • Memory: 10GB GDDR6X
  • Memory clock: 9500 MHz
  • Bandwidth: 760 GB/s
  • PCIe: Gen 4
  • Max power consumption: 320W
  • Output: HDMI 2.1, DisplayPort 1.4a
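
For anyone sanity-checking the leak, the bandwidth figures line up if you assume the widely rumored 384-bit (3090) and 320-bit (3080) buses, which aren't stated in the leak itself. A quick sketch:

```python
# Sanity check of the leaked bandwidth figures. The bus widths are NOT in
# the leak; 384-bit (3090) and 320-bit (3080) are the rumored values.
def bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    # Per the leaked figures, the effective data rate works out to
    # 2x the listed memory clock (Gbps per pin).
    gbps_per_pin = mem_clock_mhz * 2 / 1000
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gb_s(9750, 384))  # 936.0 -> matches the 3090's 936 GB/s
print(bandwidth_gb_s(9500, 320))  # 760.0 -> matches the 3080's 760 GB/s
```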

I'm seriously LMAOing at that 3090 GPU memory. 24GB on a single card, with 350W....
I'm also wondering when we'll start seeing more DP 1.4a monitors.... we are given so many DP ports, yet almost none of them can do HDR over DP unless it's 1.4.

Then the RTX 3070 has 8GB.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I'm still surprised at the mere 30W TDP difference between the two SKUs, even though they're clocked about the same, yet the 3090 has 20% more cores and more memory chips. I feel like the real TDP of the 3090 has to be closer to 375W...

nVidia is known for not listing total power consumption correctly; it comes down to the difference between TDP and TBP. nVidia typically quotes TDP, which is the power consumed by the GPU itself, excluding memory/board losses. The 2080 Ti, for instance, consistently uses 30-40W more than nVidia claims while gaming. TPU measured the Founders Edition (250W TDP) at 273W average gaming, and 289W under peak gaming.

If nVidia is claiming a 350W TDP for the 3090, people should expect it to draw close to 400W.
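
Back-of-the-envelope, scaling that 2080 Ti gap up to a 350W claim (assuming the gap scales proportionally rather than being a fixed board overhead):

```python
# Scale the 2080 Ti FE claimed-vs-measured gap (TPU numbers above) to 350W.
# Assumption: the gap grows proportionally; it could also be a fixed offset.
tdp_claimed = 250    # W, 2080 Ti FE spec
avg_measured = 273   # W, TPU average gaming
peak_measured = 289  # W, TPU peak gaming

print(350 * avg_measured / tdp_claimed)   # ~382 W average gaming
print(350 * peak_measured / tdp_claimed)  # ~405 W peak gaming
```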
 

CP5670

Diamond Member
Jun 24, 2004
5,539
613
126
I'm seriously LMAOing at that 3090 GPU memory. 24GB on a single card, with 350W....
I'm also wondering when we'll start seeing more DP 1.4a monitors.... we are given so many DP ports, yet almost none of them can do HDR over DP unless it's 1.4.

On the other hand, 10GB for the 3080 is a little disappointing. That would actually be a downgrade for 1080 Ti owners, even though it's probably not noticeable in practice.
 

Gideon

Golden Member
Nov 27, 2007
1,774
4,145
136
On the other hand, 10GB for the 3080 is a little disappointing. That would actually be a downgrade for 1080 Ti owners, even though it's probably not noticeable in practice.
Rumor is, there will be 20GB cards, just not at launch. I definitely wouldn't want to go lower than 12GB, or 16GB if possible, with the new consoles and their ultra-fast SSD streaming (not available yet on PC) coming up. Hopefully the 3070 also gets a 16GB option, or Big Navi will, if it's competitive.
 
  • Like
Reactions: psolord

kurosaki

Senior member
Feb 7, 2019
258
250
86
TAA is a lossy technique with a typical "smearing" effect, especially around edges and high-contrast areas. DLSS tries to take a low-res picture, upscale it, and bump up the local contrast in those same areas. It's like gaming through a bad Instagram filter. Some may not notice it; good for you. Some can't unsee what they have seen, and therefore image-distorting techniques like these won't be turned on.
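
For anyone who hasn't noticed the artifact: here's a toy sketch of why a naive temporal blend smears. It's illustrative only, not NVIDIA's actual TAA/DLSS code.

```python
import numpy as np

# A naive exponential blend keeps a decaying trail of old frames behind a
# moving bright pixel. Real TAA (and DLSS) add motion reprojection and
# history clamping to fight exactly this failure mode.
def taa_blend(history, current, alpha=0.1):
    # alpha = weight of the new frame; 90% of each pixel comes from history
    return alpha * current + (1.0 - alpha) * history

history = np.zeros(8)
for i in range(8):                    # a single bright "pixel" sliding right
    frame = np.zeros(8)
    frame[i] = 1.0
    history = taa_blend(history, frame)
print(np.round(history, 3))           # ghost trail behind the pixel's last position
```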
 

HurleyBird

Platinum Member
Apr 22, 2003
2,760
1,455
136
The back picture puts to rest the stupid co-processor theory.

SOOO stupid. That weird claim that the cooler design was somehow meant for cooling the back of the PCB (it isn't, and doesn't), and that therefore there must be a co-processor back there. It was so mind-numbingly dumb that I had a passing thought that CoreTeks might have had a real source who told him about a co-processor, and had just obfuscated it by shovelling a heap of fake speculation on top. I guess not, though.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Jesus, 350W power consumption... and people gave AMD a hard time for the 275W Fury X.

I know! My only really rational thought about it is that a chip with that level of TDP is basically made viable by its primary use as a workstation card, with a huge gaming GPU as a secondary effect.

Or maybe enough people really don't care. We'll see soon enough.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
Nearly 400W dumped into the case is big trouble. This might be the generation where the reference card is a smarter choice than the 3rd-party ones.

If NV's cooling exhausts a large percentage of the heat via the backplate, and does so at reasonable noise levels, I don't really care about power as long as the performance lead is there.

What I'm worried about is 3rd-party cards that are "cool" by virtue of having three large fans and massive heatsinks, but dump 100% of the heat into the case...
 

amenx

Diamond Member
Dec 17, 2004
4,107
2,379
136
Jesus, 350W power consumption... and people gave AMD a hard time for the 275W Fury X.
I don't think power draw is an issue if the card delivers performance commensurate with it. Vega 64 vs. the GTX 1080 (100W less) for roughly the same performance is a good example. That's what AMD got flak for.
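
Rough perf-per-watt math on that example, taking the post's ~100W gap at face value (180W is the 1080's official board power; 280W for Vega 64 is illustrative, the reference spec is 295W):

```python
# Equal performance, ~100W power gap, per the post above.
perf = 1.0                        # normalized: roughly the same performance
gtx_1080_w, vega_64_w = 180, 280  # board power in W (Vega figure illustrative)

print((perf / gtx_1080_w) / (perf / vega_64_w))  # ~1.56x better perf/W for the 1080
```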
 
  • Like
Reactions: lightmanek