Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
293
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Are any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 GB for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, or at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for; I'm just interested in the forum members' thoughts.
 

MrTeal

Diamond Member
Dec 7, 2003
3,917
2,704
136
As far as I know, none of the DC projects require very large amounts of VRAM, but correct me if I'm wrong.

Between the two options, I would say that if PCIe slot space isn't an issue, then 2x 3080s would probably be the better buy, but if slot space is an issue, then the 3090 is probably better. Once full system power is taken into account, the 3090 will probably be at best a wash in efficiency, and even if it is a slight efficiency win, it comes at much lower performance; I doubt the efficiency difference over the lifetime of the cards would make up for the big performance gap.
It will also depend on what you're using for a motherboard. With a 3090, you're likely limited to 2 cards unless you use water cooling or risers, given the triple-slot cooler. You could put 4 3080s into a HEDT system with four double-spaced x16 slots.
 

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
Not that I can easily think of how the Steam survey results could be systematically biased when used to measure GPU market share.

I can. People who prefer open tech over proprietary tech are more likely to own an AMD GPU. If something like open tech even interests you, you are most likely more technically aware and think harder about freely sharing your data, regardless of the disclaimers made. Hence, if you own an AMD card you are more likely to decline the survey than some random Intel HD 4000 user. Even if the effect is small, over the millions of people surveyed it will have an effect.
Not saying it's true, but it's easy to come up with plausible explanations.
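To make the selection-bias argument concrete, here's a toy opt-in simulation. The population share and the acceptance rates are completely made up; the only point is that a gap in opt-in rates between the two groups shifts the measured share.

```python
import random

random.seed(42)

# Hypothetical population: true AMD share of 30%. The opt-in acceptance
# rates below are invented purely for illustration.
N = 200_000
TRUE_AMD_SHARE = 0.30
ACCEPT_RATE = {"AMD": 0.40, "Nvidia": 0.50}

responses = []
for _ in range(N):
    gpu = "AMD" if random.random() < TRUE_AMD_SHARE else "Nvidia"
    if random.random() < ACCEPT_RATE[gpu]:  # user accepts the survey prompt
        responses.append(gpu)

measured = responses.count("AMD") / len(responses)
expected = 0.30 * 0.40 / (0.30 * 0.40 + 0.70 * 0.50)  # ~25.5% analytically
print(f"true AMD share: 30.0%, measured in survey: {measured:.1%}")
```

Even this modest 10-point gap in acceptance rates moves the measured share from 30% down to roughly 25%, which is the whole shape of the objection.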
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I can. People who prefer open tech over proprietary tech are more likely to own an AMD GPU. If something like open tech even interests you, you are most likely more technically aware and think harder about freely sharing your data, regardless of the disclaimers made. Hence, if you own an AMD card you are more likely to decline the survey than some random Intel HD 4000 user. Even if the effect is small, over the millions of people surveyed it will have an effect.
Not saying it's true, but it's easy to come up with plausible explanations.
But you have no evidence of that. I can't believe AMD fans don't want their cards recorded by Steam. Look at any "what GPU do you want/own" poll that runs fairly regularly on news sites: AMD does just fine there. In fact, I would say better than it should, given the actual distribution of GPU sales reported by Jon Peddie Research, Steam, etc.
 

Shivansps

Diamond Member
Sep 11, 2013
3,917
1,570
136
Not this again. I've had enough and I'm going to say it: people complaining about the Steam survey are doing so because it doesn't match what they believe is correct, or they are deliberately dismissing it based only on their opinion.
Just like some years ago, when some people attacked Cinebench results and others defended them; now it has reversed.

I can. People who prefer open tech over proprietary tech are more likely to own an AMD GPU. If something like open tech even interests you, you are most likely more technically aware and think harder about freely sharing your data, regardless of the disclaimers made. Hence, if you own an AMD card you are more likely to decline the survey than some random Intel HD 4000 user. Even if the effect is small, over the millions of people surveyed it will have an effect.
Not saying it's true, but it's easy to come up with plausible explanations.

Not only does that make zero sense, it is not true: the RX 480 is the most-used GPU on Linux in the Steam survey.
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
There are obviously a few issues with the Steam survey. For example, how does it handle Steam being installed on multiple machines? I've had the survey pop up 4-5 times on my laptop (with an Nvidia GPU), while it only popped up once on my desktop PC with a 5700 XT. So users with Steam installed on multiple machines might bias the survey (as Nvidia totally owns the laptop market, for example), and AMD also blamed "internet cafes" a few years ago. However, IMO, even if the numbers are inaccurate (and looking at the numbers I'm pretty sure there's some inherent bias toward Nvidia), I would argue that in essence the picture is accurate: Nvidia totally owns the GPU market. It might be 70-30 instead of 80-20, but I'm pretty sure Nvidia is dominating the market.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
So, back to video card discussion...

The updated drivers from nVidia seem to have greatly increased card reliability at the cost of performance (which I speculated about a few pages back). I have not seen any benchmarks yet, but users are reporting a 20-30 MHz drop in boost clocks, which means the initial reviews show performance that is no longer entirely accurate.

 

Heartbreaker

Diamond Member
Apr 3, 2006
5,034
6,609
136
So, back to video card discussion...

The updated drivers from nVidia seem to have greatly increased card reliability at the cost of performance (which I speculated about a few pages back). I have not seen any benchmarks yet, but users are reporting a 20-30 MHz drop in boost clocks, which means the initial reviews show performance that is no longer entirely accurate.

https://videocardz.com/newz/nvidia-...s-report-fewer-crashes-after-updating-drivers

That's ~1%-1.5% difference in clock speed. Margin of error stuff.
 

sze5003

Lifer
Aug 18, 2012
14,304
675
126
So, back to video card discussion...

The updated drivers from nVidia seem to have greatly increased card reliability at the cost of performance (which I speculated about a few pages back). I have not seen any benchmarks yet, but users are reporting a 20-30 MHz drop in boost clocks, which means the initial reviews show performance that is no longer entirely accurate.

Figured they would do this, as it's the cheapest option. I am still waiting for a 3080 with a bit more RAM.

Do you have a list of the known vendors that used good caps? I think Asus is one of them. Just in case a 20 GB model never comes out and I then just have to go for a 3080.
 

Det0x

Golden Member
Sep 11, 2014
1,461
4,980
136
I can't believe you guys are still discussing this Steam hardware survey debacle; I thought this had been explained/debunked years ago.


The reason the Steam hardware survey went nuts back in 2017 is that Steam was over-counting every single individual login at an iCafe as another instance of that computer’s system configuration. Imagine if you had 10 of your best friends over to play Steam games on your PC. Every single one of them logged in to their own accounts — which led to 10 copies of your system config being uploaded to the platform and counted as separate submissions.

That’s basically what happened with Steam. And according to AMD, while the company made some corrections to its data, Valve has never been particularly concerned with making sure its numbers track real-life market share. AMD, meanwhile, is drastically underrepresented in iCafe gaming.

[Image: AMD-CPU-Share.png (chart of AMD CPU share on Steam)]

To bolster his point, Herkelman released images like the above, showing how AMD CPU adoption rates changed dramatically when Steam added PUBG in China, then changed again afterward when the company re-updated its algorithms. In both cases, AMD’s market share was lower as a result of the update.

The error involves Steam counting every individual login at internet cafes (which are particularly popular in Asian countries) as a separate PC configuration. So, if 10 people log into Steam on a single Intel-powered PC at an internet café, and each opts in to the Steam Hardware Survey, then the survey will count it as 10 separate machines.

It’s the only data set of its type available publicly. But this could explain why AMD’s overall CPU market share numbers have been ticking up in other reports but have remained fairly static on Steam. If Chinese iCafe installations grow more quickly than other types of gaming and AMD isn’t represented in that market, it’s not going to appear to gain much market share in either CPU or GPUs. We’ve mostly used the SHS to compare generational adoption for Turing versus Pascal, which should be less-impacted. But if the figures for AMD adoption are incorrect, the figures for at least some Nvidia SKUs will be as well.
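A quick sketch of the counting bug described above: counting per login versus deduplicating per machine gives very different shares. The machine IDs and CPU brands below are invented for illustration.

```python
from collections import Counter

# Toy model of the 2017 iCafe bug: each login on a shared machine was
# counted as a separate survey submission.
sessions = (
    [("cafe-pc-1", "Intel")] * 10   # 10 different accounts on one cafe PC
    + [("home-pc-1", "AMD")]        # single-owner machines
    + [("home-pc-2", "AMD")]
    + [("home-pc-3", "Intel")]
)

def share(counts):
    total = sum(counts.values())
    return {brand: n / total for brand, n in counts.items()}

by_login = Counter(cpu for _, cpu in sessions)          # buggy: per login
by_machine = Counter(dict(sessions).values())           # deduplicated per machine

print("per-login AMD share:  ", share(by_login)["AMD"])
print("per-machine AMD share:", share(by_machine)["AMD"])
```

With one busy cafe PC in the mix, AMD's share drops from 50% of machines to about 15% of submissions, which is exactly the kind of distortion AMD complained about.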
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I can't believe you guys are still discussing this Steam hardware survey debacle; I thought this had been explained/debunked years ago.

Actually we had moved on, and you just brought it back up :(
 

Heartbreaker

Diamond Member
Apr 3, 2006
5,034
6,609
136
It’s the only data set of its type available publicly. But this could explain why AMD’s overall CPU market share numbers have been ticking up in other reports but have remained fairly static on Steam. If Chinese iCafe installations grow more quickly than other types of gaming and AMD isn’t represented in that market, it’s not going to appear to gain much market share in either CPU or GPUs. We’ve mostly used the SHS to compare generational adoption for Turing versus Pascal, which should be less-impacted. But if the figures for AMD adoption are incorrect, the figures for at least some Nvidia SKUs will be as well.

Yes, there was a problem with internet cafes, fixed 3 years ago.

AMD CPU share on Steam is not static. It increases every month.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Not sure if this was brought up already, but Hardware Unboxed mentioned in one of their videos that the crashing issue wasn't exclusive to the cards with the POSCAPs, as their Asus TUF and FE cards experienced the same problems.

Has any other reviewer mentioned this?
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Not sure if this was brought up already, but Hardware Unboxed mentioned in one of their videos that the crashing issue wasn't exclusive to the cards with the POSCAPs, as their Asus TUF and FE cards experienced the same problems.

Has any other reviewer mentioned this?

LTT had an issue with, I think, their ASUS or FE card also. But it turned out to be the power supply they were using, which, according to nVidia, was large enough for the task but ran into issues at very specific load levels.
 

Bouowmx

Golden Member
Nov 13, 2016
1,150
553
146
It's WCCFTech, but they tested the GeForce RTX 3080 undervolt with the NVIDIA PCAT tool (not relying on software power readings).

Results in Forza Horizon 4, 4K Ultra:
Stock ~ 1955 MHz:
143 fps, 310 W

862 mV ~ 1905 MHz:
139 fps, 230 W
97% performance for 74% power

806 mV ~ 1815 MHz:
134 fps, 201 W
94% performance for 65% power

Also included for reference: 2080 Ti, but it was not undervolted.

[Image: all-3080-overall.png (overall undervolt results chart)]
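For anyone who wants to sanity-check the percentages quoted above, they fall straight out of the fps and wattage numbers:

```python
# Recompute the efficiency ratios from the WCCFTech figures quoted above.
results = {
    "stock 1955 MHz":  (143, 310),  # (fps, watts)
    "862 mV 1905 MHz": (139, 230),
    "806 mV 1815 MHz": (134, 201),
}

base_fps, base_w = results["stock 1955 MHz"]
ratios = {
    name: (fps / base_fps, watts / base_w, fps / watts)
    for name, (fps, watts) in results.items()
}
for name, (perf, power, eff) in ratios.items():
    print(f"{name}: {perf:.0%} performance at {power:.0%} power ({eff:.3f} fps/W)")
```

The 862 mV point works out to ~97% performance at ~74% power, and the 806 mV point to ~94% at ~65%, matching the summary lines in the post.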
 

MrTeal

Diamond Member
Dec 7, 2003
3,917
2,704
136
That's ~1%-1.5% difference in clock speed. Margin of error stuff.
While it is a small difference, it's not really margin of error stuff when it shifts the whole average downward.

It's WCCFTech, but they tested the GeForce RTX 3080 undervolt with the NVIDIA PCAT tool (not relying on software power readings).

Results in Forza Horizon 4, 4K Ultra:
Stock ~ 1955 MHz:
143 fps, 310 W

862 mV ~ 1905 MHz:
139 fps, 230 W
97% performance for 74% power

806 mV ~ 1815 MHz:
134 fps, 201 W
94% performance for 65% power
It'll be interesting to see actual card power measured externally and across a sample of cards, but that's a pretty amazing drop. I would imagine one of the better coolers, like the ASUS TUF that already runs cool at 35 dB with stock settings, would be almost inaudible over background noise at 200 W.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Nice to see a sample, of course, but it's very hard to believe that those undervolts can be achieved universally. It's such a huge drop.

If they could be, they'd surely have shipped at the lower TDP and left the overclocking for the AIB cards.

I can definitely believe that 250 W was their original design target. They might hit it on a refresh.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
It's WCCFTech, but they tested the GeForce RTX 3080 undervolt with the NVIDIA PCAT tool (not relying on software power readings).

Results in Forza Horizon 4, 4K Ultra:
Stock ~ 1955 MHz:
143 fps, 310 W

862 mV ~ 1905 MHz:
139 fps, 230 W
97% performance for 74% power

806 mV ~ 1815 MHz:
134 fps, 201 W
94% performance for 65% power

Also included for reference: 2080 Ti, but it was not undervolted.


I plan on undervolting my 3080. Should have it next week if Amazon is accurate with the current delivery date.
 

Hitman928

Diamond Member
Apr 15, 2012
6,648
12,273
136
Nice to see a sample, of course, but it's very hard to believe that those undervolts can be achieved universally. It's such a huge drop.

If they could be, they'd surely have shipped at the lower TDP and left the overclocking for the AIB cards.

I can definitely believe that 250 W was their original design target. They might hit it on a refresh.

My problem with the undervolting claims is that, in my experience, it doesn't pan out as advertised. At least that was my experience on AMD cards when undervolting was all the rage. I tried it on multiple 480s and on Vega. I was able to undervolt all of them, but none of them undervolted as much as what I'd seen posted on the internet without running into stability issues, and one of the 480s barely undervolted at all without major stability issues; I'm talking less than a 5% undervolt. My best undervolting card was probably the Vega 56, which I was able to get a decent undervolt on, but again, not nearly as low as I had seen others reportedly get. I could go that low and be stable in many games for hours, but then certain games would crash after an hour or two of gaming. If I only played those stable games, I'd never even know the undervolt wasn't fully stable.

Long story short, AMD, and I'm sure Nvidia, set the voltages where they are for a reason: most likely because they need to in order to guarantee stability while yielding as many dies as possible. And even when reviewers test undervolt settings, unless they are checking for stability across many games for hours on each game, I just don't trust that the settings are actually stable, just like I don't trust reviewer CPU overclock stability, as they typically don't really check long-term stability due to time constraints.
 

Mopetar

Diamond Member
Jan 31, 2011
8,447
7,647
136
Nice to see a sample, of course, but it's very hard to believe that those undervolts can be achieved universally. It's such a huge drop.

If they could be, they'd surely have shipped at the lower TDP and left the overclocking for the AIB cards.

I can definitely believe that 250 W was their original design target. They might hit it on a refresh.

It's not that hard to believe and reminds me a lot of what we saw with AMD cards when Polaris/Vega first came out. You could give them a fairly substantial undervolt and save a lot of power without losing that much performance. Some people even saw a performance increase from a slight undervolt, which sounds absurd the first time you hear it, but was borne out by multiple sources.

I think NVidia just ran into the same sort of problem where the architecture just can't go any further on the current process without just throwing a lot of power at it for highly diminished returns. Would anyone have seriously cared if a 3080 only had 97% of the current performance if it came in at 230W? Releasing such a card with a lower power limit might have at least let people hold out hope for more powerful SUPER refresh with higher clock speeds and power limits later on, but what we have makes that prospect look doubtful unless the Samsung process sees considerable improvements.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
My problem with the undervolting claims is that, in my experience, it doesn't pan out as advertised. At least that was my experience on AMD cards when undervolting was all the rage. I tried it on multiple 480s and on Vega. I was able to undervolt all of them, but none of them undervolted as much as what I'd seen posted on the internet without running into stability issues, and one of the 480s barely undervolted at all without major stability issues; I'm talking less than a 5% undervolt. My best undervolting card was probably the Vega 56, which I was able to get a decent undervolt on, but again, not nearly as low as I had seen others reportedly get. I could go that low and be stable in many games for hours, but then certain games would crash after an hour or two of gaming. If I only played those stable games, I'd never even know the undervolt wasn't fully stable.

Long story short, AMD, and I'm sure Nvidia, set the voltages where they are for a reason: most likely because they need to in order to guarantee stability while yielding as many dies as possible. And even when reviewers test undervolt settings, unless they are checking for stability across many games for hours on each game, I just don't trust that the settings are actually stable, just like I don't trust reviewer CPU overclock stability, as they typically don't really check long-term stability due to time constraints.

Yeah, it's very hit and miss. AMD (or in this case nVidia) sets the voltages that high for a reason, which typically is to have more chips pass validation.

With my RX 480, I got really lucky. I got the card on release day of the Nitro+ and was able to drop 100 mV off it, and I ran it for years with no issues. It made a noticeable difference in how loud the fans got. But I have tried to help others, and some of them could not lower the voltage at all without crashing issues.

I really question the stability of the 3080 claims, though, because those wattage drops are gigantic.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,053
3,537
126
Every review I am reading is sandbagging the 3090.
Every other review I am reading says a 3080 20 GB is in the works.

Every review I read is killing me, because it says I should WAIT for more news on the 20 GB model, since the 3090 is a box of fail for a gamer rather than a developer.
 

coercitiv

Diamond Member
Jan 24, 2014
7,242
17,050
136
My problem with the undervolting claims is that, in my experience, it doesn't pan out as advertised. At least that was my experience on AMD cards when undervolting was all the rage. I tried it on multiple 480s and on Vega. I was able to undervolt all of them, but none of them undervolted as much as what I'd seen posted on the internet without running into stability issues, and one of the 480s barely undervolted at all without major stability issues; I'm talking less than a 5% undervolt. My best undervolting card was probably the Vega 56, which I was able to get a decent undervolt on, but again, not nearly as low as I had seen others reportedly get. I could go that low and be stable in many games for hours, but then certain games would crash after an hour or two of gaming. If I only played those stable games, I'd never even know the undervolt wasn't fully stable.

Long story short, AMD, and I'm sure Nvidia, set the voltages where they are for a reason: most likely because they need to in order to guarantee stability while yielding as many dies as possible. And even when reviewers test undervolt settings, unless they are checking for stability across many games for hours on each game, I just don't trust that the settings are actually stable, just like I don't trust reviewer CPU overclock stability, as they typically don't really check long-term stability due to time constraints.
Had the opportunity to undervolt one RX 560, two RX 580s, and two Vega 56s. Each card was different. The small Polaris was spectacular in a sense: when I tried underclocking and undervolting, I quickly reached a point where the fans would shut down occasionally even under load (Asus Strix model, a good cooler with heatpipes). One of the Vegas got mixed results, since the memory didn't meet the "internet" goal for overclocking and undervolting. The other card, which I'm still using right now, runs with a very aggressive undervolt on both core (underclocked) and memory (overclocked). The second Vega is rock stable; the other one was always hit and miss under similar undervolting conditions.

Bottom line is what you described: undervolting is not a guarantee. Reviewers who show their cards sipping power can only help us estimate variability; there will always be chips that need to be closer to stock voltage to run stable.

The "good news" for Ampere is that this undervolting potential should always be there with a small compromise in clocks. With cards that are very efficient at stock, undervolting is hit or miss, since you need to win the chip lottery. With cards that are pushed for those last 3-5% in performance, walking back down the frequency/voltage curve provides hefty power savings with minimal performance loss. But even for this we would need to see stock voltage/frequency curves, preferably after the driver fix.
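As a rough illustration of why walking back the frequency/voltage curve pays off so well: dynamic power scales roughly as f·V² to first order. The clocks and voltages below are hypothetical, not measured Ampere values.

```python
# First-order CMOS dynamic power model: P ≈ C · f · V^2 (constants folded in).
def relative_power(f_mhz, v, f0_mhz, v0):
    """Dynamic power relative to a (f0, v0) baseline."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

# e.g. a hypothetical stock point of ~1950 MHz at ~1.00 V versus an
# undervolt to ~1850 MHz at ~0.85 V
p = relative_power(1850, 0.85, 1950, 1.00)
print(f"~{1850/1950:.0%} of stock clocks at ~{p:.0%} of stock dynamic power")
```

A ~5% clock loss paired with a ~15% voltage drop lands near 70% of stock dynamic power, which is the same shape as the undervolting results posted earlier in the thread.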
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
As far as I know, none of the DC projects require very large amounts of VRAM, but correct me if I'm wrong.

You are correct. I've never seen a DC task take up much VRAM. They're deliberately designed that way so they can be widely distributed across many different platforms.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
I really question the stability of the 3080 claims, though, because those wattage drops are gigantic.

Stability is also helped by better thermals and smaller voltage transients. NV simply pushed these cards too far: they require ~0.875 V for 1900 MHz, yet get pushed to 2000 MHz boost clocks at voltages above 1 V.

But stability could come into question in DL workloads; I wonder if those cards can really do nearly 30 TF at 225 W. Looking forward to testing one, as an undervolted 1080 Ti was doing 10 TF at the same wattage. An insane efficiency gain versus Pascal, if true.
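For reference, the ~30 TF and ~10 TF figures can be sanity-checked from the published CUDA core counts and ballpark boost clocks (2 FLOPs per core per cycle for an FP32 FMA); the clocks used here are approximations, not measured values.

```python
# Rough FP32 throughput estimate: 2 FLOP per FMA per CUDA core per cycle.
def fp32_tflops(cuda_cores, clock_ghz):
    return 2 * cuda_cores * clock_ghz / 1000

print(f"RTX 3080:    ~{fp32_tflops(8704, 1.71):.1f} TFLOPS")
print(f"GTX 1080 Ti: ~{fp32_tflops(3584, 1.58):.1f} TFLOPS")
```

That puts the 3080 near 30 TF at rated boost, and a 1080 Ti slightly above 11 TF, so an undervolted 1080 Ti running below rated boost doing ~10 TF lines up with the comparison.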