Question: 'Ampere' / Next-gen gaming uarch speculation thread

Page 184

Ottonomous

Senior member
May 15, 2014
559
292
136
How much gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 4K60, ideally 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, just interested in the forum members' thoughts.
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
With the stock MSI BIOS the core clocks would be between 1785 and 2040 MHz because it's bouncing off the power limit. Now with the STRIX BIOS it stays between 1950 and 2040 MHz.

Which STRIX BIOS did you flash? I see two different versions listed on TechPowerUp.

94.02.26.C0.13

94.02.26.C0.16

Both have the same date and clock rates.
 

CP5670

Diamond Member
Jun 24, 2004
5,511
588
126
Got the STRIX BIOS working on my 3090 MSI Gaming X Trio. Did a few hours of stress testing with it. +100 voltage, +123 Power, +75 Core, and +650 memory.

Have you tried using 4K at 120Hz with G-Sync on that TV? The subsampling issues are fixed now, but there are apparently still some issues with G-Sync at high framerates; not sure if it's a problem on the Nvidia or LG side.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
I think Nvidia knows AMD is going to make both of these cards look ridiculous, so they are holding back the big volume for the 20GB cards. I'd expect them to beat AMD to the punch with a 20GB release or come right after with a price cut or something. I think RDNA2 does have something to do with the major supply "issues".

What good is a 20 GB card really going to do? At best it just eases the concerns of anyone who was worried that 10 GB wouldn't be enough for the long term. There's a limit to how much more they can charge for that, because unless there's a game that is already hitting a VRAM limit, the frame rate won't improve at all and the value per dollar starts diminishing fast.

We also know that the cards are at the limits of how far the clocks can be pushed, so don't expect any improvements there unless all of the initial cards are the worst silicon they had. I can't see Nvidia pulling a move like that, because there aren't many ways you could piss off your most loyal and biggest customers more than something like that. Any 20 GB 3080 either has the same number of cores, or doesn't get much of a boost from any extra ones that are enabled, because we know how the 3090 performs and can infer the performance gains any extra cores on a 3080 would net.

If AMD beats Nvidia with what they have on offer now, they don't have anything else to counter with, at least not in a meaningful way. Sure, they could always make something on TSMC 7nm by diverting wafers from GA100, but that's just eating into a high-margin product in order to offer a limited supply of a lower-margin product that will stop anyone from wanting to buy your existing products on Samsung 8nm. Even if they would go that route, they're at least six months away from being able to launch such a product and couldn't tell anyone about it for fear of Osborne-ing their existing cards.

Even if AMD weren't going to launch until February, I still think it was a good idea to hold off on the 3070 launch so that retailers have more cards in stock at launch. People will be quicker to forget about the limited stock for the 3080 and the 3090 if they pull off a good launch for the 3070. Repeating the same mistake again just reminds everyone of the initial problems and makes people wonder if there's a pattern forming and how long it might last.
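A rough back-of-envelope on the "infer the gains from extra cores" point above. This is only a sketch: it assumes the commonly quoted GA102 SM counts (68 for the 3080, 82 for the 3090), a roughly 10% average 4K gap between the two cards, and linear scaling, which real games don't actually show.

```python
# Back-of-envelope: how little headroom extra enabled SMs would buy a 3080 20GB.
# Assumptions (not from this thread): GA102 SM counts of 68 (3080) and 82 (3090),
# and an observed ~10% average 4K performance gap between the two cards.

sm_3080 = 68
sm_3090 = 82
observed_gap = 0.10  # 3090 is roughly 10% faster at 4K in typical reviews

sm_ratio = sm_3090 / sm_3080                        # ~1.21, i.e. ~21% more SMs
scaling_efficiency = observed_gap / (sm_ratio - 1)  # fraction of extra SMs that shows up as fps

# Hypothetical 3080 20GB with a couple more SMs enabled (e.g. 70):
sm_20gb = 70
estimated_gain = (sm_20gb / sm_3080 - 1) * scaling_efficiency

print(f"3090 has {sm_ratio - 1:.0%} more SMs but is only ~{observed_gap:.0%} faster")
print(f"Implied scaling efficiency: {scaling_efficiency:.0%}")
print(f"A {sm_20gb}-SM 3080 20GB would gain roughly {estimated_gain:.1%}")
```

Under those assumptions a couple of extra SMs is worth a percent or two, which is the point being made above.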
 
  • Like
Reactions: lobz

AdamK47

Lifer
Oct 9, 1999
15,215
2,839
126
Have you tried using 4K at 120Hz with G-Sync on that TV? The subsampling issues are fixed now, but there are apparently still some issues with G-Sync at high framerates; not sure if it's a problem on the Nvidia or LG side.

The chroma subsampling problem (not being able to get 4:4:4) was with the CX. I have the C9. The C9 would lose signal when G-Sync was enabled with 4K 120Hz. I had to put the TV into engineering mode in order to get the latest firmware. That, along with the new Nvidia drivers and a shorter HDMI cable, fixed the G-Sync issues. 4K at 120Hz with G-Sync is crisp and buttery.
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Funny little note from Larian in today's Baldur's Gate 3 hotfix notes. And I should note, this is a GameWorks game that had some Nvidia backing.

Graphical quirks on Nvidia 3080 cards have also been fixed. Fun story: because the card was so wildly popular, we couldn’t source any to test with prior to launch. We still don’t have one. At this point, we’re pretty sure you guys made up the RTX 3080. Prove it exists by sending one to us.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
Funny little note from Larian in today's Baldur's Gate 3 hotfix notes. And I should note, this is a GameWorks game that had some Nvidia backing.

Hmmm, there's a reason websites and YouTube channels have gone out of business for biting the hand that feeds them.

Not saying they don't have a right to, but sometimes discretion is the better part of valor.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Hmmm, there's a reason websites and YouTube channels have gone out of business for biting the hand that feeds them.

Not saying they don't have a right to, but sometimes discretion is the better part of valor.

Yeah, but at the same time I can understand their frustration. They are being told their game is constantly crashing on a video card that they do not have, and cannot get.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Yeah, but at the same time I can understand their frustration. They are being told their game is constantly crashing on a video card that they do not have, and cannot get.

They can't get it because Nvidia is waiting to flood the market with 20GB cards in response to AMD. The 3080 should never have had 10GB. It was a mistake, full stop. Nvidia can't be aiming to put 10GB cards in too many of their customers' hands right now, considering people will be pissed when AMD drops a similar card with 16GB of RAM for the same or less money. In a few months, everyone will forget that 10GB card ever existed, and it won't matter anyway because no one will have one to complain about.
 

CP5670

Diamond Member
Jun 24, 2004
5,511
588
126
The chroma subsampling problem (not being able to get 4:4:4) was with the CX. I have the C9. The C9 would lose signal when G-Sync was enabled with 4K 120Hz. I had to put the TV into engineering mode in order to get the latest firmware. That, along with the new Nvidia drivers and a shorter HDMI cable, fixed the G-Sync issues. 4K at 120Hz with G-Sync is crisp and buttery.

Good to hear. I guess they will have that firmware in the main release channel soon.
 
  • Like
Reactions: ozzy702

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
They can't get it because Nvidia is waiting to flood the market with 20GB cards in response to AMD. The 3080 should never have had 10GB. It was a mistake, full stop. Nvidia can't be aiming to put 10GB cards in too many of their customers' hands right now, considering people will be pissed when AMD drops a similar card with 16GB of RAM for the same or less money. In a few months, everyone will forget that 10GB card ever existed, and it won't matter anyway because no one will have one to complain about.

If this turns out to be true, Nvidia would have been better off releasing the 3080 with the full memory controller operational and 12GB of slightly slower VRAM to get the same memory bandwidth.

A 12GB card wouldn't have come with any stigma or backlash about being insufficient. Oh well, too late now.
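For context on the bandwidth point, a quick sketch of the arithmetic. The 320-bit / 19 Gbps figures are the commonly reported 3080 specs; the 384-bit numbers are just illustrative of what "slightly slower VRAM, same bandwidth" would look like, not anything Nvidia has announced.

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

# Shipping 3080: 320-bit bus, 19 Gbps GDDR6X.
print(bandwidth_gb_s(320, 19))    # 760 GB/s

# Hypothetical 12GB card on the full 384-bit controller:
# matching 760 GB/s only needs ~15.8 Gbps chips,
print(760 / (384 / 8))            # ~15.8 Gbps required

# and even 16 Gbps parts would come out slightly ahead.
print(bandwidth_gb_s(384, 16))    # 768 GB/s
```

So on paper, 12GB on a full-width bus with slower chips would have matched or beaten the 10GB card's bandwidth.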
 

sze5003

Lifer
Aug 18, 2012
14,182
625
126
They can't get it because Nvidia is waiting to flood the market with 20GB cards in response to AMD. The 3080 should never have had 10GB. It was a mistake, full stop. Nvidia can't be aiming to put 10GB cards in too many of their customers' hands right now, considering people will be pissed when AMD drops a similar card with 16GB of RAM for the same or less money. In a few months, everyone will forget that 10GB card ever existed, and it won't matter anyway because no one will have one to complain about.
I'm waiting for this specifically. My friend just ordered a fully built PC just so he can get a 3080. The justification was that his kids need a PC and he can give his current one to them. I can easily wait a few months to see what happens. AMD sure is taking their time.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
They can't get it because Nvidia is waiting to flood the market with 20GB cards in response to AMD. The 3080 should never have had 10GB. It was a mistake, full stop. Nvidia can't be aiming to put 10GB cards in too many of their customers' hands right now, considering people will be pissed when AMD drops a similar card with 16GB of RAM for the same or less money. In a few months, everyone will forget that 10GB card ever existed, and it won't matter anyway because no one will have one to complain about.

This is possibly correct, but talk about screwing over your most loyal customers.
 

Elfear

Diamond Member
May 30, 2004
7,097
644
126
Interesting analysis by Hardware Unboxed on the Ampere architecture. He's emphasizing again that the "doubling" of Ampere CUDA cores doesn't start to shine until you hit 4K resolutions. He states that the FP32 workload at 4K is higher than at lower resolutions, and that the vertex and triangle load is identical at 1440p and 4K (which is why the performance increase vs Turing at 1440p isn't as impressive as at 4K). He also shows that a CPU bottleneck does account for the less-than-stellar resolution scaling in some games at some resolutions, but it's only a partial answer (the doubled-up FP32 throughput being the other part).

[Attached chart: 3080 vs 2080Ti.jpg]
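To make that resolution-scaling argument concrete, here's a toy calculation. The 60/40 split between per-pixel shading work and per-frame geometry work at 1440p is made up purely for illustration; the point is just that pixel work grows ~2.25x going from 1440p to 4K while vertex/triangle work stays constant, so a card with doubled FP32 throughput pulls further ahead at 4K.

```python
# Toy model: frame cost = per-pixel shading work + fixed per-frame geometry work.
# The 60/40 split at 1440p is an illustrative assumption, not a measured number.

pixels_1440p = 2560 * 1440
pixels_4k = 3840 * 2160
pixel_scale = pixels_4k / pixels_1440p          # 2.25x more pixels at 4K

shading_1440p, geometry = 60.0, 40.0            # arbitrary work units at 1440p

for label, shading in (("1440p", shading_1440p),
                       ("4K", shading_1440p * pixel_scale)):
    total = shading + geometry
    # A GPU that doubles FP32 throughput halves only the shading portion.
    sped_up = shading / 2 + geometry
    print(f"{label}: speedup from doubled FP32 = {total / sped_up:.2f}x")
```

With those made-up numbers the doubled FP32 buys ~1.4x at 1440p but ~1.6x at 4K, which matches the shape of the Hardware Unboxed argument.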
 
  • Like
Reactions: Konan and ozzy702

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136

The only thing in the article I partially disagree with is the stated reason for the supply issues being yields. I'm convinced Nvidia just threw a small 10GB stone into the bush to see what would come running out. When it turned out to be a dual-sword-wielding ninja bear charging right at them, they knew they had screwed up, so now it's back to the original 20GB "variant" it was always supposed to be. The RTX 3080 hasn't been released yet. Its release date is apparently December.
 

linkgoron

Platinum Member
Mar 9, 2005
2,293
814
136
The only thing in the article I partially disagree with is the stated reason for the supply issues being yields. I'm convinced Nvidia just threw a small 10GB stone into the bush to see what would come running out. When it turned out to be a dual-sword-wielding ninja bear charging right at them, they knew they had screwed up, so now it's back to the original 20GB "variant" it was always supposed to be. The RTX 3080 hasn't been released yet. Its release date is apparently December.

A 3080 20GB would basically kill the 3090. Obviously, those who buy the titan/whatever cards would buy it anyway, but I don't see most paying double (or almost-double, depending on the 20GB's price) for just ~10% more performance. When the 3090 also had significantly more memory, it was a real consideration - but for a relatively minor performance boost? I'm not so sure.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
A 3080 20GB would basically kill the 3090. Obviously, those who buy the titan/whatever cards would buy it anyway, but I don't see most paying double (or almost-double, depending on the 20GB's price) for just ~10% more performance. When the 3090 also had significantly more memory, it was a real consideration - but for a relatively minor performance boost? I'm not so sure.

Nvidia can choose between killing the 3090 or letting AMD kill the 3080. Their lineup is in triage and they must decide which to save. I'm certain they will save the 3080 and offer the 3090 a lollipop to enjoy before it dies an embarrassing death.
 
  • Like
Reactions: VirtualLarry

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
Nvidia can choose between killing the 3090 or letting AMD kill the 3080. Their lineup is in triage and they must decide which to save. I'm certain they will save the 3080 and offer the 3090 a lollipop to enjoy before it dies an embarrassing death.

A 20GB 3080 isn't going to change much. It will be a $200+ option that most 3080 buyers will ignore, because $200 more will give you ZERO performance boost.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
A 20GB 3080 isn't going to change much. It will be a $200+ option that most 3080 buyers will ignore, because $200 more will give you ZERO performance boost.

Alternatively, people ignore both overpriced 3080s and get a 16GB AMD card that's 5% slower for $200-$400 less. If Nvidia charges $900 for the 3080 20GB ($1000+ realistically) and keeps the 10GB variant at $700, then AMD will absolutely eat their lunch. Nvidia can't afford to be Intel right now (old Intel, that is). They have to compete or they will LOSE.
 
  • Like
Reactions: VirtualLarry

linkgoron

Platinum Member
Mar 9, 2005
2,293
814
136
A 20GB 3080 isn't going to change much. It will be a $200+ option that most 3080 buyers will ignore, because $200 more will give you ZERO performance boost.
Of course it will. The main thing that a 3090 provides is significantly more memory. We've already had a thread about the 10GB on the 3080, and we've also seen a lot of talk about whether it's future-proof or not, not to mention that it's a "downgrade" for users who are coming from an 11GB 2080 Ti or 11GB 1080 Ti. Users who considered a 3090 for more memory will just get a 3080 20GB.

In addition, a cheaper and competitive 16GB AMD card will totally play the "more future-proof" card, especially when AMD is in both new major consoles.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Of course it will. The main thing that a 3090 provides is significantly more memory. We've already had a thread about the 10GB on the 3080, and we've also seen a lot of talk about whether it's future-proof or not, not to mention that it's a "downgrade" for users who are coming from an 11GB 2080 Ti or 11GB 1080 Ti. Users who considered a 3090 for more memory will just get a 3080 20GB.

In addition, a cheaper and competitive 16GB AMD card will totally play the "more future-proof" card, especially when AMD is in both new major consoles.

Yep, that's a thing too. People might feel that games were designed for RDNA2 and will be optimized for AMD. They might be right.
 
  • Like
Reactions: VirtualLarry

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
Of course it will. The main thing that a 3090 provides is significantly more memory. We've already had a thread about the 10GB on the 3080, and we've also seen a lot of talk if it's future proof or not, not to mention that it's a "downgrade" for users who are coming from 11GB 2080ti or 11GB 1080ti. Users who considered a 3090 for more memory, will just get a 3080 20GB.

In addition, a cheaper and competitive 16GB AMD card will totally ride on the "more future proof" card, especially when AMD is in both new major consoles.

This is only going to matter to a minority of people. Paying $200 more for future-proofing that actually isn't is a fool's wager.
 

sze5003

Lifer
Aug 18, 2012
14,182
625
126
This is only going to matter to a minority of people. Paying $200 more for future-proofing that actually isn't is a fool's wager.
If they want a more capable GPU with more RAM, they won't have to pay more. They can buy RDNA2.
Coming from a 1080 Ti, paying for a 20GB 3080 is something I don't mind doing because I'm going to keep it for 3-4 years. I was considering a 3090, but at aftermarket prices it's pointless, seeing that the performance isn't much better.

I would go with AMD, but I have a G-Sync monitor.
 
  • Like
Reactions: ozzy702 and the901