Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much of a gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 4K at 60 fps, ideally at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

beginner99

Diamond Member
Jun 2, 2009
What evidence do you have to suggest they are supply limited? Ultimately, yes, there is not an infinite supply. But there isn't any evidence to suggest they won't have enough cards for a proper launch.

Evidence? None, except that supply has been limited for most of 2020 and now AMD has to make two consoles, Zen 3 chiplets, and a full-stack GPU launch all at the same time. How can they not be supply limited? A good hint is also the Ryzen 5000 series prices and SKUs: high prices (the 5800X is especially overpriced) and the missing 5700X and 5600. It's clear that once supply gets better, these two SKUs will also become available at saner prices. With Zen 3 they could offer reasonable prices and still make a big profit and take a lot of market share, if they had the supply for that.
On top of that, for the consoles they have contracts, and every Zen 3 wafer is simply more profitable than a GPU wafer. So there is no incentive for a price war vs. NV, especially since, most likely, AMD has lower costs overall due to smaller dies and better yields, even if TSMC wafers cost more.
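To put rough numbers on that, here is a minimal back-of-the-envelope sketch. The die areas are approximate public figures (Zen 3 chiplet ~81 mm², a Navi 21-class die ~520 mm²), but the defect density and the per-die prices are purely illustrative assumptions, not known AMD or TSMC numbers:

```python
# Back-of-the-envelope wafer economics. Die areas are approximate public
# figures; defect density and per-die prices are illustrative assumptions.
import math

WAFER_DIAMETER_MM = 300
DEFECT_DENSITY_PER_CM2 = 0.09  # assumed N7-class defect density

def dies_per_wafer(die_area_mm2):
    """Approximate gross dies on a 300 mm wafer (area minus edge loss)."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2):
    """Simple Poisson yield model: smaller dies yield disproportionately better."""
    return math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_mm2 / 100)

def revenue_per_wafer(die_area_mm2, price_per_good_die):
    return dies_per_wafer(die_area_mm2) * die_yield(die_area_mm2) * price_per_good_die

# Zen 3 chiplet (~81 mm^2) at an assumed $70 per good die vs. a
# Navi 21-class GPU die (~520 mm^2) at an assumed $350 per good die.
print(f"Zen 3 chiplet wafer: ~${revenue_per_wafer(81, 70):,.0f}")
print(f"Big GPU die wafer:   ~${revenue_per_wafer(520, 350):,.0f}")
```

Even with generous assumptions for the GPU side, the small, high-yielding CPU chiplet comes out well ahead per wafer, which is exactly the incentive problem described above.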

No evidence, just logical deductions, which of course can be completely wrong, which I actually hope. I mean, I would take 3080-level performance for $600 all day... but yeah, let's remain realistic.
 

Stuka87

Diamond Member
Dec 10, 2010
Evidence? None, except that supply has been limited for most of 2020 and now AMD has to make two consoles, Zen 3 chiplets, and a full-stack GPU launch all at the same time. How can they not be supply limited? A good hint is also the Ryzen 5000 series prices and SKUs: high prices (the 5800X is especially overpriced) and the missing 5700X and 5600. It's clear that once supply gets better, these two SKUs will also become available at saner prices. With Zen 3 they could offer reasonable prices and still make a big profit and take a lot of market share, if they had the supply for that.
On top of that, for the consoles they have contracts, and every Zen 3 wafer is simply more profitable than a GPU wafer. So there is no incentive for a price war vs. NV, especially since, most likely, AMD has lower costs overall due to smaller dies and better yields, even if TSMC wafers cost more.

No evidence, just logical deductions, which of course can be completely wrong, which I actually hope. I mean, I would take 3080-level performance for $600 all day... but yeah, let's remain realistic.

Just because they have a lot of products doesn't mean they will be supply constrained. AMD is TSMC's third-largest customer, and they are the single largest 7nm customer.

The Ryzen 5000 series is priced the way it is because those are the best CPUs on the market. There is no reason for AMD to undercut Intel when they have the upper hand. This in no way denotes supply issues.
 

beginner99

Diamond Member
Jun 2, 2009
There is no reason for AMD to undercut Intel when they have the upper hand. This in no way denotes supply issues.

There is one reason: gaining market share or outright pushing your competition out of the market. But again, that only works if you have the supply. Even cheap Ryzens would make more money per wafer than Big Navi.
 

AdamK47

Lifer
Oct 9, 1999
[image: Vto8nGL.jpg]
 

Head1985

Golden Member
Jul 8, 2014
3070 Ti on GA102 incoming to fight Navi 21 XL/6800. Now we know why they cancelled the 3070 16GB and 3080 20GB: not good enough vs. Navi 21. Btw, the last time NV launched an x70 card on the biggest die was in the Fermi days with the GTX 570. NV must be desperate.
 

raghu78

Diamond Member
Aug 23, 2012
3070 Ti on GA102 incoming to fight Navi 21 XL/6800. Now we know why they cancelled the 3070 16GB and 3080 20GB: not good enough vs. Navi 21. Btw, the last time NV launched an x70 card on the biggest die was in the Fermi days with the GTX 570. NV must be desperate.

If RGT is right, then the 6800 XT will match the RTX 3080 at 4K and beat it at 1440p and 1080p. Same for the RX 6900 XT vs. the RTX 3090. The RX 6800 will beat the RTX 3070 soundly, and that's why Nvidia wants a GA102-based 3070 Ti. If AMD prices the RX 6800 at $500, the RX 6800 XT at $600, and the RX 6900 XT at $800, that would be fairly aggressive pricing.
 

Glo.

Diamond Member
Apr 25, 2015
If RGT is right, then the 6800 XT will match the RTX 3080 at 4K and beat it at 1440p and 1080p. Same for the RX 6900 XT vs. the RTX 3090. The RX 6800 will beat the RTX 3070 soundly, and that's why Nvidia wants a GA102-based 3070 Ti. If AMD prices the RX 6800 at $500, the RX 6800 XT at $600, and the RX 6900 XT at $800, that would be fairly aggressive pricing.
Also, now we hear that the RTX 3060 is on the 104 die instead of the 106, which might tell us what performance we can expect from this product. Before, I couldn't believe that the RTX 3060 would touch RTX 2080 performance; now it will be easy.
 

Mopetar

Diamond Member
Jan 31, 2011
If you look at relative performance based on the die used, as opposed to the name, it makes sense. Putting the 3060 on the 104 die is expected, since the 3070 is pretty much a full-die product, so there's room for a more cut-down chip on GA104.

That basically makes the 3060 more like a 3070, just like the 3070 is what would traditionally have been a 3080, and so on up the line. The xx70 cards of a new generation have always been stacked up against the xx80 cards of the prior generation and usually wind up beating them.

This does make the 3060 look a lot better, but it does present an issue: Nvidia can't go back to the old system unless their next-generation cards offer an especially large leap in performance. I think a move to 5nm will accomplish exactly that, so it may just be a temporary aberration from their historical naming scheme.
 

MrTeal

Diamond Member
Dec 7, 2003
So, if true, where would this be priced? $600 would seem likely, but that would make the already questionable value of the 3070 even worse.

Videocardz is assuming GDDR6X, but the tweet didn't mention that. Nvidia has said the dies can do either memory type; is there a reason to think a 3070 Ti wouldn't use 10GB of GDDR6 on a 320-bit bus?
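For reference, a quick peak-bandwidth comparison of the configurations being discussed; the bus widths follow from the memory capacities above, and the per-pin data rates are the commonly assumed ones (14-16 Gbps for GDDR6, 19 Gbps for GDDR6X as on the 3080):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(256, 14))  # 8 GB GDDR6   @ 14 Gbps (3070)      -> 448 GB/s
print(peak_bandwidth_gb_s(320, 16))  # 10 GB GDDR6  @ 16 Gbps (speculated) -> 640 GB/s
print(peak_bandwidth_gb_s(320, 19))  # 10 GB GDDR6X @ 19 Gbps (3080)      -> 760 GB/s
```

So a 320-bit GDDR6 card would still land between the 3070 and 3080 in raw bandwidth, even without GDDR6X.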
 

Stuka87

Diamond Member
Dec 10, 2010
So, if true, where would this be priced? $600 would seem likely, but that would make the already questionable value of the 3070 even worse.

Videocardz is assuming GDDR6X, but the tweet didn't mention that. Nvidia has said the dies can do either memory type; is there a reason to think a 3070 Ti wouldn't use 10GB of GDDR6 on a 320-bit bus?

I doubt the 3070 will have GDDR6X. It's too costly right now, as nVidia is the only one using it. It's not even a ratified standard.
 

Grooveriding

Diamond Member
Dec 25, 2008
The VRAM discussion here is so masterfully done, I can't tell if the VRAM complaints are real or if people are just having fun. I haven't found anything I can't run smoothly, meaning without VRAM hitching, on my TV at 4K with the 10GB on the 3080.
 

KompuKare

Golden Member
Jul 28, 2009
The VRAM discussion here is so masterfully done, I can't tell if the VRAM complaints are real or if people are just having fun. I haven't found anything I can't run smoothly, meaning without VRAM hitching, on my TV at 4K with the 10GB on the 3080.
I guess the question is: how often do you upgrade?
For regular upgraders, performance here and now matters most.
For those who keep cards longer, the concern is that 10GB might [not] age well at all over the new consoles' lifetime.

Edit: added the missing [not]
 

sze5003

Lifer
Aug 18, 2012
I guess the question is: how often do you upgrade?
For regular upgraders, performance here and now matters most.
For those who keep cards longer, the concern is that 10GB might [not] age well at all over the new consoles' lifetime.
Exactly this. I've gotten accustomed to keeping my 1080 Ti; it's been 3 years now. I'd like to do the same with the next card, if it's a high-end model that will let me play at max settings like I've been used to doing.

If you plan to upgrade in a year, the VRAM topic is pointless.
 

raghu78

Diamond Member
Aug 23, 2012
I guess the question is: how often do you upgrade?
For regular upgraders, performance here and now matters most.
For those who keep cards longer, the concern is that 10GB might [not] age well at all over the new consoles' lifetime.
Well said. For folks who upgrade every time a new Nvidia flagship is launched, this does not matter. But for folks who want to keep a card longer than 2 years, the VRAM discussion is important, especially for 4K gaming.
 

Heartbreaker

Diamond Member
Apr 3, 2006
Well said. For folks who upgrade every time a new Nvidia flagship is launched, this does not matter. But for folks who want to keep a card longer than 2 years, the VRAM discussion is important, especially for 4K gaming.

What most people expressing so much concern over VRAM capacity fail to acknowledge is that you will probably run into more raw-performance issues as your card ages than VRAM issues. So if you keep a card for multiple generations, you will be turning settings down for acceptable performance. It's just a fact of life.

Somehow it's only a problem to adjust settings for VRAM issues, not raw-performance issues? That's hypocritical.

As I posted in this thread before, a 2080 Ti is already dipping to unacceptable performance at 4K max settings, and it isn't from VRAM, it's from raw performance capability, so more VRAM isn't going to future-proof you.
https://forums.anandtech.com/thread...ulation-thread.2572510/page-182#post-40305049
 

undertaker101

Banned
Apr 9, 2006
What makes you think you will get a 16GB AMD card for MSRP?
You will get any card for MSRP if you wait; the point was what the default configuration would be for the top Radeon, and there are good indicators it will be 16GB. Techspot did an evaluation of the 2060 6GB in January 2019 and concluded that, at the time, 6GB could max everything out at 1440p; that's not true today, even at 1440p. I have little doubt that with consoles pushing 16GB, 10GB will be a limiting factor sooner than most folks realize.
 

uzzi38

Platinum Member
Oct 16, 2019
What most people expressing so much concern over VRAM capacity fail to acknowledge is that you will probably run into more raw-performance issues as your card ages than VRAM issues. So if you keep a card for multiple generations, you will be turning settings down for acceptable performance. It's just a fact of life.

Somehow it's only a problem to adjust settings for VRAM issues, not raw-performance issues? That's hypocritical.

As I posted in this thread before, a 2080 Ti is already dipping to unacceptable performance at 4K max settings, and it isn't from VRAM, it's from raw performance capability, so more VRAM isn't going to future-proof you.
https://forums.anandtech.com/thread...ulation-thread.2572510/page-182#post-40305049

Dropping texture quality easily has the greatest image-quality impact of all settings (excluding resolution), and people don't want to tone it down because of that. Whereas with other settings, just dropping from Ultra to High can usually net you nearly the same visuals with significant performance uplifts, in the region of 15-20%.

What do I mean by "nearly the same visuals"? Well, more often than not, the hit to visuals is the distance at which effects or details load in, and Ultra settings usually have these cranked to levels far beyond what you will actually notice in game. Or it could be a slight drop to the resolution of certain reflections or shadows - but the point remains that turning those down a setting or two has a much lesser visual impact than turning down texture quality in many games, and this is why people are so bothered about VRAM capacity.

Next-generation consoles coming with 16GB of RAM, targeting 4K or, just as likely, ~1440p upscaled to 4K, are a bigger worry in that regard.