Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
What gains is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, just interested in the forum members' thoughts.
 

Gideon

Golden Member
Nov 27, 2007
1,774
4,145
136
I also already explicitly demonstrated that that was reported long before I used the term- in other words- there is no way I made it up.
IMO nobody claimed that you made the term up. What you did, though, is attribute it to AMD, as if anyone in the company had ever communicated that Big Navi was in any way an "nvidia killer", which is obviously not true.

You could wriggle out of it by saying you meant "viral marketing", but never assume malice when stupidity will suffice. Most probably the term just came from overzealous fanboys or clickbait media (which always runs ridiculous titles). Still a very far cry from "poor Volta" and AMD's terrible marketing of the past (which, at least for a couple of years now, seems to be history).
 
  • Like
Reactions: Elfear and Det0x

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
I think the top 2 cards are important to Nvidia because this is what creates mindshare. When anyone reads reviews about any GPU, they will always notice the top cards in the benchmark charts, which have been Nvidia's for so long. This mindshare then filters down to lower cards even if they aren't the best deal vs the competition.

Ryzen was never on top of gaming benchmarks, yet it has higher sales than Intel in the desktop DIY market.
 

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136
Ryzen was never on top of gaming benchmarks, yet it has higher sales than Intel in the desktop DIY market.

Just goes to show that it's important to have proper supply out there. Guess that's one big advantage nVidia has with using Samsung: it shouldn't be hard or expensive to get additional capacity should they need it.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Just goes to show that it's important to have proper supply out there. Guess that's one big advantage nVidia has with using Samsung: it shouldn't be hard or expensive to get additional capacity should they need it.

Top tier GPUs like the RTX3080Ti are low-volume products, and having big dies for lower tier GPUs does not help with manufacturing either. Ampere dies are big; we are not in the 500mm2 era for top tier GPUs anymore.
Using more wafers, even at a lower price than the competition, still lowers your margins if you need more wafer volume for the same number of dies.
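Rough sketch of the wafer math, using a common dies-per-wafer approximation; the wafer price and die areas below are made-up ballpark figures for illustration, not real foundry or NVIDIA numbers:

```python
import math

# All figures are illustrative assumptions, not actual pricing or die sizes.
WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 9000  # assumed price per wafer

def dies_per_wafer(die_area_mm2):
    # Gross wafer area over die area, minus an edge-loss correction.
    radius = WAFER_DIAMETER_MM / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

for name, area in [("big die (~630 mm^2)", 630), ("mid die (~390 mm^2)", 390)]:
    dpw = dies_per_wafer(area)
    print(f"{name}: {dpw} candidates/wafer, ~${WAFER_COST_USD / dpw:.0f} each")
```

With those assumed numbers the big die gets ~85 candidates per wafer vs ~147 for the mid die, i.e. over 40% fewer, before yield (which also drops as dies get bigger) makes the gap worse.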
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
My prediction: AMD will compete up to the RTX3080; that is the reason NV is using GA102 for the RTX3080 this time.

I'm expecting something similar to RTX2060 Super vs RX5700XT in price/perf across the whole lineup of GPUs up to the RTX3080.
NV will still have the 3080Ti unchallenged again, but this time the competition will not stop at the $400 mark.
I can't see AMD really being competitive near the top end. The problem being, the reason you own a 3080 or whatever is ray tracing - that's going to be the "ultra" setting in the games you buy top-end cards for. I can't see RDNA2 being competitive with Ampere there - Nvidia will make sure they're faster at straight ray tracing. This is their second-gen ray tracing card; they have much more experience with it.

That's assuming AMD can somehow find a way to either compete with DLSS or stop DLSS being used - without that it'll be pretty well impossible to compete across the whole range, as it gives such huge performance boosts.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
I can't see AMD really being competitive near the top end. The problem being, the reason you own a 3080 or whatever is ray tracing - that's going to be the "ultra" setting in the games you buy top-end cards for. I can't see RDNA2 being competitive with Ampere there - Nvidia will make sure they're faster at straight ray tracing. This is their second-gen ray tracing card; they have much more experience with it.

That's assuming AMD can somehow find a way to either compete with DLSS or stop DLSS being used - without that it'll be pretty well impossible to compete across the whole range, as it gives such huge performance boosts.

Although RT is the future, the games that already have RT, or that will be released in 2021 with RT support, can be counted on your ten fingers.

So, no matter how hard NVIDIA pushes the RT/DLSS marketing at the Ampere release, they know those two features alone are not enough against strong raster performance from the competition in 99% of the games out there.

Also, to point out, AMD will use DXR for ray tracing (vs RTX) and DirectML (vs DLSS). So this time with RDNA2, AMD will not lack features like it did with the Navi 10 release, making it even harder for NVIDIA to avoid competing on raster performance. So I'm expecting that RDNA2 vs Ampere will be even tougher for NVIDIA than Navi 10 (RX5700XT) vs TU104 (RTX2060 Super) was, as both will have the same features.
 

DisEnchantment

Golden Member
Mar 3, 2017
1,747
6,598
136
IMO nobody claimed that you made the term up. What you did, though, is attribute it to AMD, as if anyone in the company had ever communicated that Big Navi was in any way an "nvidia killer", which is obviously not true.
Reeks of made-up garbage from a known YouTube rumor monger and/or his "sources". I visited his channel once, but I lost a lot of brain cells.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91

10? WoW, Minecraft and Cyberpunk are all pretty high profile too. It's not remotely close to being the majority of games, but a couple dozen isn't finger counting for any person I've ever met.

AMD will use DXR for ray tracing

NVidia uses DXR for ray tracing under DX; RTX is just marketing for their hardware implementation - Pascal parts can run DXR too.

IMO nobody claimed that you made the term up.

Two people explicitly stated exactly that, it's not an opinion position :)
 

Det0x

Golden Member
Sep 11, 2014
1,299
4,234
136
Two people explicitly stated exactly that, it's not an opinion position :)

Read this post one more time.

You explicitly stated AMD made the "nVidia killer" claim here:
I don't work for AMD, so I'm not sure why you think I made that claim. They did not call it an nVidia competitor or an nVidia beater, they called it an nVidia killer
Multiple people have been trying to tell you that AMD didn't make the claim, but it seems like you don't want to accept that. Why is that?

Can you please tell/link me where AMD called it anything other than "big NAVI" or "NAVI 2X"?
No, you can't, because they never said anything of the sort.
 
  • Like
Reactions: Elfear

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I can't see AMD really being competitive near the top end. The problem being, the reason you own a 3080 or whatever is ray tracing - that's going to be the "ultra" setting in the games you buy top-end cards for. I can't see RDNA2 being competitive with Ampere there - Nvidia will make sure they're faster at straight ray tracing. This is their second-gen ray tracing card; they have much more experience with it.

That's assuming AMD can somehow find a way to either compete with DLSS or stop DLSS being used - without that it'll be pretty well impossible to compete across the whole range, as it gives such huge performance boosts.

There is literally no way, outside of industrial espionage, that nVidia can make sure they are faster anywhere. They can make well-educated decisions, but all of those decisions were made two years ago.

Also, why word your DLSS comment that way? "Somehow find a way" is rather ludicrous, since DLSS uses DX12 features. nVidia has their own software on top of that, but it's not like what they are doing is some super-secret magic nobody knows how it works. It is, though, the most overcomplicated upscaler on earth.

And you need to keep in mind that a lot of people still dislike using DLSS. 2.0 is certainly better, but text and certain types of textures still look horrible, even compared to regular upscaling. Which, BTW, has 15-20% better performance than DLSS 2.0; DLSS has a decent amount of overhead.
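To put a number on that overhead: a minimal sketch of how a fixed per-frame cost becomes an fps gap. The 8 ms base render time and the ~1.5 ms DLSS pass are assumptions for illustration, not measurements:

```python
# A fixed per-frame cost turns into a multiplicative fps hit.
base_render_ms = 8.0  # assumed 1440p render time for a 4K output
dlss_cost_ms = 1.5    # assumed per-frame DLSS pass, illustrative only

simple_fps = 1000 / base_render_ms                 # ~125 fps
dlss_fps = 1000 / (base_render_ms + dlss_cost_ms)  # ~105 fps

print(f"simple upscale: {simple_fps:.0f} fps")
print(f"with DLSS: {dlss_fps:.0f} fps ({1 - dlss_fps / simple_fps:.0%} slower)")
```

With those assumed numbers the DLSS path lands ~16% behind plain upscaling, the same ballpark as the 15-20% figure above.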
 
  • Like
Reactions: maddie

nurturedhate

Golden Member
Aug 27, 2011
1,767
773
136
Read this post one more time.

You explicitly stated AMD made the "nVidia killer" claim here:

Multiple people have been trying to tell you that AMD didn't make the claim, but it seems like you don't want to accept that. Why is that?

Can you please tell/link me where AMD called it anything other than "big NAVI" or "NAVI 2X"?
No, you can't, because they never said anything of the sort.
A possible explanation would be that someone's account was hacked and someone else made that post, or someone forgot to switch accounts. The "nvidia killer" post does read far differently than the rest, and they don't seem to remember making it.
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
I can't see AMD really being competitive near the top end. The problem being, the reason you own a 3080 or whatever is ray tracing - that's going to be the "ultra" setting in the games you buy top-end cards for. I can't see RDNA2 being competitive with Ampere there - Nvidia will make sure they're faster at straight ray tracing. This is their second-gen ray tracing card; they have much more experience with it.

That's assuming AMD can somehow find a way to either compete with DLSS or stop DLSS being used - without that it'll be pretty well impossible to compete across the whole range, as it gives such huge performance boosts.
Not saying in any way that AMD will be better, but this argument is lacking. Everything design-wise was fixed a long time ago, and I guess few remember the tessellation example, where AMD was first but was completely surpassed by Nvidia on their first attempt.
 

joesiv

Member
Mar 21, 2019
75
24
41
Can you please tell/link me where AMD called it anything other than "big NAVI" or "NAVI 2X"?
No, you can't, because they never said anything of the sort.
AMD said 2X. I remembered seeing it somewhere, did a bit of digging, and found an AMD slide deck from an investor presentation (slide 28):
The full list of presentations are here: https://ir.amd.com/events/event-details/financial-analyst-day-2020

Oh, and Lisa has talked about "big navi" several times; that one is easy to find, like in Jan 2020 when she reiterated we'll see big navi in 2020.

I think the "Nvidia Killer" term was a supposed "leak" from an AMD employee to some youtuber, I wouldn't call that one conclusive, but makes a nice narrative!
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,113
6,768
136
Were those speeds for a particular product? If it's the base clock for the top-end part with a massive number of shaders, then 1700 MHz is really good.

The last Titan only had a base clock of 1350 MHz but could boost to over 1700 MHz. If we have a similar situation here then a 2100 MHz boost seems reasonable.
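Quick sanity check, applying the same base-to-boost ratio to the rumored base clock; all clocks here are just the figures quoted in this thread, not confirmed specs:

```python
# Apply the last Titan's base->boost ratio to the rumored base clock.
titan_base_mhz, titan_boost_mhz = 1350, 1700  # figures from the post above
rumored_base_mhz = 1700

ratio = titan_boost_mhz / titan_base_mhz                     # ~1.26
print(f"implied boost: {rumored_base_mhz * ratio:.0f} MHz")  # ~2141 MHz
```

So a 2100 MHz boost is right in line with a Titan-like base-to-boost spread.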
 
  • Like
Reactions: ozzy702

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136
It could actually be a good thing. A waterblock could potentially offer a huge performance advantage. Everyone else is like "ugh, 90c at 80% load" and I'd be like "yawn, 50c at 120% load"

I think it's just that the FE is going to be like an overclocked model, like the Lightning Z. There may not be much room beyond what the stock FE offers.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It could actually be a good thing. A waterblock could potentially offer a huge performance advantage. Everyone else is like "ugh, 90c at 80% load" and I'd be like "yawn, 50c at 120% load"

They could go the AMD Fury route and offer a factory AIO cooler.

But a 20% OC would mean more like 40% more power dissipated. That would turn a 250W GPU into a 350W GPU, or a 300W GPU into a 420W GPU. Put that 600W connector to work :p
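The rough math behind that, assuming dynamic power scales with frequency times voltage squared; the +8% voltage bump is an assumption (real OC voltage curves vary chip to chip):

```python
# Dynamic power scales roughly as P ~ f * V^2.
freq_gain = 1.20  # +20% core clock
volt_gain = 1.08  # assumed +8% voltage to hold that clock stable

power_factor = freq_gain * volt_gain**2  # ~1.40
for tdp_watts in (250, 300):
    print(f"{tdp_watts} W -> {tdp_watts * power_factor:.0f} W")
```

which prints 250 W -> 350 W and 300 W -> 420 W, matching the figures above.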
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
If the rumours about the price of nvidia's new cooler are true, they really should just slap an AIO on it instead.

Don't worry man, it's going to be so expensive that neither of us will care one way or another about whatever cooler they go with. I can't wait for a good laugh while I contemplate buying a GPU so expensive I'd have to trade in the family mini-van, take out a second mortgage on the house, and pay an extra $50/mo to Edison.
 


BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
12GB in a high-end part is piss-poor given we've had 11GB in consumer space for over 3 years. Especially considering this thing will almost certainly be horrifically overpriced.