Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 GB for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed-up RTX' options at the top?)
Will the top card be capable of more than 4K60, at least 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for; just interested in the forum members' thoughts.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
10.2 billion transistors (32% more) and only ~20% faster, with ~25% higher power consumption, than the GTX 1080. Navi is already on 7nm. And yet people here are claiming that nVidia would have no chance with Ampere.
Based on this Link and this Link
5700 XT vs 1080: ~27% faster, power consumption 32% higher, transistor count 43% higher (10.3 vs 7.2 billion transistors).
Let's compare the 5700 vs the 1080:
5700 vs 1080: ~13% faster at the same power consumption, transistor count 43% higher (10.3 vs 7.2 billion transistors).
It's true they used a lot of transistors for a fairly small performance gain, so from that standpoint it could be considered a "fail", and the power consumption is also not great considering the advantage of 7nm.
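Just to put numbers on that, a quick back-of-the-envelope sketch (rough Python; the percentages are the approximate figures from the linked reviews, so treat the outputs as rough too):

```python
# Rough perf-per-transistor and perf-per-watt ratios for the figures above.
# All inputs are approximate review numbers, not exact measurements.
def efficiency(perf_gain, power_gain, xtor_gain):
    perf = 1 + perf_gain
    return {
        "perf/transistor": round(perf / (1 + xtor_gain), 2),
        "perf/watt": round(perf / (1 + power_gain), 2),
    }

# 5700 XT vs GTX 1080: ~+27% perf, ~+32% power, ~+43% transistors
print(efficiency(0.27, 0.32, 0.43))  # {'perf/transistor': 0.89, 'perf/watt': 0.96}
# 5700 vs GTX 1080: ~+13% perf, ~same power, ~+43% transistors
print(efficiency(0.13, 0.00, 0.43))  # {'perf/transistor': 0.79, 'perf/watt': 1.13}
```

Either way you slice it, performance per transistor went down relative to Pascal; only the 5700's perf/W comes out ahead, despite the full-node advantage.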
Claims here that Ampere will stand no chance against Navi 2x are pure hype based on too little known information. Nvidia would need to f**k up big time for that to happen.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
I don't even have to comment on that.

[Image: relative performance chart at 2560x1440]
He was talking about the Nvidia TITAN Xp, not the cut-down 1080 Ti, but even so the difference in performance wouldn't be 20%, more likely 10-15%; and considering it has only 16.5% more transistors, then yes, they achieved parity after 2.5 years. Let's be honest, Pascal was and still is a great architecture.
 

Konan

Senior member
Jul 28, 2017
What people here don't seem to realise is that it's not RDNA2 itself that's the real danger to Nvidia. The next generation will be close, but that's all. It's AMD's swift cadence and its promise of +50% perf/W each generation that is the real danger.
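For what it's worth, if that promise holds it compounds quickly (a trivial sketch; the multiplicative reading and the RDNA 3 extrapolation are my assumptions, not anything AMD has committed to):

```python
# "+50% perf/W per generation", read as multiplicative and compounded.
# RDNA 3 here is pure extrapolation, not an AMD promise.
perf_per_watt = 1.0
for gen in ["RDNA 1", "RDNA 2", "RDNA 3"]:
    perf_per_watt *= 1.5
    print(f"{gen}: {perf_per_watt:.2f}x the pre-RDNA baseline")
# RDNA 1: 1.50x, RDNA 2: 2.25x, RDNA 3: 3.38x
```

Two generations of that is already 2.25x, which is why the cadence matters more than any single part.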

A cadence shift like this is not easy at all.

Been there. Heard it before.

Retrospective quote from PC Gamer:

There are many changes with Navi, the first new GPU architecture for AMD graphics cards since its RX Vega line in 2017, but after the official reveal of specs and pricing at E3 2019, I have to wonder if this is truly new or simply the next iteration on the existing product line. After many years of playing second string to Nvidia's leading parts, I wanted a clear win from AMD. Navi, sadly, isn't it. AMD claims IPC performance improvements of 25 percent per CU, and says that overall performance per watt will be 50 percent better than its previous generation Vega and Polaris architectures.

That sounds great, but when you dig into the details, it looks like at best AMD might match the performance of Nvidia's RTX 2060 and 2070 GPUs, except AMD will use more power, potentially cost a bit more, and Navi doesn't include any ray tracing or deep learning features. Considering AMD is using TSMC's latest 7nm manufacturing process, it's a bit of a letdown—a lot like the Fiji, Polaris, and Vega GPUs of the past several years.

I wonder how much will be the same this time around.
This "+50% perf/W each generation" promise sounds great and is definitely a decent KPI to aim for. But the reality is that it hasn't really gotten them anywhere at the mid-to-high end. No ray tracing, no deep-learning features. They are way behind on software + SDK. I have some serious confidence issues, but I hope to be proven wrong on the software side.
Navi 1x was supposed to launch in early 2018; it was late, way late. I don't believe they have the capacity or capability to move any quicker, given demonstrated lateness at every new node or uArch change. Also, Navi 2X does not mean 2x 5700 XT.
 

Konan

Senior member
Jul 28, 2017
Apparently, at this point in time, the engineering-sample RTX 3080 these guys have seen shows 20%+ performance over an RTX 2080 Ti, and the final version a bit more, say 22-25%, if that is what they mean.
So a rumoured TGP of what, 300-320W, for up to 25% more performance (3080 > 2080 Ti)? That may put a potential RTX 3070 between a 2080 Super and a 2080 Ti, then?
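Running those rumoured numbers, the implied perf/W gain over the 2080 Ti is close to nil (a rough sketch; the 250W figure for the 2080 Ti's reference TGP is my assumption, and ES numbers are ES numbers):

```python
# Implied perf/W of the rumoured 3080 relative to a 2080 Ti.
TI_TGP = 250.0  # assumed 2080 Ti reference TGP in watts

for perf, watts in [(1.20, 300), (1.25, 320)]:
    ratio = perf / (watts / TI_TGP)
    print(f"+{perf - 1:.0%} perf at {watts}W -> {ratio:.2f}x the 2080 Ti's perf/W")
# +20% perf at 300W -> 1.00x
# +25% perf at 320W -> 0.98x
```

Essentially flat perf/W on a new node would be surprising, so either the TGP rumour or the performance rumour (or both) is probably off.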

 
Mar 11, 2004
Re: 300W, the official spec for the 3080 Ti might be 300W, but the power draw being talked about is what the FE is rated for, which is probably closer to 350W.



Twice the transistors compared to Pascal maybe, not Turing.

Yeah, did people forget that there have been multiple cards (from both companies) that easily exceeded their power specs? It was getting so bad that sites like AnandTech basically had to start explaining how both vendors were gaming their power specs.
 

Stuka87

Diamond Member
Dec 10, 2010
Been there. Heard it before.

Retrospective quote from PC Gamer:



I wonder how much will be the same this time around.
This "+50% perf/W each generation" promise sounds great and is definitely a decent KPI to aim for. But the reality is that it hasn't really gotten them anywhere at the mid-to-high end. No ray tracing, no deep-learning features. They are way behind on software + SDK. I have some serious confidence issues, but I hope to be proven wrong on the software side.
Navi 1x was supposed to launch in early 2018; it was late, way late. I don't believe they have the capacity or capability to move any quicker, given demonstrated lateness at every new node or uArch change. Also, Navi 2X does not mean 2x 5700 XT.

AMD wasn't aiming for the high end with RDNA 1. It's a stepping stone to RDNA 2. And RDNA 1 did meet the 50% perf/W improvement over Polaris.

Ray tracing was always going to come with RDNA 2, as it was being implemented along with Sony and MS. Ray tracing as it is now is basically a tech demo. The few people that have a 2080 Ti can play games at 1080p and almost sustain 60fps. But it means basically nothing to everybody else. As for deep learning, meh. DLSS is basically an extremely complex way of doing upscaling. And compared to an upscale+sharpen filter, DLSS 2 looks a bit better in some very specific cases, but significantly worse in others, especially when text is involved.

Navi 1x was also said to debut in early 2019. Nowhere did AMD say it was going to be out two years ago.

There are several Navi 2x chips. One of them is significantly larger than the 5700 XT and is expected to be a 3080 Ti competitor.
 

Konan

Senior member
Jul 28, 2017
AMD wasn't aiming for the high end with RDNA 1. It's a stepping stone to RDNA 2. And RDNA 1 did meet the 50% perf/W improvement over Polaris.

Ray tracing was always going to come with RDNA 2, as it was being implemented along with Sony and MS. Ray tracing as it is now is basically a tech demo. The few people that have a 2080 Ti can play games at 1080p and almost sustain 60fps. But it means basically nothing to everybody else. As for deep learning, meh. DLSS is basically an extremely complex way of doing upscaling. And compared to an upscale+sharpen filter, DLSS 2 looks a bit better in some very specific cases, but significantly worse in others, especially when text is involved.

Navi 1x was also said to debut in early 2019. Nowhere did AMD say it was going to be out two years ago.

There are several Navi 2x chips. One of them is significantly larger than the 5700 XT and is expected to be a 3080 Ti competitor.

There was definitely a delay in Navi's launch. If it was originally supposed to launch in H1 2018, it was pushed back to H2 2019. See the picture below.
It would be particularly interesting to hear Navi's full story: was it originally intended for the N14 node and then moved forward to N7, or was it simply too late?
So when people pop up saying "Oh, AMD will move to 5nm next and get to market quicker" etc., I don't believe it. It hasn't happened for years. They don't have the experience or proven track record to execute quickly. But I wish they did.

I have serious doubts that Navi 2X means two times Navi. On the slide in question the X is capitalized, whereas a multiplier is usually lowercase. Additionally, considering that the current Navi chips are "Navi 10" and "Navi 14", which can be summarized as "Navi 1x", it's natural to assume that the next GPUs are "Navi 20" and "Navi 30"; the x stands for "any number", not "x-times improvement". 2X is the generational code name for all consumer-oriented, non-semi-custom RDNA 2 silicon, with each piece of silicon then getting a distinct second digit. That is what the name is about, not a rumour that it will be two times the performance. Note: hopefully it does come out with 2x the performance, but best not to board the hype train.
The performance-per-watt story has been a consistent marketing line, but it still hasn't gotten them the high end. All I am saying is that I don't think they will take the high end this time around either.

As for DLSS 2.0, it works great, with great FPS, at least in my experience. AMD needs something here and they don't have it.


[Attached image: 11.jpg]
 

BenSkywalker

Diamond Member
Oct 9, 1999
The few people that have a 2080 Ti can play games at 1080p and almost sustain 60fps.


In synthetic tests; then you get to actual games and the 2060 is hitting 60.

Why do we even discuss AMD in Nvidia thread?

Because more than half of this thread is filled with people basing their predictions on what they think will be a good matchup between AMD and nVidia. If you just look at it logically, there simply isn't going to be a tight race unless nVidia screws up enormously.

That ad turned out to be 100% true for gamers

So the 1060 was better than the 5700xt? Seriously?
 

Konan

Senior member
Jul 28, 2017
MS DirectML

It isn't accurate to compare DirectML to DLSS. DirectML is a Microsoft API; it's neither AMD-specific nor AMD-made, and it's hardware-agnostic, so it doesn't serve the same purpose.
DLSS is the end result of a well-designed, well-trained neural network. DirectML is just a set of tools you could use to build something like DLSS. There is no like-for-like.

For AMD to get something like DLSS working, they have to do the following (a toy sketch of what such a network might look like follows the list):
1. Get their machine-learning driver stack working properly.
2. Gain market share in AI and actually develop a neural network from scratch. That means all the data-extraction layers AND the inference layers.
3. Implement said neural network for gaming AND productivity.
4. Technically RDNA1 can use DirectML, and so can Turing and Ampere. I wonder why there is nothing there... (I assume it doesn't work well at all.)
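To give a feel for what items 2 and 3 actually entail, here's a toy sketch of the inference half of a DLSS-style upscaler, written with PyTorch. To be clear, this is not how DLSS works internally (the real network also consumes motion vectors and a temporal history buffer, and runs through a vendor inference stack rather than Python); it's just the shape of the thing AMD would have to design, train on enormous amounts of data, and then ship and optimize in a driver:

```python
# Toy single-frame super-resolution network: 1080p in, 4K out.
# Purely illustrative; real temporal upscalers are far more involved.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            # Emit scale^2 * 3 channels, then rearrange them into a bigger image.
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),  # (B, 3*s^2, H, W) -> (B, 3, s*H, s*W)
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.net(frame)

model = ToyUpscaler(scale=2).eval()
frame_1080p = torch.rand(1, 3, 1080, 1920)   # stand-in for a rendered frame
with torch.no_grad():
    frame_4k = model(frame_1080p)
print(frame_4k.shape)  # torch.Size([1, 3, 2160, 3840])
```

The twenty lines above are the easy part; the training data pipeline, the per-architecture tuning, and the driver/SDK integration around it are where the years of AI investment show up.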

The only reason Nvidia could come up with DLSS is their long-standing, heavy investment in AI. It trickles down to gaming, not the other way around.

DLSS 1.0 came; it was a start, and it was pretty crap. DLSS 2.0 is how it should have been from the start. Requiring devs to update their game was a terrible idea. The rumoured DLSS 3.0 with Ampere is something I am looking forward to.
 

Stuka87

Diamond Member
Dec 10, 2010
In synthetic tests; then you get to actual games and the 2060 is hitting 60.

"Hitting" 60 is not what I said. In that link you posted, BF5 was at 46fps min, 48fps if you exclude 99th percentile. TR was at a wopping 28/35 fps. The 2080Ti was the only card to eek out a minimum of more than 60fps. Which follows along with exactly what I said. Ray Tracing on Turing is basically worthy of tech demos. Down the road it will eventually be the norm, but thats a long ways off.
 

BenSkywalker

Diamond Member
Oct 9, 1999

The 5700 XT fails to hit a 60fps minimum in Metro at 1080p without ray tracing (but oddly does at 1440p). In BFV the 5700 fails to hit a 60fps minimum at 1080p. Are you going to claim that the 5700 isn't any good for 1080p gaming, and that the 5700 XT sometimes isn't?

Speaking as someone who has actually played through these games (Metro, TR, and Q2) on a 2060 with ray tracing on, the comments about needing a 2080 Ti come across as absurd and ignorant.
 

Kenmitch

Diamond Member
Oct 10, 1999


Speaking as someone who has actually played through these games (Metro, TR, and Q2) on a 2060 with ray tracing on, the comments about needing a 2080 Ti come across as absurd and ignorant.

You pledge loyalty to your king, but support him by buying his lowest offering? I guess your wallet is thankful your mouth only does the talking.


Calling someone a fanboy using different word choice isn't allowed either.

AT Moderator ElFenix
 
Last edited by a moderator:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I buy what makes sense. 12nm offered a very small increase for a new node, so I decided to wait for big 7nm parts, which is what I'm doing.

It's going into a 3800X system (that would be AMD). If AMD can shock me and best nVidia, mainly in RT performance, I'd have no problem going all red for this build. How many people in this thread can honestly say the same?
 

DiogoDX

Senior member
Oct 11, 2012
I buy what makes sense. 12nm offered a very small increase for a new node, so I decided to wait for big 7nm parts, which is what I'm doing.

It's going into a 3800X system (that would be AMD). If AMD can shock me and best nVidia, mainly in RT performance, I'd have no problem going all red for this build. How many people in this thread can honestly say the same?
I had a 5970 ($700) and a 7970 ($550) before going to Nvidia with a 980 Ti ($700) and my current 1080 Ti ($700). If AMD delivers on performance without a stupid $1200 2080 Ti price, I will go back to them.
 
Mar 11, 2004

In synthetic tests; then you get to actual games and the 2060 is hitting 60.



Because more than half of this thread is filled with people basing their predictions on what they think will be a good matchup between AMD and nVidia. If you just look at it logically, there simply isn't going to be a tight race unless nVidia screws up enormously.



So the 1060 was better than the 5700xt? Seriously?

Any reason you cut my username out of the quote? Were you intentionally trying to make it seem like the other person you quoted said it, or is it because you can't handle having your pathetic trolling called out? Jar Jar Binks could handle your mind "tricks" if that's all you've got to offer. Even the P&N clowns run circles around your discourse, and they regularly get tripped up by simple words.

I couldn't say, as I don't know the sales figures (although I'd assume the 1060 has outsold the 5700 XT). And I would absolutely stand by it if more people gamed on 1060s, as that would almost certainly mean more gaming enjoyment was had on those cards. That doesn't mean they had a better experience than people who gamed on a 5700 XT, which you somehow seem to be implying I said, which is an obviously stupid thing for you to try to do. Yet here we are.

I'll give you a two-second golf clap for trying to bring back the old tRollo days of AT; I don't doubt you're almost as nostalgic for those days as you are for good Star Wars movies. I do love how wound up that ad still has people (it's what, almost 4 years old now?). It must really stick in your craw that it turned out to be 100% true, since there were never any Volta gaming cards released (the Titan V was clearly a pro card). But then we have at least one thread dedicated to that issue, so I recommend you not waste any more of your time trying to derail this thread whining about it.
 

BenSkywalker

Diamond Member
Oct 9, 1999

As stated previously, I type quotes out; I reply to points, not people. As for the rest of your post... this is a tech forum. The 1650 is not better than the 2080 Ti, and the 570 is not better than the 5700 XT.

"Poor Volta" was wildly dishonest hype. You want too bring up dishonest hype from nVidia as a counter by all means, there was quite a bit of it in the Fermi timeframe, but nothing that they couldn't hit three years later.

Trying to say an i3 is better than Threadripper is just ignorance.

If AMD delivers on performance without a stupid $1200 2080 Ti price, I will go back to them.

I have no problem with it being $1200 if they can deliver better-than-3080 Ti performance, mainly in RT.
 

Krteq

Senior member
May 22, 2015
"Poor Volta" was wildly dishonest hype. You want too bring up dishonest hype from nVidia as a counter by all means, there was quite a bit of it in the Fermi timeframe, but nothing that they couldn't hit three years later.
Hmm, let me remind you of some:
  • Pascal 10x Maxwell
  • Turing 6x Pascal
  • "It just works"™
  • etc.
 

BenSkywalker

Diamond Member
Oct 9, 1999
I'm not seeing any comparisons to a competitor's product there; personally, I view that as very different.

If Intel says "30% faster than the previous Xeon", it's markedly different from claiming "30% faster than Epyc". In cases where they compare against their own product, they tend to have some cherry-picked, hyper-specific way of backing up the claim. When you talk about a competitor's products, it's very different, to me at least.

If nVidia says the 3080 Ti is 100% faster than the 2080 Ti, they probably cherry-picked some RT numbers. If they say it's 100% faster than Big Navi, they had better be averaging that at least. Fundamentally different, IMO. I'm not saying nVidia has never behaved like that in the past; they and 3dfx used to go hard at each other. I just haven't seen it recently.
 
