
Question 'Ampere'/Next-gen gaming uarch speculation thread


DiogoDX

Senior member
Oct 11, 2012
706
162
116
To be clear, nVidia ROPs and AMD ROPs are not equivalent. They ultimately do the same job, but aren't 1:1 the same.

And NOBODY had 64 ROPs 10 years ago. The first GPU to ever feature 64 ROPs was GM204 (GTX 980) in 2014.
290X in 2013.
 

tviceman

Diamond Member
Mar 25, 2008
6,733
514
126
www.facebook.com
Big Navi is not just a big Navi10; RDNA2 is a change of architecture, so linear scaling isn't necessarily relevant.
I hope so! If AMD can bring good competition I'll give them my money. I'm waiting until early next year to upgrade anyway, so the only X-factor in a highly competitive GPU fight will be how many games support, or have announced support for, DLSS by then.
 

Konan

Senior member
Jul 28, 2017
360
291
106
Out of the 4 titles shown in the DF vid, the average uplift vs the 2080 Ti is 35%.

All the titles were clearly chosen to hamstring the 2080's 8GB of 448GB/s memory; nevertheless, the performance improvement is impressive.

The difference between the 3080 and the 2080 was 35% for Control with RT and DLSS both on and off, i.e. the RT scaling between the 3080 and 2080 was 1:1, which seems to indicate a limitation or bottleneck somewhere in that bench. Hardware Unboxed noticed the same thing.


We really have no idea where AMD is with regards to the development and performance of RDNA2, so we can't rule things out one way or the other.

We have seen an OpenVR bench of an AMD GPU 30% faster than a 2080 Ti, so there is that.

Anyway, the 3080 at $699 is a pretty good deal tbh. Hope AMD gives us even more.
Can you please link me where they compare to a 2080 Ti? They never use a 2080 Ti......

Yes, as I have pointed out, the math typically shows that 2x 5700 XT lands about 30% (33%) above a 2080 Ti. And I've also said there is nothing stopping that benchmark that showed up from being a 400W nuclear-reactor engineering sample.

I am very much looking forward to Hardware Unboxed review.
 

DDH

Member
May 30, 2015
167
168
111
Can you please link me where they compare to a 2080 Ti? They never use a 2080 Ti......

Yes, as I have pointed out, the math typically shows that 2x 5700 XT lands about 30% (33%) above a 2080 Ti. And I've also said there is nothing stopping that benchmark that showed up from being a 400W nuclear-reactor engineering sample.

I am very much looking forward to Hardware Unboxed review.
You can get all the 2080 benchmark numbers from gamegpu for each game and setting. The narrator lists the % uplift of the 3080; from that you can establish fps, then compare to the 2080 Ti. SOTTR was the worst with 22%, Doom Eternal the best with 44%.
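A quick sketch of that arithmetic in Python. The fps values are illustrative placeholders, not real gamegpu or DF numbers:

```python
# Estimate 3080 fps from a 2080 baseline plus a quoted % uplift,
# then compare against a 2080 Ti baseline.
# NOTE: the numbers below are illustrative placeholders only.

def estimated_fps(baseline_fps: float, uplift_pct: float) -> float:
    """Apply a percentage uplift to a baseline frame rate."""
    return baseline_fps * (1 + uplift_pct / 100)

def relative_uplift(fps_a: float, fps_b: float) -> float:
    """Percentage by which fps_a exceeds fps_b."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical example: 2080 at 60 fps, a quoted +68% for the 3080,
# and a 2080 Ti baseline of 82 fps.
fps_3080 = estimated_fps(60.0, 68.0)
print(f"3080 ~ {fps_3080:.1f} fps, "
      f"{relative_uplift(fps_3080, 82.0):+.1f}% vs 2080 Ti")
```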
 

Konan

Senior member
Jul 28, 2017
360
291
106
You can get all the 2080 benchmark numbers from gamegpu for each game and setting. The narrator lists the % uplift of the 3080; from that you can establish fps, then compare to the 2080 Ti. SOTTR was the worst with 22%, Doom Eternal the best with 44%.
Oh, you are doing manual calculations... and estimates that weren't in the video (from some unknown other source). Yeah, best to wait for the Hardware Unboxed reviews.

Let's not forget this is on pre-release drivers too.
 
Last edited:

xpea

Senior member
Feb 14, 2014
395
108
116
Only people not long in the game can really be that enthusiastic about this...

It's only great because the last 5 years have seen extreme price gouging on both sides. AMD couldn't deliver more and was happy NV didn't price them out of the market, and on top of that we had consoles with very outdated CPUs and mediocre GPUs.

Now we get very good consoles and AMD is in the game again. Competition: that is what is lowering prices, not NV's choice of process tech. Look at how NV's margins grew during the last 5 years. The high prices weren't due to FinFETs; they were to make more money.
Nice speech, but reality is different. Facts:
- Pre-FinFET wafer price was about $3k at TSMC
- A TSMC 7nm wafer is now between $7k and $9k depending on EUV and cell options
- In the last 5 years, Nvidia's margin increased from 45% to 60% due to the boom in the datacenter business, not the gaming division.

Ampere is cheaper because:
- of the 30% price reduction Nvidia got from Samsung compared to TSMC,
- and yes, you are right, Nvidia must minimize the new consoles' impact on their GeForce business.

Personally, I think it has very little to do with upcoming AMD dGPUs. Fans talk a lot, but RDNA2 is not a threat to Ampere. RDNA2 is basically a Turing V2 without tensor cores and with less RT hardware. DLSS is missing, RT perf is barely above Turing, and only power efficiency is good at the optimum voltage/speed, but who knows what stupid thing AMD will do to compete with Ampere...
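For rough context, wafer prices like the ones quoted only translate into chip cost through dies per wafer. A minimal sketch, assuming the classic dies-per-wafer approximation and 100% yield (a big simplification); the $9k wafer price and the 627mm²/500mm² die sizes are just the figures being thrown around in this thread:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic approximation: wafer area over die area, minus an
    edge-loss term proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_cost: float, die_area_mm2: float) -> float:
    """Wafer cost spread over candidate dies (assumes 100% yield)."""
    return wafer_cost / dies_per_wafer(die_area_mm2)

# Thread figures: ~$9k/wafer at the high end of TSMC 7nm,
# 627 mm^2 (GA102-sized) vs 500 mm^2 (rumored big-RDNA2-sized) dies.
for area in (627, 500):
    print(f"{area} mm^2: {dies_per_wafer(area)} dies/wafer, "
          f"${cost_per_die(9000, area):.2f} per die")
```

The edge-loss term is why a 25% larger die costs more than 25% extra per chip even before yield enters the picture.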
 

xpea

Senior member
Feb 14, 2014
395
108
116
Perhaps, but don't forget that GA102 is even larger at 627mm2, which makes yields even lower, increasing the cost further.
So even if SS 8N wafers are cheaper, yields on the 627mm2 die will be lower.

NVIDIA will now be forced to sell a bigger die (627mm2) to compete with a smaller die (500mm2).
We should all realize that SS8, at least for big dies, must have terrible parametric yield.
Not relevant. Nvidia buys wafers at Samsung with a guaranteed % of good dies. At the wafer scale, Nvidia is 30% cheaper than AMD at TSMC (with the same die size, of course).
 

xpea

Senior member
Feb 14, 2014
395
108
116
Like 350W stupid? How dare AMD launch a card with such high power. No one will accept it.:rolleyes:
You can bring a 350W card to the fight; that's not the problem.
The issue is when that 350W card is obliterated by the competitor's 220W model.
In other words, if you hold the performance crown, nobody cares about power consumption.
 

Stuka87

Diamond Member
Dec 10, 2010
5,488
1,277
136
Nice speech, but reality is different. Facts:
- Pre-FinFET wafer price was about $3k at TSMC
- A TSMC 7nm wafer is now between $7k and $9k depending on EUV and cell options
- In the last 5 years, Nvidia's margin increased from 45% to 60% due to the boom in the datacenter business, not the gaming division.

Ampere is cheaper because:
- of the 30% price reduction Nvidia got from Samsung compared to TSMC,
- and yes, you are right, Nvidia must minimize the new consoles' impact on their GeForce business.

Personally, I think it has very little to do with upcoming AMD dGPUs. Fans talk a lot, but RDNA2 is not a threat to Ampere. RDNA2 is basically a Turing V2 without tensor cores and with less RT hardware. DLSS is missing, RT perf is barely above Turing, and only power efficiency is good at the optimum voltage/speed, but who knows what stupid thing AMD will do to compete with Ampere...
Nvidia got the wafers cheaper than at TSMC, but we do not know the percentage. We also don't know what AMD pays per wafer. AMD is TSMC's 4th-largest customer, and as such they get a better deal than smaller customers. Plus, let's not forget what is unarguably the most expensive reference cooler ever; this also eats into margins.

As for RDNA2 performance, we do not know the RT performance. It's been manually calculated to be higher than Ampere in theory, but nobody knows how well it, or Ampere, will perform, and we don't have any benchmarks for either.

As for tensor cores, why on earth is this an issue? Tensor cores have extremely limited use cases, especially in gaming cards.

And DLSS, again, who knows? AMD literally has not talked about anything. And even if they have it, and it matches what nVidia has, big deal. People buying low-end cards may care about it, but I know if I were spending $700 on a video card, I would be pissed if I had to degrade image quality in order to play a game smoothly.
 

Hitman928

Diamond Member
Apr 15, 2012
3,684
4,142
136
Not relevant. Nvidia buys wafers at Samsung with a guaranteed % of good dies. At the wafer scale, Nvidia is 30% cheaper than AMD at TSMC (with the same die size, of course).
I'm not usually one for internet rumors, but Tweaktown posted an article saying basically the exact opposite of what you are claiming.

 

xpea

Senior member
Feb 14, 2014
395
108
116
I'm not usually one for internet rumors, but Tweaktown posted an article saying basically the exact opposite of what you are claiming.

I don't see any contradiction. They talk about a supply issue; I talk about a purchasing deal. They are different things.
But the Tweaktown article has false information, and anyone working in the industry will tell you that...
Another source had something much more damning to say, but I want to flesh that out before I write it. For now, I'm being told stock will be extremely low for the next couple of months. Why? Samsung 8nm yields are unknown at this point, NVIDIA might not want to make too many before the yields improve.
... is pure [profanity deleted] :eek::eek::eek:
Nvidia has on-site engineers working at Samsung and TSMC fabs, monitoring the yields daily. They know to three digits what's going on. Seriously, a multi-billion-dollar fab running without knowing the yields :tearsofjoy:

No profanity in the tech boards.

AT Moderator ElFenix
 
Last edited by a moderator:

Hitman928

Diamond Member
Apr 15, 2012
3,684
4,142
136
I don't see any contradiction. They talk about a supply issue; I talk about a purchasing deal. They are different things.
But the Tweaktown article has false information, and anyone working in the industry will tell you that...

... is pure :eek::eek::eek:
Nvidia has on-site engineers working at Samsung and TSMC fabs, monitoring the yields daily. They know to three digits what's going on. Seriously, a multi-billion-dollar fab running without knowing the yields :tearsofjoy:
Since the process they are using has some amount of customization, if Nvidia is really pushing the launch to get ahead of AMD, I could see a situation where the yield for Ampere isn't fully known yet, especially since Samsung hasn't, as far as I'm aware, taped out anything near Ampere's size or complexity on 8nm/10nm to date. To say it's completely unknown wouldn't be accurate, but I guess that's just not how I interpreted it.

I also don't fully agree with your guaranteed-percentage-of-good-dies statement, but maybe I'm just misinterpreting it. Good die count depends on the design as well as the actual process yield. The fab will guarantee a certain defect density and wafer skew across corners, but it's up to the design team to meet yield targets based on that data.
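To make the defect-density point concrete, a common first-order illustration is the Poisson yield model, where die yield falls exponentially with die area times defect density. This is only a sketch, not how any fab actually quotes yield, and the D0 values are illustrative assumptions, not published Samsung or TSMC numbers:

```python
import math

def poisson_yield(die_area_mm2: float, d0_per_cm2: float) -> float:
    """Poisson yield model: Y = exp(-A * D0), with area converted
    from mm^2 to cm^2 to match D0 in defects per cm^2."""
    return math.exp(-(die_area_mm2 / 100) * d0_per_cm2)

# The same defect density hurts a 627 mm^2 die far more than a
# 251 mm^2 (Navi10-sized) one.
for d0 in (0.1, 0.2, 0.5):  # defects per cm^2 (illustrative)
    big = poisson_yield(627, d0)
    small = poisson_yield(251, d0)
    print(f"D0={d0}: 627mm2 -> {big:.1%}, 251mm2 -> {small:.1%}")
```

This is why "the fab guarantees a defect density" and "the customer gets a guaranteed % of good dies" are not the same statement: the die area, and hence the design, sits between the two.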


Profanity deleted from the quote.

AT Moderator ElFenix
 
Last edited by a moderator:

moonbogg

Lifer
Jan 8, 2011
10,197
2,034
126
I'm excited about the 3080. A 3080 Ti would have more RAM and be faster, but I'm pretty confident it would also be more expensive; I can't imagine it being $700. With my resolution of 3440x1440 and the x80 card being powerful this round, spending more than $700 when the 3080 exists doesn't sound exciting or worth it to me. The 3080 does sound exciting, though, and the 3070 is exciting as HELL for so many people they are just flipping out. I have a good feeling about this release. It seems Nvidia did us right this time. I'm sure the consoles and AMD had something to do with it, but still, even without the consoles and AMD, it makes sense to offer their customers an upgrade. Turing had nothing for us unless you bought a 2080 Ti or a 1660 Ti/Super.
 

beginner99

Diamond Member
Jun 2, 2009
4,862
1,255
136
- 30% price reduction Nvidia got from Samsung compared to TSMC,
Citation needed. You are using that number all over the thread without a single credible source (I doubt one exists), so I have to assume you pulled it out of a very dark place.

At best it helps NV keep their margins higher, but without the new, powerful consoles and AMD being competitive, this price correction would not have happened. In fact, the correction was overdue anyway because, just like Intel for a long time, NV was its own competition: they needed a big performance/$ improvement to get their existing users to upgrade.

Datacenter growth increasing margin, yes, that makes sense, but not that much. Plus, on some level the huge dies NV is making are an artifact of that, because many of those transistors are there for datacenter/workstation use, not gaming. I'm pretty sure DLSS was born from the need to make those otherwise-useless transistors useful for gaming.
 

Veradun

Senior member
Jul 29, 2016
564
780
136
I'm excited about the 3080. A 3080Ti would have more ram and be faster, but I'm pretty confident it would also be more expensive.
That basically depends on sales, as seen or foreseen after AMD launches their stuff and MS and Sony do the same.

I can see scenarios in which the very obvious 3080 Ti that will come with 12GB ends up replacing the 3080 in Nvidia's lineup, with the 3080 getting a price cut accordingly.
 

AtenRa

Lifer
Feb 2, 2009
13,606
2,696
126
Not relevant. Nvidia buys wafers at Samsung with a guaranteed % of good dies. At the wafer scale, Nvidia is 30% cheaper than AMD at TSMC (with the same die size, of course).
Since we are not talking about the same die sizes here, what you are saying is irrelevant.

Nvidia got the wafers cheaper than at TSMC, but we do not know the percentage. We also don't know what AMD pays per wafer. AMD is TSMC's 4th-largest customer, and as such they get a better deal than smaller customers. Plus, let's not forget what is unarguably the most expensive reference cooler ever; this also eats into margins.
AMD is currently the number ONE 7nm customer at TSMC with 30K wpm; I'm sure they got a very nice deal.
 

Capt Caveman

Lifer
Jan 30, 2005
34,547
651
126
Thinking I will go 3090 FE. I'm really interested in an EVGA 3080 Ti 20GB, but it won't come out for a year and will most likely cost over $1k. And at that price, I might as well go 3090.
 
