Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 146 - AnandTech Forums

Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent or uncalled for; I'm just interested in forum members' thoughts.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
No, they don't get better prices. Why should they? TSMC has no competition, so AMD is paying whatever TSMC wants.

Not sure if you are joking or not here.

When you negotiate a deal with someone to buy more product than their other customers, you get better prices than what you had before.
It's simple; as JHH was saying:

The more you buy, the less you pay :D
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Tend to agree with others. AMD has limited wafer allocation, and with that:
1) They have to make lots of console chips; this must be in the contract.

For the rest, do you:
a) make lots of CPUs, which are already ahead of the competition performance-wise, while that competitor is still busy shooting itself in the foot, or
b) make GPUs, where you are currently behind your competitor in performance and features, and that competitor just released a new range with one of the biggest performance uplifts ever?

It's not rocket science: they will make CPUs. They'll do just enough GPUs to stay in the market.
 
  • Like
Reactions: Konan and ozzy702

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
Citation needed. You are using that number all over the thread without a single credible source (I doubt one exists), so I have to assume you pulled it out of a very dark place.

At best it helps NV keep their margins higher, but without the new powerful consoles and AMD being competitive, this price correction would not have happened. In fact, the correction was overdue anyway: just like Intel was for a long time, NV was its own competition, and they needed a big performance/$ improvement to get their existing users to upgrade.

Datacenter margins increasing, yes, that makes sense, but not by that much. Plus, on some level the huge dies NV is making are an artifact of that, because many of those transistors are there for datacenter/workstation use, not gaming. I'm pretty sure DLSS was born from the need to make those otherwise-useless transistors do something for gaming.
The Reddit Nvidia Q&A had this interesting statement from an Nvidia rep:

Ray tracing denoising shaders are good examples that might benefit greatly from doubling FP32 throughput.

So much for needing the Tensor cores for this.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Tend to agree with others. AMD has limited wafer allocation, and with that:
1) They have to make lots of console chips; this must be in the contract.

For the rest, do you:
a) make lots of CPUs, which are already ahead of the competition performance-wise, while that competitor is still busy shooting itself in the foot, or
b) make GPUs, where you are currently behind your competitor in performance and features, and that competitor just released a new range with one of the biggest performance uplifts ever?

It's not rocket science: they will make CPUs. They'll do just enough GPUs to stay in the market.

AMD needs approximately 5K wafers per month for 1 million Xbox Series X console chips (360 mm² die, ~200 good dies per wafer).
In H2 2020 AMD has 30K 7nm wafers at TSMC, making it the number-one 7nm customer.
Also note, and not many have realized this: AMD increased its inventory by ~30% last quarter (Q2 2020), to $1.3B, and it had a six-month lead before the new console launch in December 2020.

So, they have enough wafer capacity for their CPUs, GPUs, and the new console chips.

On the other hand, NVIDIA, with the 627 mm² Ampere GA102 chip found in the RTX 3080 and RTX 3090, can get fewer than 50 good dies per wafer.
That means they need roughly 4x the wafer volume for the same number of chips. Because of this, I don't expect to see a lot of RTX 3080/3090 stock at retail in the first 2-3 months after the Ampere launch.
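As a sanity check on those die counts, a common back-of-envelope gross-die formula can be applied. This is only a rough sketch: it assumes a standard 300 mm wafer and a square die, and ignores defect yield, scribe lines, and die aspect ratio.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order gross-die estimate: wafer area over die area,
    minus an edge-loss correction term for partial dies at the rim."""
    radius = wafer_diameter_mm / 2.0
    area_term = math.pi * radius * radius / die_area_mm2
    edge_term = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(area_term - edge_term)

print(gross_dies_per_wafer(360.0))  # Xbox Series X SoC, ~360 mm2 -> ~161 gross
print(gross_dies_per_wafer(627.0))  # Ampere GA102, ~627 mm2 -> ~86 gross
```

These are gross candidates before defect yield, so the post's "~200 good" console dies is on the optimistic side of this estimate, and "<50 good" GA102 dies implies yield well below the ~86 gross candidates. Either way, the roughly 2:1 gross-die gap supports the underlying point that GA102 consumes far more wafer area per chip.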
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
nVidia has made it clear that their gaming business is growing 25% Q-Q. They will have enough supply to hit that target. Demand, on the other hand...
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
No, they don't get better prices. Why should they? TSMC has no competition, so AMD is paying whatever TSMC wants.

Apparently you are unaware of this thing called 'economies of scale'. If you buy one of something, you pay X. If you buy a thousand, you pay less per unit than if you bought one. If you buy tens of thousands on a long-term order, you pay less than in either of the previous cases. TSMC WANTS people to place large, long-term orders, so it gives better deals to its large, long-term customers.

Or TSMC is just giving the wafer allocation to nVidia and co. Maybe AMD can go back to GlobalFoundries...

Huh? Seriously, what does this even mean?
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136

The 3080 runs 4K at roughly 40-50% higher average FPS than the 2080 Ti...

F in chat for anyone who bought a 2080 Ti in the last few months.
The 3080 is the card to get this time; the 3070 and 3090 look like poor value compared to it.
With the 3090 you pay 2x more for maybe 10-15% more performance.
The 3070 will be 40-50% slower with less VRAM, for only $200 less.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
So if Nvidia sells out within minutes it's because of demand?

Just ask yourself one question....Is it love or just a school boy crush?

Yes? Look up what the "Osborne effect" is. The 3080 is more than twice as fast as the 5700 XT, which sold last year for over $400 with 8GB. The demand will be there.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Yes? Look up what the "Osborne effect" is. The 3080 is more than twice as fast as the 5700 XT, which sold last year for over $400 with 8GB. The demand will be there.

So you really have no idea in the end. You're just guessing that supply will be plentiful while demand will be greater.

Why would you bring up the 5700 XT? Pretty much anybody who has one and wants to sell it can get most of their money back right now thanks to the mining craze.

On another note... how's the resale value on the 2080 Ti? I'm guessing you didn't mention that for a reason.

About the only credit I'll give you is that you're consistently biased towards Nvidia.
 

Mopetar

Diamond Member
Jan 31, 2011
8,113
6,768
136
It definitely looks like the 3080 is going to be a great performance-per-dollar card. The memory might rear its ugly head a few years down the road, but considering where both the 3070 and 3090 land performance-wise and what they cost, it's hard to consider anything else if you're going with NVidia.

Even if it's a power hog it's going to be winter soon and it can double as a space heater. :p
 
Feb 4, 2009
35,284
16,766
136
It definitely looks like the 3080 is going to be a great performance-per-dollar card. The memory might rear its ugly head a few years down the road, but considering where both the 3070 and 3090 land performance-wise and what they cost, it's hard to consider anything else if you're going with NVidia.

Even if it's a power hog it's going to be winter soon and it can double as a space heater. :p

Way out of my price range, but I agree.
What I don't understand is why it's so difficult to get more memory on cards lately.
It seems like an 8GB card is rare, and 10GB seems like a weird number.
Why is it so difficult to have 8GB be the gaming norm and 16GB be the halo product?
I understand there may not be a use case for that memory, but the market appears to want it.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,109
136
Not sure if you are joking or not here.

When you negotiate a deal with someone to buy more product than their other customers, you get better prices than what you had before.
It's simple; as JHH was saying:

The more you buy, the less you pay :D
In the past, TSMC had something like a bidding process for wafer allocation priority. Apple always won because they would pay more and buy risk wafers, which helped TSMC refine its process.
 

Mopetar

Diamond Member
Jan 31, 2011
8,113
6,768
136
Way out of my price range, but I agree.
What I don't understand is why it's so difficult to get more memory on cards lately.
It seems like an 8GB card is rare, and 10GB seems like a weird number.
Why is it so difficult to have 8GB be the gaming norm and 16GB be the halo product?
I understand there may not be a use case for that memory, but the market appears to want it.

I think it just comes down to using GDDR6X and having to choose between 1 GB and 2 GB modules. So either you get 10 GB, which feels too small, or you pay a lot more for 20 GB, which seems like more than you'd need for the life of the card. If someone made 1.5 GB modules, a 15 GB card would hit the perfect sweet spot.
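The 10/20 GB split follows mechanically from the bus: each GDDR6X chip presents a 32-bit interface, so a 320-bit bus fixes the chip count at ten, and total capacity is just chips times module density. A quick sketch (the 320- and 384-bit widths are the 3080/3090's public bus specs; the 1.5 GB module is hypothetical):

```python
def vram_options(bus_width_bits: int, module_gb=(1.0, 2.0)) -> dict:
    """Possible VRAM capacities for a given memory bus width.

    Each GDDR6X chip has a 32-bit interface, so the chip count is fixed
    by the bus, and total capacity = chip count * module density (GB).
    """
    chips = bus_width_bits // 32
    return {gb: chips * gb for gb in module_gb}

print(vram_options(320))          # 3080's 320-bit bus: 10 chips -> 10 or 20 GB
print(vram_options(384))          # 384-bit bus: 12 chips -> 12 or 24 GB
print(vram_options(320, (1.5,)))  # hypothetical 1.5 GB modules -> 15 GB
```

(The 3090 reaches 24 GB with 1 GB chips by mounting two per channel in clamshell mode, which this simple sketch does not model.)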

In the past, TSMC had something like a bidding process on wafer allocation priority. Apple always won because they would pay more and buy risk wafers, which helped TSMC refine their process.

That's not much different from any other business. If you want more of their supply, just offer to pay more. It discourages inefficient use.

Apple can sell their devices at a premium and pass on any extra cost. Making smaller SoCs is also more tolerable while a process is still working out the kinks and has a higher defect rate. If you're making massive GPU dies, they have to be for the enterprise market, where they can sell for thousands or tens of thousands of dollars instead of hundreds.
 
Feb 4, 2009
35,284
16,766
136
I think it just comes down to using GDDR6X and having to choose between 1 GB and 2 GB modules. So either you get 10 GB, which feels too small, or you pay a lot more for 20 GB, which seems like more than you'd need for the life of the card. If someone made 1.5 GB modules, a 15 GB card would hit the perfect sweet spot.



That's not much different from any other business. If you want more of their supply, just offer to pay more. It discourages inefficient use.

Apple can sell their devices at a premium and pass on any extra cost. Making smaller SoCs is also more tolerable while a process is still working out the kinks and has a higher defect rate. If you're making massive GPU dies, they have to be for the enterprise market, where they can sell for thousands or tens of thousands of dollars instead of hundreds.

Ah, so they all need to be the same size: no doing 6x2GB plus 4x1GB for 16GB total.
 

sze5003

Lifer
Aug 18, 2012
14,246
639
126
Way out of my price range, but I agree.
What I don't understand is why it's so difficult to get more memory on cards lately.
It seems like an 8GB card is rare, and 10GB seems like a weird number.
Why is it so difficult to have 8GB be the gaming norm and 16GB be the halo product?
I understand there may not be a use case for that memory, but the market appears to want it.
Well, one of the reasons for leaving out such things and creating a big gap between the cards is so they can charge the prices they want. Compared to a 1080 Ti it's only 1GB less, though, so is that really a big deal?

I guess that depends on the person and how they game or want to use the card.
 

Martimus

Diamond Member
Apr 24, 2007
4,488
153
106
Way out of my price range, but I agree.
What I don't understand is why it's so difficult to get more memory on cards lately.
It seems like an 8GB card is rare, and 10GB seems like a weird number.
Why is it so difficult to have 8GB be the gaming norm and 16GB be the halo product?
I understand there may not be a use case for that memory, but the market appears to want it.
It has to do with the GDDR6X memory they are using: its chips currently top out at half the per-chip capacity available for GDDR6. That will likely be rectified as time goes on, but for now it made it difficult for Nvidia to put large memory capacities on their new GPUs. Also, Nvidia usually differentiates its professional products from its gaming products by memory capacity, so that is likely another reason.
 

CakeMonster

Golden Member
Nov 22, 2012
1,502
659
136
Everything points to there being room for another card between the 10GB 3080 and the 24GB 3090. Maybe the performance gap is a bit too narrow (remains to be seen), but that has not stopped them in the past. So yeah, count on it.

HOWEVER, be careful about basing your purchasing decision on something that is not confirmed to exist and has no launch date. For every post I read from someone saying they need the upgrade but have decided to wait for the "3080 Ti", I worry that they have not considered the whole picture from a price/value perspective, given how long they may have to wait.

Take the 2080 Ti: not a good value price/performance-wise, BUT you had the top card for two years, which sweetened the deal a whole lot. The problem was you couldn't predict that beforehand, as there was always a yearly upgrade before 2018. If there is a two-year gap now, waiting for a 3080 Ti that launches in 6 months is a good deal. If there is a one-year gap, it would not be. If a 3080 Ti launches in 12 months, with a two-year gap before the next top card, I guess it's still decent, but that's 12 months of lost performance if the slowness of your current card is already bothering you.

I'm NOT saying "just buy it"; I don't want to get accused of that. But be aware of what it costs to stay on your old hardware while betting on something we don't even know will exist, on an unknown time frame.
 

Konan

Senior member
Jul 28, 2017
360
291
106
The rumor I'm hearing is that a 20GB 3080 model is still real, and we may know more in October. Also, GDDR6X is quite a bit more expensive (~$14-20 per GB), so if the 20GB model is coming, I'd expect it to be $100-200 more than the 10GB 3080.
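Taking the quoted per-GB figure at face value, the memory bill-of-materials delta for a hypothetical 20GB model lands close to that price gap. A rough sketch only: board, power, and validation costs are ignored, and the $/GB range is the rumored one, not a confirmed price.

```python
def memory_cost_delta(extra_gb: float, cost_low: float, cost_high: float) -> tuple:
    """Extra memory cost range, in dollars, for a larger-VRAM SKU."""
    return (extra_gb * cost_low, extra_gb * cost_high)

# 20GB model vs the 10GB 3080, at the rumored ~$14-20/GB for GDDR6X
low, high = memory_cost_delta(10, 14, 20)
print(f"${low:.0f}-${high:.0f} extra for the memory alone")
```

So the memory alone accounts for roughly $140-200 of added cost, which is consistent with the upper end of the rumored $100-200 retail premium.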