
Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 164

Stuka87

Diamond Member
Dec 10, 2010
5,024
766
126
Can you tell me why, in brief, please?
They wanted to use SS7 EUV, but yields weren't good enough. Presumably they just took the SS8 design that is being used for GA107/GA108 and scaled it up, rather than, say, porting it to TSMC, which would have meant next year.
More so than that, they wanted better pricing from TSMC, so they started going with Samsung to try to get TSMC to keep nVidia's business at a lower cost to nVidia. TSMC said "nope", which left nVidia stuck with Samsung. They did manage to get enough 7nm capacity (at full price) for A100, but the consumer cards all ended up on Samsung. I don't think they ever intended to use 7EUV, though, as I don't think it's compatible with the 8nm process they ended up with (which is a modified 10nm process).
 

guidryp

Senior member
Apr 3, 2006
356
159
116
More so than that, they wanted better pricing from TSMC, so they started going with Samsung to try to get TSMC to keep nVidia's business at a lower cost to nVidia. TSMC said "nope", which left nVidia stuck with Samsung. They did manage to get enough 7nm capacity (at full price) for A100, but the consumer cards all ended up on Samsung. I don't think they ever intended to use 7EUV, though, as I don't think it's compatible with the 8nm process they ended up with (which is a modified 10nm process).
We have so many active YouTube rumor channels now, with a clear cash incentive to quickly fill any knowledge void with specious stories, that you really can't just believe the various tales surrounding this.

We will never really know why they switched. It was almost certainly some boring combination of technical and financial reasons. It doesn't really matter.
 

CakeMonster

Senior member
Nov 22, 2012
953
60
91
If I understand correctly, the reviews go live in not too long, then ~1 day later the 3080s will be available to order, and then in another week the 3090s will be available to order?
 

DooKey

Golden Member
Nov 9, 2005
1,618
267
126
We have so many active YouTube rumor channels now, with a clear cash incentive to quickly fill any knowledge void with specious stories, that you really can't just believe the various tales surrounding this.

We will never really know why they switched. It was almost certainly some boring combination of technical and financial reasons. It doesn't really matter.
Yeah, too many people live by what YT clickbait has to say, eat it up, and then spew it out as truth.
 

traderjay

Member
Sep 24, 2015
197
142
116

GodisanAtheist

Platinum Member
Nov 16, 2006
2,366
760
136

It's a monster miner!
- Oh god, please no...
 

CP5670

Diamond Member
Jun 24, 2004
4,459
44
91
As the comments say, the power usage would make it useless for mining despite its high hash rate, although I expect them to be marked up for a few weeks anyway with scalpers and general high demand.
 

CakeMonster

Senior member
Nov 22, 2012
953
60
91
From the previous hype, as far as I recall, it didn't matter whether mining was profitable; lots of people still bought cards for mining because their parents or employer paid the power bill, and they were betting on coin prices going up forever anyway. Common sense is not a factor in many cases.
 

uzzi38

Golden Member
Oct 16, 2019
1,055
1,580
96
So the 3080 seems to be exactly as rumoured. Roughly 30% over the 2080Ti. Nice.

What's less nice, however, is this:
[attached image]
 

sze5003

Lifer
Aug 18, 2012
12,987
245
106
Benches and reviews look good for the 3080. I'm not expecting much more from the 3090 either, but I just want the extra VRAM, and I don't want to upgrade for a few more years.
 

EVI

Junior Member
Apr 20, 2020
2
9
41
Many people were concerned that the 3080 series and above would be bottlenecked by PCIe 3.0. Initial tests are showing a ~1% difference between PCIe 3.0 and PCIe 4.0, so it's not a major issue on the GPU side just yet.
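For anyone who wants to check what link their own card is actually negotiating, here's a minimal sketch using the NVML Python bindings (assuming the nvidia-ml-py / pynvml package is installed; it reads the same counters nvidia-smi reports):

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
print(f"Current PCIe link: Gen{gen} x{width}")
pynvml.nvmlShutdown()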

 
  • Like
Reactions: Konan

blckgrffn

Diamond Member
May 1, 2003
7,087
367
126
www.teamjuchems.com
I might be the only one, but I would find it interesting if it had an efficiency switch on it - like a 250W setting that didn’t require software.

I wish my 5700 XT had one too, like 180W or something similar. The difference in performance is super negligible, and I just see less boosting over 2GHz, but the thermals and fan work are so much milder. Making sure the power limit sticks feels like babysitting, and I normally notice it's not being enforced when I'm in a game and not willing to stop playing to dork with it.

Based on one of these reviews, the 3080 is basically twice as fast as the 5700 XT at 4K, but as-tested power is approaching 370W. I guess there is some great engineering to make that all work, but 2x performance at 2x power budget isn't doing it for me.

1.9x perf at like 1.5x power, now we are talking. It seems to be only a setting away. I continue to be baffled by Nvidia's approach on this.

It feels like a 270W 3080 would have left more room for a 320W 3080S later.
 

Bouowmx

Golden Member
Nov 13, 2016
1,028
404
116
The TechPowerUp review says undervolting is not possible, but it still works the same way as on Turing: by editing the voltage-frequency curve.

Stock: 345 W
Undervolt to 0.806 V, 1800 MHz: 293 W
Performance in The Division 2 (frame rate): Identical
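
If anyone wants to sanity-check an undervolt like that on their own card, here's a minimal sketch that polls board power and graphics clock through the NVML Python bindings (assuming the nvidia-ml-py / pynvml package; run it while a game or benchmark is loading the GPU):

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
for _ in range(30):  # sample once per second for ~30 seconds
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
    clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"{power_w:6.1f} W @ {clock_mhz} MHz")
    time.sleep(1)
pynvml.nvmlShutdown()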

 
  • Like
Reactions: Konan

A///

Senior member
Feb 24, 2017
769
516
106
JHH's lies aside, I'm not terribly convinced I should spend the money on a 3080 now. I'd find it remarkable if AMD somehow nudged up alongside all the models, including the 3090, at a slightly lower price and with less power usage, leaving room for OC or higher-performing models later on.

Of course it's AMD so they'll find a way to mess this up.

I might be the only one, but I would find it interesting if it had an efficiency switch on it - like a 250W setting that didn’t require software.

I wish my 5700 XT had one too, like 180W or something similar. The difference in performance is super negligible, and I just see less boosting over 2GHz, but the thermals and fan work are so much milder. Making sure the power limit sticks feels like babysitting, and I normally notice it's not being enforced when I'm in a game and not willing to stop playing to dork with it.

Based on one of these reviews, the 3080 is basically twice as fast as the 5700 XT at 4K, but as-tested power is approaching 370W. I guess there is some great engineering to make that all work, but 2x performance at 2x power budget isn't doing it for me.

1.9x perf at like 1.5x power, now we are talking. It seems to be only a setting away. I continue to be baffled by Nvidia's approach on this.

It feels like a 270W 3080 would have left more room for a 320W 3080S later.
Two schools of thought here.

1. Doesn't AMD's software have a power-usage option? I know the Nvidia Control Panel has a power-management selector for various states; I have mine set to the bare minimum for everything. The performance hit is noticeable, unfortunately.

2. In regard to Supers, one would hope, maybe pray, that by the time the Supers come along, Samsung 8N will have matured, leading to lower clocks, or that the Supers will be delivered on a more efficient and mature node. IDK.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,095
1,628
136
It's not profitable. ETH is at 360 USD and it will soon be PoS, so it cannot be mined anymore.
That's hybrid PoW/PoS. Full PoS isn't coming for 2-3 years, barring any delays.

Although, yeah, I wouldn't spend $699 for ETH mining either. It'd take 10 months to reach ROI at the current state. Why would I do that when my already-paid-off trio of RX 470/570s does that already?

Nvidia did the same as AMD did with Vega: pushed the card past the efficient part of the perf/watt curve to meet the target performance.
You can reduce the power limit to 290W and have minimal performance loss. Actually, per the 3000-series reviews thread, it's a 5% impact at 270W.
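
For what it's worth, dropping the power limit like that doesn't need the vendor GUI tools; here's a minimal sketch via the NVML Python bindings (assuming the nvidia-ml-py / pynvml package, admin/root rights, and that 290 W falls inside the limit range the card's BIOS allows):

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = 290 * 1000  # 290 W, expressed in milliwatts
if min_mw <= target_mw <= max_mw:
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs admin/root
    print("Power limit set to 290 W")
else:
    print(f"290 W is outside this card's allowed range ({min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()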
 
