Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, just interested in the forum members' thoughts.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
More so than that, they wanted better pricing from TSMC, so they started going with Samsung to try to get TSMC to keep nVidia's business at a lower cost to nVidia. TSMC said "nope", which left nVidia stuck with Samsung. They did manage to get enough 7nm capacity (at full price) for A100, but the consumer cards all ended up on Samsung. I don't think they ever intended to use 7nm EUV, though, as I don't think it's compatible with the 8nm process they ended up with (which is a modified 10nm process).

We have so many active YouTube rumor channels now, with a clear cash incentive to quickly fill any knowledge void with specious stories, that you really can't just believe the various tales surrounding this.

We will never really know why they switched. It was almost certainly some boring combination of technical and financial reasons. It doesn't really matter.
 

CakeMonster

Golden Member
Nov 22, 2012
1,389
496
136
If I understand correctly, the reviews go live soon, then ~1 day later the 3080s will be available to order, and the 3090s will follow about a week after that?
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
We have so many active YouTube rumor channels now, with a clear cash incentive to quickly fill any knowledge void with specious stories, that you really can't just believe the various tales surrounding this.

We will never really know why they switched. It was almost certainly some boring combination of technical and financial reasons. It doesn't really matter.

Yeah, too many people live by whatever YT clickbait has to say, eat it up, and then spew it out as truth.
 

traderjay

Senior member
Sep 24, 2015
220
165
116

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136

It's a monster miner!

-Oh god please no...
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
As the comments say, the power usage would make it useless for mining despite its high hash rate, although I expect them to be marked up for a few weeks anyway due to scalpers and general high demand.
 

Head1985

Golden Member
Jul 8, 2014
1,864
688
136
Reactions: psolord (Love)

CakeMonster

Golden Member
Nov 22, 2012
1,389
496
136
From the previous hype, as far as I recall, it didn't matter whether mining was profitable; lots of people still bought cards for mining because their parents or employer paid the power bill, and they were betting on coin prices going up forever anyway. Common sense is not a factor in many cases.
 

sze5003

Lifer
Aug 18, 2012
14,182
625
126
Benches and reviews look good for the 3080. I'm not expecting much more from the 3090 either, but I just want the extra VRAM, and I don't want to upgrade again for a few more years.
 

EVI

Junior Member
Apr 20, 2020
4
11
51
Many people were concerned that the 3080 series and above would be bottlenecked by PCIe 3.0. Initial tests are showing only a ~1% difference between PCIe 3.0 and PCIe 4.0, so it's not a major issue on the GPU side just yet.
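A rough back-of-the-envelope check on why that makes sense (my own math, not from the tests above): a x16 link already offers far more bandwidth than games typically push across the bus, so doubling it doesn't buy much yet.

```python
# Theoretical peak one-direction bandwidth of a PCIe x16 link.
# Assumed figures for illustration; real-world throughput is lower.
def x16_bandwidth_gb_s(transfer_rate_gt_s: float) -> float:
    lanes = 16
    encoding = 128 / 130  # 128b/130b line encoding used by PCIe 3.0 and newer
    return transfer_rate_gt_s * lanes * encoding / 8  # GT/s -> GB/s

print(f"PCIe 3.0 x16: {x16_bandwidth_gb_s(8.0):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: {x16_bandwidth_gb_s(16.0):.1f} GB/s")  # ~31.5 GB/s
```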

 
Reactions: Konan (Like)

blckgrffn

Diamond Member
May 1, 2003
9,123
3,058
136
www.teamjuchems.com
I might be the only one, but I would find it interesting if it had an efficiency switch on it - like a 250W setting that didn’t require software.

I wish my 5700 XT had one too, like 180W or something similar. The difference in performance is super negligible, and I just see less boosting over 2GHz, but the thermals and fan work are so much milder. Making sure the power limit sticks feels like babysitting, and I usually only notice it isn't being enforced when I'm in a game and not willing to stop playing to mess with it.

Based on one of these reviews, the 3080 is basically twice as fast as the 5700 XT at 4K, but as-tested power approaches 370W. I guess there is some great engineering to make that all work, but 2x performance at 2x the power budget isn't doing it for me.

1.9x perf at like 1.5x power, now we're talking. It seems to be only a setting away. I continue to be baffled by Nvidia's approach on this.

It feels like a 270W 3080 would have left more room for a 320W 3080S later.
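For anyone who wants that behavior today, here's a minimal "set it at boot and forget it" sketch, assuming nvidia-smi is installed, the account has admin/root rights, and the requested cap sits inside the board's allowed range (check with nvidia-smi -q -d POWER; the floor varies by card and vBIOS):

```python
# Sketch: apply a fixed GPU power cap at startup instead of babysitting an overlay.
# The 270 W target is an assumption for illustration; adjust for your card's allowed range.
import subprocess

TARGET_WATTS = 270  # hypothetical "efficiency mode" cap

subprocess.run(["nvidia-smi", "-pm", "1"], check=True)                # persistence mode (Linux)
subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)  # set the power limit

# Confirm the cap took effect
subprocess.run(["nvidia-smi", "--query-gpu=power.limit,power.draw",
                "--format=csv"], check=True)
```

Run it as a startup task and it behaves like that hardware switch, minus the hardware.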
 

Bouowmx

Golden Member
Nov 13, 2016
1,138
550
146
The TechPowerUp review says undervolting is not possible, but it actually still works the same way as on Turing: by editing the voltage-frequency curve.

Stock: 345 W
Undervolt to 0.806 V, 1800 MHz: 293 W
Performance in The Division 2 (frame rate): Identical
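Quick math on those numbers (frame rate taken as identical, per the review; the percentages are just my arithmetic):

```python
# Power saving and perf/W gain from the undervolt above, assuming identical frame rate.
stock_w, undervolt_w = 345, 293

power_saving = 1 - undervolt_w / stock_w        # ~15% less power
perf_per_watt_gain = stock_w / undervolt_w - 1  # ~18% better perf/W at the same fps

print(f"Power saving:       {power_saving:.1%}")
print(f"Perf/W improvement: {perf_per_watt_gain:.1%}")
```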

 
Reactions: Konan (Like)

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
JHH's lies aside, I'm not terribly convinced I should spend the money on a 3080 now. I'd find it remarkable if AMD somehow nudged up alongside all the models, including the 3090, for a slightly lower price and less power usage, leaving room for OC or higher-performing models later on.

Of course it's AMD so they'll find a way to mess this up.

I might be the only one, but I would find it interesting if it had an efficiency switch on it - like a 250W setting that didn’t require software.

I wish my 5700 XT had one too, like 180W or something similar. The difference in performance is super negligible, and I just see less boosting over 2GHz, but the thermals and fan work are so much milder. Making sure the power limit sticks feels like babysitting, and I usually only notice it isn't being enforced when I'm in a game and not willing to stop playing to mess with it.

Based on one of these reviews, the 3080 is basically twice as fast as the 5700 XT at 4K, but as-tested power approaches 370W. I guess there is some great engineering to make that all work, but 2x performance at 2x the power budget isn't doing it for me.

1.9x perf at like 1.5x power, now we're talking. It seems to be only a setting away. I continue to be baffled by Nvidia's approach on this.

It feels like a 270W 3080 would have left more room for a 320W 3080S later.
Two schools of thought here.

1. Doesn't AMD's software have a power-usage option? I know the Nvidia Control Panel has a selector for power use during various states. I have mine set to the bare minimum for everything. The performance hit is noticeable, unfortunately.

2. In regard to Supers, one would hope, maybe pray, that by the time the Supers come along, Samsung 8N will have matured, allowing lower voltages at the same clocks, or the Supers may be delivered on a more efficient and more mature node. IDK.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
It's not profitable. ETH is at 360 USD and it will soon be PoS, so it cannot be mined anymore.

That's hybrid PoW/PoS. Full PoS isn't coming for 2-3 years, barring any delays.

Although yeah, I wouldn't spend $699 for ETH mining either. It'd take about 10 months to reach ROI at the current state. Why would I do that when my already-paid-off triple RX 470/570s do that already?

Nvidia did the same thing AMD did with Vega: pushed the card past the efficient part of the perf/watt curve to meet the target performance.

You can reduce the power limit to 290W with minimal performance loss. Actually, per the 3000-series reviews thread, it's a 5% impact at 270W.
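Back-of-the-envelope on that 10-month figure; every input below is an assumption for illustration (daily earnings, power draw, and electricity cost all move around), not a measured number:

```python
# Rough mining ROI sketch. All inputs are assumptions; plug in current figures before trusting it.
card_cost_usd = 699.0
daily_revenue_usd = 3.00        # assumed gross ETH revenue per day for one card
card_power_w = 290              # assumed power-limited draw while mining
electricity_usd_per_kwh = 0.10  # assumed; effectively $0 if someone else pays the bill

daily_power_cost = card_power_w / 1000 * 24 * electricity_usd_per_kwh
daily_profit = daily_revenue_usd - daily_power_cost

print(f"Daily profit:  ${daily_profit:.2f}")
print(f"Months to ROI: {card_cost_usd / daily_profit / 30:.1f}")  # ~10 months with these inputs
```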
 

Ajay

Lifer
Jan 8, 2001
15,431
7,849
136
I look forward to the first round of units catching fire.
Given the peak power usage on the 3080, I just can't fathom how much the 3090 will actually be using - no wonder the card is so huge.

Heh, just thought of this. A 10900K + 3090 is going to need a minimum 1000W PSU. Reminds me of the old days running SLI and HEDT CPUs.
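Rough PSU math behind that, with ballpark worst-case draws that are my assumptions rather than measured numbers:

```python
# Ballpark PSU sizing for a 10900K + 3090 build. All draws are assumed figures for illustration.
components_w = {
    "RTX 3090 (incl. headroom for transient spikes)": 400,
    "i9-10900K under heavy load": 250,
    "motherboard, RAM, drives, fans": 75,
}

total_w = sum(components_w.values())
recommended_w = total_w / 0.75  # keep sustained load around ~75% of the PSU rating

print(f"Estimated peak draw: {total_w} W")
print(f"Suggested PSU size:  ~{recommended_w:.0f} W")  # lands close to 1000 W
```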
 