
Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 
More so than that, they wanted better pricing from TSMC, so they started going with Samsung to try to get TSMC to keep Nvidia's business at a lower cost to Nvidia. TSMC said "nope", which left Nvidia stuck with Samsung. They did manage to get enough 7nm capacity (at full price) for A100, but the consumer cards all ended up on Samsung. I don't think they ever intended to use 7EUV, though, as I don't think it's compatible with the 8nm process they ended up with (which is a modified 10nm process).

We have so many active YouTube rumor channels now, with a clear cash incentive to quickly fill any knowledge void with specious stories, that you really can't just believe the various tales surrounding this.

We will never really know why they switched. It was almost certainly some boring combination of technical and financial reasons. It doesn't really matter.
 
If I understand correctly, the reviews go live in not too long, then ~1 day later the 3080s will be available to order, and then in another week the 3090s will be available to order?
 
We have so many active YouTube rumor channels now, with a clear cash incentive to quickly fill any knowledge void with specious stories, that you really can't just believe the various tales surrounding this.

We will never really know why they switched. It was almost certainly some boring combination of technical and financial reasons. It doesn't really matter.

Yeah, too many people live by what YT clickbait has to say, eat it up, and then spew it out as truth.
 

It's a monster miner!

-Oh god please no...
 
As the comments say, the power usage would make it useless for mining despite its high hash rate. Although I expect them to be marked up for a few weeks anyway with scalpers and general high demand.
 
From the previous hype, as far as I recall, it didn't matter whether mining was profitable: lots of people still bought cards for mining because their parents or employer paid the power bill, and they were betting on coin prices going up forever anyway. Common sense is not a factor in many cases.
 
So the 3080 seems to be exactly as rumoured. Roughly 30% over the 2080Ti. Nice.

What's less nice however is this:
[attached image: 1600267762235.png]
 
Benches and reviews look good for the 3080. Not expecting much more from the 3090 either, but I just want the extra VRAM, since I don't plan to upgrade for a few more years.
 
Many people were concerned that the 3080 series and above would be bottlenecked by PCIe 3.0. Initial tests are showing roughly a 1% difference between PCIe 3.0 and PCIe 4.0, so it's not a major issue on the GPU side just yet.
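For context on why the headroom is still there, a quick sketch of the theoretical per-direction bandwidth of a x16 link from the published PCIe signaling rates (8 GT/s per lane for 3.0, doubled for 4.0, both with 128b/130b encoding):

```python
# Theoretical per-direction bandwidth of a x16 PCIe link.
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding; PCIe 4.0 doubles the rate.
def x16_bandwidth_gbps(gt_per_s):
    # 128b/130b encoding carries 128 payload bits per 130 transferred bits
    per_lane_gbytes = gt_per_s * (128 / 130) / 8  # GB/s per lane
    return 16 * per_lane_gbytes

gen3 = x16_bandwidth_gbps(8.0)   # ~15.75 GB/s
gen4 = x16_bandwidth_gbps(16.0)  # ~31.5 GB/s
print(round(gen3, 2), round(gen4, 2))
```

A ~1% benchmark delta suggests current games rarely saturate even the Gen3 figure.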

 
I might be the only one, but I would find it interesting if it had an efficiency switch on it - like a 250W setting that didn’t require software.

I wish my 5700 XT had one too, like 180W or something similar. The difference in performance is negligible: I just see less boosting over 2GHz, but the thermals and fan noise are so much milder. Enforcing the power limit manually feels like babysitting, and I usually only notice it isn't being applied once I'm in a game and not willing to stop playing to fiddle with it.

Based on one of these reviews, the 3080 is basically twice as fast as the 5700 XT at 4K, but as-tested power is approaching 370W. I guess there is some great engineering to make that all work, but 2x performance at a 2x power budget isn't doing it for me.

1.9x perf at something like 1.5x power, now we're talking. It seems to be only a setting away. I continue to be baffled by Nvidia's approach on this.

It feels like a 270W 3080 would have left more room for a 320W 3080S later.
 
The TechPowerUp review says undervolting is not possible, but it actually still works the same way as on Turing: by editing the voltage-frequency curve.

Stock: 345 W
Undervolt to 0.806 V, 1800 MHz: 293 W
Performance in The Division 2 (frame rate): Identical
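Putting those quoted numbers together, the undervolt cuts a sizeable fraction of board power for the same frame rate:

```python
# Power saved by the quoted undervolt (0.806 V @ 1800 MHz), using the
# figures from the post above: 345 W stock vs 293 W undervolted,
# with identical Division 2 frame rates.
stock_w, undervolt_w = 345, 293
savings = (stock_w - undervolt_w) / stock_w
print(f"{savings:.1%}")  # ~15% less power at the same performance
```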

 
JHH lies aside, I'm not terribly convinced I should spend the money on a 3080 now. I'd find it remarkable if AMD somehow nudged up alongside all the models, including the 3090, for a slightly lower price and less power usage, leaving room for OC or higher-performing models later on.

Of course it's AMD so they'll find a way to mess this up.

I might be the only one, but I would find it interesting if it had an efficiency switch on it - like a 250W setting that didn’t require software.

I wish my 5700 XT had one too, like 180W or something similar. The difference in performance is negligible: I just see less boosting over 2GHz, but the thermals and fan noise are so much milder. Enforcing the power limit manually feels like babysitting, and I usually only notice it isn't being applied once I'm in a game and not willing to stop playing to fiddle with it.

Based on one of these reviews, the 3080 is basically twice as fast as the 5700 XT at 4K, but as-tested power is approaching 370W. I guess there is some great engineering to make that all work, but 2x performance at a 2x power budget isn't doing it for me.

1.9x perf at something like 1.5x power, now we're talking. It seems to be only a setting away. I continue to be baffled by Nvidia's approach on this.

It feels like a 270W 3080 would have left more room for a 320W 3080S later.
Two schools of thought here.

1. Doesn't AMD's software have a power-usage option? I know the Nvidia control panel has a selector for power use during various states. I have mine set to the bare minimum for everything. The performance hit is noticeable, unfortunately.

2. In regard to Supers, one would hope, maybe pray, that by the time the Supers come along, Samsung 8N will have matured, allowing lower power at the same clocks, or the Supers may be delivered on a more efficient and mature node. IDK.
 
It's not profitable. ETH is at $360, and it will soon move to PoS, so it cannot be GPU-mined anymore.

That's hybrid PoW/PoS. Full PoS isn't coming for 2-3 years, barring any delays.

Although yeah, I wouldn't spend $699 for ETH mining either. It'd take 10 months to reach ROI at current rates. Why would I do that when my already-paid-off triple RX 470/570s do the job already?
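The "10 months to ROI" figure can be sketched as a simple payback calculation. Every input below is an assumption chosen for illustration (revenue, power draw, and electricity price are hypothetical, not measured), with only the $699 card price taken from the post:

```python
# Hypothetical payback sketch for a $699 card used for ETH mining.
# All inputs other than the card price are illustrative assumptions.
card_price = 699.0          # USD (from the post)
daily_revenue = 3.00        # assumed gross mining revenue, USD/day
daily_power_cost = 0.70     # assumed: ~0.29 kW * 24 h * $0.10/kWh
net_per_day = daily_revenue - daily_power_cost
days_to_break_even = card_price / net_per_day
print(round(days_to_break_even))  # ~304 days, roughly ten months
```

Small changes in coin price or electricity cost swing the result by months, which is why already-paid-off cards look so much better.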

Nvidia did the same as AMD with Vega: pushed the card past the efficient part of the perf/watt curve to meet the target performance.

You can reduce the power limit to 290W with minimal performance loss. Actually, per the 3000-series reviews thread, it's a ~5% impact at 270W.
 