Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
What kind of gains is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 4K60, at least 90 fps?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for; I'm just interested in the forum members' thoughts.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
It's annoying there are no HDR1000 FreeSync monitors

Might be others, but they exist.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
- The rumor is almost certainly nonsense; NV would just accelerate into Hopper or whatever their next-gen release is rather than piddle about with Ampere.

But to the bolded statement above, I've got a healthy level of skepticism of any qualitative statements like "Very good". It's like saying you look "very good" compared to a decaying pile of elephant dung, but that's still not really a good thing :p

Yeah, you need to be careful with CEO statements. Remember when JHH said that most of their 7nm would come from TSMC and this made a few people think that NVidia wasn't using Samsung?

Well, he was completely truthful, but the correct interpretation was that they weren't using a 7nm Samsung process, so by definition "most" of their 7nm products are being made at TSMC, because they don't have any other 7nm products being made elsewhere.

CEOs have mastered a language of making a statement that will lead people to a certain belief even though they didn't actually say it outright. The CEO didn't lie; you just misunderstood them. Unless you really stop to analyze what they're saying and dissect it enough to see how it squares with other information, you'll most likely be misled.

I don't even fault them for any of that. Investors tend to be a pack of panicky idiots and that's on top of generally not knowing a lot about how business is done. Keeping them from doing anything stupid is practically half of a CEO's job these days.
 

lightmanek

Senior member
Feb 19, 2017
387
754
136
Take it with a grain of salt, but I have it on good authority that for the whole of Poland, a country of around 40 million, there were 50 RTX 3080s. Just one of the larger online shops here took more than 700 preorders in 24 hours, and there are at least four other big shops with similar volumes.
Nvidia begged the large etailers to stop taking preorders the next day, but for competitive reasons no shop complied.

That was the first 24 hours after launch.
Today I played Quake 2 RTX on one of these 50 cards; it ran really well 👌
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
Yeah, you need to be careful with CEO statements. Remember when JHH said that most of their 7nm would come from TSMC and this made a few people think that NVidia wasn't using Samsung?


That "CEO statement" seems to be referred to second hand, as something from Chinese media with no direct source.

The funny thing is people seem to want to apply greater care to CEO statements than they do to outright rumors.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
The funny thing is people seem to want to apply greater care to CEO statements than they do to outright rumors.

A CEO statement is from a known source and is almost assuredly publicly available so that anyone can see it for themselves. It's one of the most authoritative sources there can be.

Idiots making up nonsense don't face any kind of legal ramifications for what they say. Treating a CEO in the same way seems rather silly and I suspect people have a gut-level reaction that approximates that.

Also, there's only one NVidia CEO, while there's a whole field of rumor mongers across YouTube and the wider internet. It's not unusual to apply greater care to a single thing than to any one of a large set, much like parents of a single child are often more involved in that one child's life than they would be in the life of any one child out of a family with a dozen children.
 
  • Like
Reactions: xpea

Glo.

Diamond Member
Apr 25, 2015
5,707
4,552
136
Take it with a grain of salt, but I have it on good authority that for the whole of Poland, a country of around 40 million, there were 50 RTX 3080s. Just one of the larger online shops here took more than 700 preorders in 24 hours, and there are at least four other big shops with similar volumes.
Nvidia begged the large etailers to stop taking preorders the next day, but for competitive reasons no shop complied.

That was the first 24 hours after launch.
Today I played Quake 2 RTX on one of these 50 cards; it ran really well 👌
I presume that larger online shop was X-Kom? ;)
 
  • Like
Reactions: lightmanek

Ajay

Lifer
Jan 8, 2001
15,451
7,861
136
A CEO statement is from a known source and is almost assuredly publicly available so that anyone can see it for themselves. It's one of the most authoritative sources there can be.

“At the GTC 2019 annual meeting held in Suzhou, China last year, Nvidia CEO Huang Renxun said that most of the next-generation 7nm GPU will be manufactured by TSMC.”. From https://www.gizchina.com/2020/05/05/tsmc-wins-orders-for-nvidia-7nm-and-5nm-chips/

So, no direct quote that I can find, which would make it third-party commentary.
 

ksosx86

Member
Sep 27, 2012
105
44
101
It's annoying there are no HDR1000 FreeSync monitors

Might be others, but they exist.
Yeah, I use the Acer Predator CG437K Pbmiiippuzx 43" myself.
So some random dude who hasn't posted for two years shows up just to vote down my comment.
Care to elaborate @rolodomo ?
HAH that's hilarious - sounds like me. Back from 2012.
 

rolodomo

Senior member
Mar 19, 2004
269
9
81
So some random dude who hasn't posted for two years shows up just to vote down my comment.
Care to elaborate @rolodomo ?
Apologies, the first down vote was a mistake and was removed. I must have accidentally clicked on the down arrow while scrolling through the thread. The second down vote is intentional for overreacting to a meaningless mistake and pinging me (a down vote on an internet forum, the horror, lol).
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Apologies, the first down vote was a mistake and was removed. I must have accidentally clicked on the down arrow while scrolling through the thread. The second down vote is intentional for overreacting to a meaningless mistake and pinging me (a down vote on an internet forum, the horror, lol).

Hey! My uncle got downvoted on the internet and now he's in the hospital! Let's not make light of the super serious matter that is fake internet points.
 
  • Haha
Reactions: lightmanek

DJinPrime

Member
Sep 9, 2020
87
89
51
I am not trying to say it isn't better. But HDR1000 at 15ft isn't much different than HDR400 at 2ft.
HDR1000 makes a difference in more than just brightness. Those monitors also have local dimming zones and features that HDR400 monitors do not have. Also, the brightness rating is not for the entire screen all the time; it's the maximum for just a portion of it. So screens with only a basic HDR rating are not much better than normal screens.

This page breaks down the various HDR ratings.

I have an Acer X35 (an HDR1000 monitor) and it's crazy how much of a difference it makes when playing the same video with HDR on and off side by side. I use Firefox mainly, so when I first got the monitor I tried out some 4K HDR YouTube videos and thought wow, they look nice! Then I found out that Firefox doesn't support HDR, so I tried out Chrome and, OMG, what a difference. You really have to experience it, and there's no way to show this if you don't have a proper HDR monitor. I hope we get way more HDR content, because the difference is pretty dramatic. So, FreeSync monitors need to step up.
 

Ajay

Lifer
Jan 8, 2001
15,451
7,861
136
Apologies, the first down vote was a mistake and was removed. I must have accidentally clicked on the down arrow while scrolling through the thread. The second down vote is intentional for overreacting to a meaningless mistake and pinging me (a down vote on an internet forum, the horror, lol).
Sorry, up and down votes are really supposed to be in response to whether or not a question has been answered (well, originally). Currently, they are being used to signal approval/disapproval, and that change continually triggers me. It's a pet peeve I should get over, but I'm a damn stubborn human being.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,810
7,166
136
Sorry, up and down votes are really supposed to be in response to whether or not a question has been answered (well, originally). Currently, they are being used to signal approval/disapproval, and that change continually triggers me. It's a pet peeve I should get over, but I'm a damn stubborn human being.

- Wrong thread to have this discussion, but I never liked downvoting systems either.

With an upvote, you know that the upvoter approves of something within the confines of the post being upvoted, and you can quote and reply for any added nuance. "I know why people agree with me."

With a downvote, the downvoter disagrees with the post for a potentially infinite number of reasons, unless the downvoter quotes and responds to clarify what point of the post they disagreed with and why. "I don't know why people disagree with me."

Downvoting stifles discussion by giving dissenters an avenue to voice their dissent without justifying it.
 

AdamK47

Lifer
Oct 9, 1999
15,216
2,839
126
I found that using a modified clock curve in MSI Afterburner is the best way to increase performance while maintaining stability and taming clock speed fluctuations on Ampere.

I start with a standard +50 increase and taper out the upper middle section to between +90 and +100. Then I taper it back down to +50.

[Image: QXI3yNn.png, screenshot of the modified Afterburner voltage/frequency curve]


Using this curve and the 480W BIOS from the Asus Strix has made clock fluctuations a lot tighter.
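For readers trying to picture that shape, here is a minimal sketch (my own illustration in Python; the breakpoint voltages and offsets are assumed values, not AdamK47's exact curve) of a +50 MHz offset that tapers up to roughly +90 to +100 MHz through the upper-middle voltage points and back down again:

```python
# Rough sketch of the described offset shape (illustrative values only):
# +50 MHz at the ends of the voltage range, tapering to ~+90..+100 MHz
# through the upper-middle part of the curve.
import numpy as np

volts = np.arange(0.70, 1.10, 0.005)            # Afterburner curve points are ~5 mV apart
knots_v   = [0.70, 0.85, 0.95, 1.00, 1.10]      # breakpoints (assumed, for illustration)
knots_off = [  50,   90,  100,   90,   50]      # MHz offsets at those breakpoints
offset = np.interp(volts, knots_v, knots_off)   # piecewise-linear taper between knots

for v, o in zip(volts[::10], offset[::10]):     # print every ~50 mV
    print(f"{v:.3f} V -> +{o:.0f} MHz")
```

In practice the offsets would still be entered point by point in the Afterburner curve editor; the sketch only shows the intended taper.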
 

Ajay

Lifer
Jan 8, 2001
15,451
7,861
136
I found that using a modified clock curve in MSI Afterburner is the best way to increase performance while maintaining stability and taming clock speed fluctuations on Ampere.

I start with a standard +50 increase and taper out the upper middle section to between +90 and +100. Then I taper it back down to +50.

[Image: QXI3yNn.png, screenshot of the modified Afterburner voltage/frequency curve]


Using this curve and the 480W BIOS from the Asus Strix has made clock fluctuations a lot tighter.
Nice, that's what I do. Takes freaking forever though since one has to move each individual square one at a time.
 
  • Like
Reactions: AdamK47

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Especially as the undervolting suggests that there is likely to be decent scope to squeeze a little bit out that way.

Double-density GDDR6X is due at some point as well; then they really can do a 20 GB 3080 etc. sensibly.
 

Kuiva maa

Member
May 1, 2014
181
232
116
There is no way NVIDIA is planning to refresh on 7nm at this time. Chances are they will focus on binning and eventually launch super variants that clock higher.

Nvidia has a yearly product cadence strategy, so they will refresh Ampere next year in some way. Whether that will be the same way they did with the Turing Super cards (binning, bigger/less cut-down chips, etc.) or by changing node remains to be seen. If TSMC is involved, we will know very soon.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
Nvidia has a yearly product cadence strategy, so they will refresh Ampere next year in some way. Whether that will be the same way they did with the Turing Super cards (binning, bigger/less cut-down chips, etc.) or by changing node remains to be seen. If TSMC is involved, we will know very soon.

There is a huge difference between just reusing the same chip a different way and switching fabs. The former is nearly free, while the latter costs tens of millions of dollars up front.

NVidia won't respin these big Ampere chips at TSMC next year.

If TSMC is involved it will be for the smaller chips we haven't seen yet: 3060/3050.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
I use a curve on mine as well. Wish I'd gotten a different SKU, but I jumped on whatever was in stock. I can't go too low with mine; at 0.95 V I can hold about 2000 MHz on the core with the GPU fully utilized. If I go down to 1900, I can run 0.85 V, but I like the nice round 2000.

These cards really should be undervolted. You lose hardly any performance, but power usage and temps go way down.
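A quick way to see why the savings are so large, sketched below as my own back-of-the-envelope estimate (not a measurement from the post): to first order, dynamic GPU power scales roughly with frequency times voltage squared, so the step quoted above trades about 5% clock for roughly a quarter of the dynamic power.

```python
# First-order estimate only: dynamic power ~ frequency * voltage^2.
# Uses the figures quoted above: 2000 MHz @ 0.95 V vs. 1900 MHz @ 0.85 V.
f_hi, v_hi = 2000, 0.95   # MHz, volts
f_lo, v_lo = 1900, 0.85

power_ratio = (f_lo * v_lo**2) / (f_hi * v_hi**2)
clock_ratio = f_lo / f_hi
print(f"relative dynamic power: {power_ratio:.2f} (~{1 - power_ratio:.0%} lower)")
print(f"relative clock:         {clock_ratio:.2f} (~{1 - clock_ratio:.0%} lower)")
```

This ignores static/leakage power and board losses, so real-world savings will differ, but it matches the general "small performance loss, big power drop" observation.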
 

Bouowmx

Golden Member
Nov 13, 2016
1,138
550
146
Tech YouTuber Optimum Tech tested undervolting on the GeForce RTX 3080 and 3090 at multiple frequencies.
The test choices are strange: power (taken from the HWiNFO software) is measured in Unigine Heaven, really old software from 2009, and game performance is measured in Rainbow Six: Siege, not a graphics-heavy game.

Frequency (MHz) | Volts (mV) | RTX 3080 power (W) | RTX 3090 power (W)
2000            | 987        | 315                | 388
1950            | 931        | 291                | 357
1900            | 887        | 262                | 317
1850            | 850        | 231                | 289
1800            | 806        | 215                | 275
1750            | 786        | 203                | 251


Extrapolating further down to 1700 MHz, the boost frequency listed in the specification, I see Ampere offering the same power savings I experienced when undervolting Pascal: 30-40%.
[Image: Unigine Valley NVIDIA GeForce GTX 1070.png]
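As a minimal sketch of that extrapolation step (my own, not from the post): fit a quadratic to the frequency/power pairs in the table above, estimate power at the 1700 MHz boost-clock specification, and compare it to the 2000 MHz figure. Extrapolated values are rough estimates only.

```python
# Quadratic fit of measured power vs. frequency, extrapolated to 1700 MHz.
import numpy as np

freq  = np.array([2000, 1950, 1900, 1850, 1800, 1750])  # MHz
p3080 = np.array([ 315,  291,  262,  231,  215,  203])  # W
p3090 = np.array([ 388,  357,  317,  289,  275,  251])  # W

for name, power in (("RTX 3080", p3080), ("RTX 3090", p3090)):
    coeffs = np.polyfit(freq, power, 2)     # quadratic fit of power vs. frequency
    p1700  = np.polyval(coeffs, 1700)       # extrapolated power at 1700 MHz
    saving = 1 - p1700 / power[0]           # relative to the 2000 MHz measurement
    print(f"{name}: ~{p1700:.0f} W at 1700 MHz, ~{saving:.0%} below 2000 MHz")
# Gives roughly 190 W (~40%) for the 3080 and 240 W (~38%) for the 3090,
# in line with the 30-40% savings mentioned above.
```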