Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 60 fps at 4K, ideally at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, I'm just interested in forum members' thoughts.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
Every review I am reading is sandbagging the 3090.
Every other review I am reading is saying a 3080 20GB is in the works.

Every review I read is killing me, because it is saying I should WAIT for more news on the 20GB because the 3090 is a box of fail for a gamer and not a developer.
How is the 3090 sandbagging? Even if nvidia increased the power budget, overclocking results don't seem to show much improvement and power consumption goes through the roof. This really does seem to be the upper end capability of GA102, outside a possible future fully enabled card.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
It's not that hard to believe and reminds me a lot of what we saw with AMD cards when Polaris/Vega first came out. You could give them a fairly substantial undervolt and save a lot of power without losing that much performance. Some people even saw a performance increase from a slight undervolt, which sounds absurd the first time you hear it, but was borne out by multiple sources.

I think Nvidia just ran into the same sort of problem, where the architecture can't go any further on the current process without throwing a lot of power at it for heavily diminished returns. Would anyone have seriously cared if a 3080 only had 97% of the current performance if it came in at 230W? Releasing such a card with a lower power limit might at least have let people hold out hope for a more powerful SUPER refresh with higher clock speeds and power limits later on, but what we have makes that prospect look doubtful unless the Samsung process sees considerable improvements.

I would have been way more impressed with ~90% of the performance at ~65% of the power usage.

And the 3090 would have made a lot more "sense" in terms of being the SKU with unlocked power budget. Not that it would have changed the card any, but the perceived utility of the investment would have. Pay $700 for a tightly regulated, very performant card or pay $1,500 for one you can "control".
 
  • Like
Reactions: Mopetar

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
How is the 3090 sandbagging? Even if nvidia increased the power budget, overclocking results don't seem to show much improvement and power consumption goes through the roof. This really does seem to be the upper end capability of GA102, outside a possible future fully enabled card.

I believe 'sandbagging' in this case is synonymous with 'dragging it through the mud', e.g. the card is a waste, don't get it.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,823
7,186
136
Yup, Nvidia has been doing the literal opposite of sandbagging with the 3000 series.
They overpromised prior to launch, and the cards are clearly running on the bloody edge of what is possible.

-Unless they're playing the 5D sandbagging game and faked an ENTIRE LAUNCH just to surprise us with the real launch a month from now with cards that perform the way we expect them to :p
 
  • Haha
Reactions: blckgrffn

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It seems like a lot of people don't know what the term means.

Basically it means: Under-performing on purpose.

Usually with the intent of surprising with better performance later.

It's all a matter of point of view.

nVidia sandbagging = purposefully dragging down performance

Reviewers sandbagging = dragging the reviewed product down, resulting in poor reviews
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
It's all a matter of point of view.

nVidia sandbagging = purposefully dragging down performance

Reviewers sandbagging = dragging the reviewed product down, resulting in poor reviews

Hmm, maybe. I don't think I've ever heard it in that context here.

If reviewers were sandbagging the product, that would mean that there is some amazing use case that shows big gains that they can't show yet because for some reason nvidia won't let them.

On the contrary, they appear to have laid all the cards out on the table and are not holding anything back, i.e. not sandbagging.
 

Mopetar

Diamond Member
Jan 31, 2011
7,843
5,998
136
Every review I read is killing me, because it is saying I should WAIT for more news on the 20GB because the 3090 is a box of fail for a gamer and not a developer.

I don't know if I'd consider a 3090 as a gaming card unless money is no obstacle. A 3080 is just about as good at half the price.

If I could guarantee myself one right now for MSRP I'd definitely buy it. The value is just too good and AMD will probably price whatever they have relative to the 3080 price since they seem to be more interested in making money these days.

I suppose there's some wait and see temptation with a 20 GB 3080, but we don't know what that will cost. If it's $1000 or $1100 for that card I'm not sure if some would feel it was worth the wait.

Of course if you can't get your hands on one right now then waiting is about all you can do one way or another.
 
  • Like
Reactions: lightmanek

jpiniero

Lifer
Oct 1, 2010
14,618
5,227
136
That's ~1%-1.5% difference in clock speed. Margin of error stuff.

Seems it's even less than that, and the power draw difference isn't much; it might even draw less. That doesn't mean it doesn't look really bad that they had to do this.

 
  • Like
Reactions: Elfear

Saylick

Diamond Member
Sep 10, 2012
3,171
6,404
136
Seems it's even less than that, and the power draw difference isn't much; it might even draw less. That doesn't mean it doesn't look really bad that they had to do this.

Yeah, glad to see Nvidia go out of their way to "hotfix" this issue, which is strongly reminiscent of the RX480 PCIe power overdraw situation.

Still not a good look from a PR perspective as it just adds more fuel to the notion that Ampere was rushed, but a step in the right direction nonetheless.
 
  • Like
Reactions: Stuka87 and Elfear

sze5003

Lifer
Aug 18, 2012
14,183
625
126
I don't know if I'd consider a 3090 as a gaming card unless money is no obstacle. A 3080 is just about as good at half the price.

If I could guarantee myself one right now for MSRP I'd definitely buy it. The value is just too good and AMD will probably price whatever they have relative to the 3080 price since they seem to be more interested in making money these days.

I suppose there's some wait and see temptation with a 20 GB 3080, but we don't know what that will cost. If it's $1000 or $1100 for that card I'm not sure if some would feel it was worth the wait.

Of course if you can't get your hands on one right now then waiting is about all you can do one way or another.
I've had the same realizations, but at the end of the day I want to keep that 20GB 3080 for three or more years and still be able to crank settings to their highest. So I'll wait with my 1080 Ti until the end of October to see what AMD's response is.
 

Kuiva maa

Member
May 1, 2014
181
232
116
I would have been way more impressed with ~90% of the performance at ~65% of the power usage.

And the 3090 would have made a lot more "sense" in terms of being the SKU with unlocked power budget. Not that it would have changed the card any, but the perceived utility of the investment would have. Pay $700 for a tightly regulated, very performant card or pay $1,500 for one you can "control".

The problem with this approach is that there is already a card with 90% of the 3090's performance, and it is called the 3080. So the 3080 would also need to be a bit slower, and that would put it very close to the 2080 Ti, making it quite a bit less appealing. Nvidia apparently absolutely had to push those cards this hard to hit their performance targets.
 
  • Like
Reactions: Saylick

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
The problem with this approach is that there is already a card with 90% of the 3090's performance, and it is called the 3080. So the 3080 would also need to be a bit slower, and that would put it very close to the 2080 Ti, making it quite a bit less appealing. Nvidia apparently absolutely had to push those cards this hard to hit their performance targets.

I am talking about the 3080 specifically, in response to a couple of posts indicating that it could likely be significantly undervolted, vastly reducing its power consumption while only slightly reducing its performance.

Specifically, if you missed it a page or two back:

Similar to WCCFTech, but testing the GeForce RTX 3080 undervolt with the NVIDIA PCAT tool (not relying on software power values).

Results in Forza Horizon 4, 4K Ultra:
Stock ~ 1955 MHz:
143 fps, 310 W

862 mV ~ 1905 MHz:
139 fps, 230 W
97% performance for 74% power

806 mV ~ 1815 MHz:
134 fps, 201 W
94% performance for 65% power

Also included for reference: 2080 Ti, but it was not undervolted.
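
For reference, here is the arithmetic behind those percentages as a quick sketch (the fps and wattage figures come from the quoted test above; the script itself is purely illustrative):

```python
# Illustrative sketch: derive the relative performance/power figures quoted
# above from the raw Forza Horizon 4 numbers (fps, board power in watts).
results = {
    "stock ~1955 MHz":  (143, 310),
    "862 mV ~1905 MHz": (139, 230),
    "806 mV ~1815 MHz": (134, 201),
}

stock_fps, stock_watts = results["stock ~1955 MHz"]

for name, (fps, watts) in results.items():
    rel_perf = fps / stock_fps        # performance relative to stock
    rel_power = watts / stock_watts   # power relative to stock
    efficiency = fps / watts          # fps per watt
    print(f"{name}: {rel_perf:.0%} performance for {rel_power:.0%} power, "
          f"{efficiency:.2f} fps/W")

# Rounded, this prints: 100%/100% (0.46 fps/W), 97%/74% (0.60 fps/W),
# and 94%/65% (0.67 fps/W).
```

In other words, the 806 mV point is roughly 45% more efficient in fps per watt than stock, which is the crux of the undervolting argument.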


"A bit slower" still puts it far ahead of the 2080ti which it is still replacing at a price point of $500 less. We can quibble on "a bit" but I am talking about card that could have had 90-95% of the current 3080 as discussed in prior comments. So some 40% faster still, vs 50% faster? ¯\_(ツ)_/¯ At 60% of the price and +40% of the performance at the highest resolutions vs the 2080 ti and maybe 250W or less (200W?!?) power rating it would have still had lots and lots of value... and maybe it would be more trivial for AIB partners to package reliably. Certainly anyone running a current Ti or **80 series card could have felt pretty good about installing it with their current PSU, which is not a current given.

I really believe that Nvidia released the wrong version of the 3080 if they wanted the 3090 to be relevant, assuming they could have released that version in the first place. Which, as others have pointed out with regard to undervolting, is not a sure thing.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
I really believe that Nvidia released the wrong version of the 3080 if they wanted the 3090 to be relevant, assuming they could have released that version in the first place. Which, as others have pointed out with regard to undervolting, is not a sure thing.

It's probably far less about the 3090 and more about rdna2. I still think the 3090 only really exists to try and retain the crown. The small difference between them and how on edge the 3080 runs is down to expected strength of competition. 90-95% might have put them below their internal estimate for the 6000 series.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
It's probably far less about the 3090 and more about rdna2. I still think the 3090 only really exists to try and retain the crown. The small difference between them and how on edge the 3080 runs is down to expected strength of competition. 90-95% might have put them below their internal estimate for the 6000 series.

That's fair, but the 3090 would have remained their ace in the hole, along with a 3080S/Ti drop early next year if necessary.

Right now it seems hard to see how there is room for a refresh SKU that slots between them (the 3080 and 3090).

Honestly, I hope you are right because that's great news for RDNA2.

That said, there are lots of people who won't buy AMD regardless of the value prop. So long as Nvidia is "neck and neck" (plus or minus what, 10%?), they are going to stay true or migrate from AMD as planned. I'd still wager that having a solid 3080 @ 250W or less would leave them in a better position. It would also have let their partners make more money at the $700 price point via less involved coolers and possibly PCBs (GDDR6X tho? not sure) with a card that needed less juice.
 

Mopetar

Diamond Member
Jan 31, 2011
7,843
5,998
136
Seems it's less than that, and power draw difference isn't much and might even draw less. Doesn't mean it doesn't look real bad they had to do this.


If you compare the readings that they got, the old drivers would see spikes in the power draw up to 600W for very small amounts of time:

[Graph: gaming power draw (zoomed), old drivers]


Compared with the new drivers, it seems like the card rarely pushes much above 500W now:

[Graph: gaming power draw (zoomed), new drivers]


That's probably 80W or more of difference. I wonder then if it's just an issue of people with a good power supply that isn't quite good enough, particularly if coupled with a lot of other power-hungry components.

The changes aren't going to be perceivable based on the limited testing they did with the Witcher 3. I doubt anyone notices going from 99.9% of all frames being delivered in under 4ms, to only 99.8% of frames being delivered in under 4ms.

I'm not too concerned with how they look over having to fix the problem in the first place when they did a good job of fixing it so quickly and about as painlessly as possible.
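
As a rough sense of scale for that frame-delivery change, here is a quick sketch (the run length and frame rate are assumed purely for illustration, not taken from their testing):

```python
# Back-of-the-envelope sketch of what a 99.9% -> 99.8% "frames under 4 ms"
# change means in practice. Run length and frame rate are assumptions.
frames_per_second = 250   # assumed average frame rate for the benchmark run
run_seconds = 100         # assumed length of the run
total_frames = frames_per_second * run_seconds

for fraction_under_4ms in (0.999, 0.998):
    slow_frames = total_frames * (1 - fraction_under_4ms)
    print(f"{fraction_under_4ms:.1%} under 4 ms -> "
          f"~{slow_frames:.0f} frames slower than 4 ms out of {total_frames}")
```

Either way it is on the order of a few dozen slow frames spread across the whole run, which is why nobody is likely to notice the difference.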
 

Hitman928

Diamond Member
Apr 15, 2012
5,316
7,994
136
If you compare the readings that they got, the old drivers would see spikes in the power draw up to 600W for very small amounts of time:

[Graph: gaming power draw (zoomed), old drivers]


Compared with the new drivers, it seems like the card rarely pushes much above 500W now:

[Graph: gaming power draw (zoomed), new drivers]


That's probably 80W or more of difference. I wonder then if it's just an issue of people with a good power supply that isn't quite good enough, particularly if coupled with a lot of other power-hungry components.

The changes aren't going to be perceivable based on the limited testing they did with the Witcher 3. I doubt anyone notices going from 99.9% of all frames being delivered in under 4ms, to only 99.8% of frames being delivered in under 4ms.

I'm not too concerned with how they look over having to fix the problem in the first place when they did a good job of fixing it so quickly and about as painlessly as possible.

Any decent PSU should be able to handle momentary transients like that, but beyond that, people with 1+ kW power supplies were reporting crashing issues as well, so I don't think the issue was power supply related, although some with lower-rated PSUs may have had issues as well.
 

mrblotto

Golden Member
Jul 7, 2007
1,647
117
106
I’m curious: how does one find out if their good PS is ‘good enough’?
I’ve got an EVGA Supernova 850 from about 4 years ago. I believe performance degrades gradually over time. I’m just looking to see how it measures up today
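One rough way to sanity-check it on paper is to add up worst-case component draw against a derated rating; a back-of-the-envelope sketch (every wattage and the aging factor below are assumptions for illustration, not measurements of any particular system):

```python
# Back-of-the-envelope PSU headroom check. All numbers are illustrative
# assumptions, not measurements.
psu_rating_w = 850        # nameplate rating (e.g. an EVGA Supernova 850)
aging_derate = 0.90       # assume some margin lost after ~4 years of use

gpu_sustained_w = 320     # assumed GPU board power limit
gpu_transient_w = 500     # assumed brief spikes well above the sustained figure
cpu_w = 200               # assumed high-end CPU under load
rest_of_system_w = 75     # board, drives, fans, peripherals (assumed)

usable_w = psu_rating_w * aging_derate
sustained_w = gpu_sustained_w + cpu_w + rest_of_system_w
worst_case_w = gpu_transient_w + cpu_w + rest_of_system_w

print(f"usable capacity (derated): {usable_w:.0f} W")
print(f"sustained load: {sustained_w} W ({sustained_w / usable_w:.0%} of capacity)")
print(f"worst-case transient: {worst_case_w} W ({worst_case_w / usable_w:.0%} of capacity)")
```

With those made-up numbers the sustained load looks comfortable but the transient case is marginal, which matches the discussion below: headroom on paper isn't the whole story, since transient behaviour and power quality matter too.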
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Any decent PSU should be able to handle momentary transients like that, but beyond that, people with 1+ kW power supplies were reporting crashing issues as well, so I don't think the issue was power supply related, although some with lower-rated PSUs may have had issues as well.

It's not just a matter of handling super spiky loads. It's handling these kinds of transients and still providing clean power.

LTT said they had issues with a PSU that was well above nVidia's recommendation. At the time they thought it was just the PSU, although now it appears that they were seeing this issue. It takes an especially good supply to handle the kind of load shown in the above graph. Spiking back and forth from 200W to 600W over the course of 20µs is NOT a normal load situation for a computer power supply.
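
To put rough numbers on why that kind of swing is hard on a supply, here is a small sketch; the 200W/600W/20µs figures are from the post above, and the assumption that the entire swing lands on the 12V rail is a simplification for illustration:

```python
# Rough sketch of the current swing implied by a 200 W <-> 600 W transient
# over ~20 microseconds, assuming it all lands on the 12 V rail.
rail_voltage = 12.0   # volts (assumption: whole swing on the 12 V rail)
low_power = 200.0     # watts, from the post above
high_power = 600.0    # watts, from the post above
swing_time = 20e-6    # seconds (~20 microseconds)

delta_current = (high_power - low_power) / rail_voltage  # amps
slew_rate = delta_current / swing_time                   # amps per second

print(f"current swing: {delta_current:.1f} A on the 12 V rail")
print(f"implied slew rate: {slew_rate / 1e6:.2f} A/us")
# ~33 A swinging at ~1.67 A/us, which is why the output capacitance and
# regulation quality of the PSU matter so much here, not just the wattage.
```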
 

Hitman928

Diamond Member
Apr 15, 2012
5,316
7,994
136
It's not just a matter of handling super spiky loads. It's handling these kinds of transients and still providing clean power.

LTT said they had issues with a PSU that was well above nVidia's recommendation. At the time they thought it was just the PSU, although now it appears that they were seeing this issue. It takes an especially good supply to handle the kind of load shown in the above graph. Spiking back and forth from 200W to 600W over the course of 20µs is NOT a normal load situation for a computer power supply.

Yeah, I didn't notice the time scale before; having extreme spikes that last that long is definitely too much for even most quality PSUs if you are exceeding their current limits on the rail(s) the GPU is pulling from (or even overall, depending on how good the isolation in the PSU is). Still, people with over-spec'd PSUs were still reporting crashing, so there were multiple issues at play.
 

DiogoDX

Senior member
Oct 11, 2012
746
277
136
It's not just a matter of handling super spiky loads. It's handling these kinds of transients and still providing clean power.

LTT said they had issues with a PSU that was well above nVidia's recommendation. At the time they thought it was just the PSU, although now it appears that they were seeing this issue. It takes an especially good supply to handle the kind of load shown in the above graph. Spiking back and forth from 200W to 600W over the course of 20µs is NOT a normal load situation for a computer power supply.
If I remember correctly, in the Pascal (or Maxwell?) presentation Nvidia bragged about cleaner power delivery. Looks like with Ampere they just threw power at it to achieve the performance target.
 
  • Like
Reactions: Stuka87

CakeMonster

Golden Member
Nov 22, 2012
1,392
501
136
Got the Asus TUF 3090 OC yesterday (with some rebates), but I'm not going to defend the price in any sense. First (and mixed) impressions:

I've been switching between 32" 4K and 30" 2560x1600 displays but I'll probably end up with the latter since 4K has so many issues, from Windows scaling to still not being able to use all effects in recent games. I also can't really let go of IPS and 16:10. I have a 9900K @ 5GHz all cores with 3600 CL16 RAM. I did not use DLSS at all.

* Welcome to CPU limitation town. I know everyone can read this from benchmarks, but damn, it's super obvious in practice. WoW is down in the exact same 70 fps range in crowded areas in regular 1600p, 4K, and downsampled to 1600p from 3200p (>4K), same with a number of other games.

* The averages are ofc much higher despite CPU limitation often grabbing you and bringing you back to where a 2080/2080Ti would also be CPU limited, but it wasn't as obvious at the time. Now it's super obvious because of the higher ceiling.

* Control with RT and AC Odyssey were great experiences as they didn't seem to hit the CPU ceiling (at least not often); I had a much better experience, managing to stay reliably above 60 where I would very often dip below with the 2080Ti (exact same settings). Specifically, in Control I could enable 3/5 RT settings in 1600p and run in the high 70s average, whereas with the 2080Ti 2/5 RT settings would run in the low 70s and often drop slightly below 60fps.

* VRAM usage doesn't appear to be much different. I've heard some say that with more VRAM the games or Windows or whatever will cache more. However, I rebooted many times and probably ran less background stuff than I usually do. Horizon Zero Dawn made me end up at the highest value I've seen so far as reported by GPU-Z, 11800 MB at ultra settings (and this was at 1600p).

No idea how to conclude so far; I'm not particularly interested in arguing about the price since I'm not going to defend that anyway. I think that 3080 and 3090 buyers will soon be following the CPU market extremely closely, because it seems that frame dips and minimums will benefit greatly from better CPU performance (and max too in a few games).
 
Last edited:

sze5003

Lifer
Aug 18, 2012
14,183
625
126
Sounds like the cards themselves are more for computational use than gaming, judging by how the 1440p gains between the two are not that large. Most benchmarks I've seen used the latest Intel processor.

Looks like I may end up with a 3080 if that 20GB model never arrives.
 

jpiniero

Lifer
Oct 1, 2010
14,618
5,227
136

Videocardz has a shot of GA104 Mobile.