Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
While you're right that a 3080 Ti will likely be released, and it likely will perform about the same as the 3090, I think you're missing the bigger picture. Traditionally the x80 is about 2/3 (67%) of the Titan, and the x80 Ti is ~93% of the Titan: Maxwell was 91.7%, Pascal 93.3%, Turing 94.4% for shaders. That gave Ti performance close to Titan levels, and was still a massive step up over the x80. Here the 3080 is already 83% of the 3090. It's far closer to the 3090 than any historical x80 was to its Titan, and that makes the Ti less compelling. Even if they give us a 78 SM 3080 Ti vs the 82 SM 3090 (95%), that's still less than 15% more shaders than the 3080.

A 20GB 3080 Ti with 15% more shaders than a 3080 for $800 next summer might not be a bad card, but it's sure not the kind of increase we're used to seeing with the mid cycle Ti refresh.
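The shader-count ratios being discussed can be spot-checked in a few lines. This sketch assumes the commonly published CUDA core counts for each pairing (980 Ti/Titan X, 1080 Ti/Titan Xp, 2080 Ti/Titan RTX, 3080/3090); treat it as illustrative, not authoritative.

```python
# Ratio of the cut-down card's shader count to the full/Titan card's,
# using publicly listed CUDA core counts (assumed, not from this thread).
cards = {
    "Maxwell": (2816, 3072),   # 980 Ti vs Titan X
    "Pascal":  (3584, 3840),   # 1080 Ti vs Titan Xp
    "Turing":  (4352, 4608),   # 2080 Ti vs Titan RTX
    "Ampere":  (8704, 10496),  # 3080 vs 3090
}
for gen, (cut, full) in cards.items():
    print(f"{gen}: {cut / full:.1%}")
# Maxwell: 91.7%, Pascal: 93.3%, Turing: 94.4%, Ampere: 82.9%
```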

Check Kepler. GTX 780 was ~85% of Titan. Very similar to now. Probably no coincidence that Kepler also shared the same die between 780, Titan and 780 Ti, also potentially the same as now.

Kepler solved the issue you are indicating by actually enabling more units on the 780 Ti than were enabled on the Titan. That probably won't happen here, but nothing stops them from enabling the same count as the 3090.

I agree the delta might not be as large as previous x80 Ti over x80 gaps, but I still expect we will get ~20%, which is not too bad.

Also, it doesn't change the point that the 3080 is not the 3080 Ti. Maybe the 3080 Ti won't be quite as impressive as before, but that is no reason to pretend the 3080 is actually a secret 3080 Ti in order to make the comparisons look worse.

When the 3080 Ti shows up, make the appropriate x80 Ti comparisons then.

Right now the 3080 is a solid x80 release.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
As an aside, there's a lot of people here who complain that Nvidia has been selling small die xx4 chips as x80s for awhile, you think they'd be happy to have a big die in the x80.

We are. However, this is Nvidia after all. They couldn't be caught offering an honest, large-die value product, so they screwed us with 10GB of VRAM. AHAHHA! The joke's on us!
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
Check Kepler. GTX 780 was ~85% of Titan. Very similar to now. Probably no coincidence that Kepler also shared the same die between 780, Titan and 780 Ti, also potentially the same as now.

Kepler solved the issue you are indicating by actually enabling more units on the 780 Ti than were enabled on the Titan. That probably won't happen here, but nothing stops them from enabling the same count as the 3090.

I agree the delta might not be as large as previous x80 Ti over x80 gaps, but I still expect we will get ~20%, which is not too bad.

Also, it doesn't change the point that the 3080 is not the 3080 Ti. Maybe the 3080 Ti won't be quite as impressive as before, but that is no reason to pretend the 3080 is actually a secret 3080 Ti in order to make the comparisons look worse.

When the 3080 Ti shows up, make the appropriate x80 Ti comparisons then.

Right now the 3080 is a solid x80 release.
Kepler was a bit of an oddball, and it was before they came out with their current cadence. It launched with the GK104-based 680 just as subsequent generations have, and then launched the GK110 Titan the following year. Later, rather than releasing a 680 Ti, they released the 780, punted the full GK104 down to the 770, and rebranded the rest of the stack. Then half a year after that they finally launched a full GK110 card as the 780 Ti, which was actually more enabled than the original Titan, and then several months later relaunched the fully enabled 6GB Titan Black.

It was 21 months between the launch of Kepler with the 680 and the release of the 780 Ti, partly because it took 2.5 years to get Maxwell 2 out the door. Realistically the 780 is the mid-generation big-die refresh for Kepler.

There's no way we get 20% for the 3080 Ti over the 3080 unless games start needing more than 10GB by then. Even a full 84 SM GA102 only has 23.5% more shaders than the 3080. At best you might see 20% from a fully enabled die in a few extremely GPU-bound titles if they push clocks a bit, but that would be a 3x8-pin monster.
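The 23.5% figure follows directly from the SM counts (68 SMs on the 3080 vs. 84 on a full GA102, at 128 FP32 shaders per Ampere SM, per the public specs); a quick sanity check:

```python
# Full GA102 vs the 3080's cut-down configuration.
shaders_per_sm = 128           # FP32 units per Ampere SM
sm_3080, sm_full = 68, 84
uplift = (sm_full * shaders_per_sm) / (sm_3080 * shaders_per_sm) - 1
print(f"{uplift:.1%} more shaders at best")  # 23.5% more shaders at best
```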
 

Saylick

Diamond Member
Sep 10, 2012
3,141
6,355
136
Exactly the same as with Vega because of the high power draw. But now, since it's Nvidia, it will be a "feature".

Anecdotal evidence: in other forums I see the same users who would not buy Vega because of its high power consumption changing PSUs to buy a 3080.
As much as I want to point the finger at Nvidia fanboys and their hypocrisies, I think it's fair to have ragged on Vega and not been willing to upgrade a PSU for it, while being willing to upgrade one for the 3080, because RX Vega didn't even take the performance crown when it launched. The GTX 1080 Ti was the performance leader, and it consumed less power than Vega 64.

The RTX 3080, on the other hand, isn't a revolution in perf/W, but it is the de facto performance leader at the moment. If people need top-crown performance, they will do whatever it takes to get it, including upgrading their PSU.

 

xpea

Senior member
Feb 14, 2014
429
135
116
When you have a duopoly, the Ricky Bobby rule applies: if you're not first, you're last.
^^This^^
In fact, AMD has already lost the PR battle. What do people do when they want to buy a new toy? They look at products in their price bracket and read the reviews. Today, most Ampere benchmarks show Ampere on top because it's compared with the last generation. Now move forward to November, when the RDNA2 reviews are online: what will happen? It will obviously be a mixed bag, because 3090 OC AIB models will be at the top of the hill, i.e. RDNA2 will not get the crown mindshare. "Big Navi Nvidia killer" yada yada will be the new meme, like "poor Volta"...
 

jpiniero

Lifer
Oct 1, 2010
14,591
5,214
136

Videocardz found evidence of the 3080 20 GB and the 3070 Ti 16 GB coming soon after Big Navi. 3080 20 GB could be called 3080 Super.
 

Head1985

Golden Member
Jul 8, 2014
1,864
689
136

Videocardz found evidence of the 3080 20 GB and the 3070 Ti 16 GB coming soon after Big Navi. 3080 20 GB could be called 3080 Super.
They should launch them with those VRAM specs.
A 3070 with 8GB in almost-2021 is a fu**** joke; that's the same VRAM as the RX 480/470.
A 3080 with 10GB as a 4K card is a joke.
Both cards will be out of VRAM when next-gen games launch, probably in 6-12 months.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Exactly the same as with Vega because of the high power draw. But now, since it's Nvidia, it will be a "feature".

Anecdotal evidence: in other forums I see the same users who would not buy Vega because of its high power consumption changing PSUs to buy a 3080.

Not really the same situation? With Vega there was competition that had noticeably lower power consumption at the same performance. Today, right now, consumers don't have a choice if they want to upgrade. It's eat the high power draw or go bury your face in the sand.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
I suspect NVidia's plans will change based on what AMD does, but I think it may be a while before we see any SUPER cards (and I don't think we see anything branded as a Ti at all this generation) due to supply constraints.

In addition to the memory uplift I hope that NVidia holds off for long enough for the 8nm process to get some kinks worked out. I suspect between that and a new spin they'll have a card worth releasing.

It's pretty obvious that they can't push the clocks much further and honestly I think they'd look better if the SUPER cards all had a lower TDP even if it means they don't get a minor clock bump.

Arguments over names are kind of pointless, but I think we all know why a Ti wasn't released. They'd rather not sully the brand. But even if you think the 3080 should have been called a Ti or compared to one does it change anything? It's still the best card you can get and a Ti for $700 again is a damned good deal.

If NVidia feels they need more cards to fill in the gaps, they could just release 3070 or 3080 versions with more memory. No need to call them a SUPER. I'm also curious about 16XX replacement cards.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
I suspect NVidia's plans will change based on what AMD does, but I think it may be a while before we see any SUPER cards (and I don't think we see anything branded as a Ti at all this generation) due to supply constraints.

For the GDDR6X cards, double VRAM might be waiting for 2GB GDDR6X chips so they don't have to use the 3090 design with 1GB VRAM chips on the front and back of the board. GDDR6 already has 2GB chips.
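The chip-count arithmetic behind that works out as follows: each GDDR6/6X chip sits on a 32-bit slice of the bus, so the bus width fixes the chip count unless you run two chips per channel in clamshell mode (the 3090's front-and-back layout). A rough sketch, assuming the public 3080/3090 bus widths:

```python
# Capacity = (bus width / 32 bits per chip) * chips per channel * GB per chip.
def capacity_gb(bus_bits, chip_gb, clamshell=False):
    chips = bus_bits // 32 * (2 if clamshell else 1)
    return chips * chip_gb

print(capacity_gb(320, 1))                  # 3080 today: 10 GB
print(capacity_gb(320, 1, clamshell=True))  # 20 GB, but chips on both sides
print(capacity_gb(320, 2))                  # 20 GB with 2GB chips, one side
print(capacity_gb(384, 1, clamshell=True))  # 3090: 24 GB
```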

Arguments over names are kind of pointless, but I think we all know why a Ti wasn't released.

Because yields are still too low for an affordable, more fully populated die. See Kepler Titan -> 780 Ti months later, Maxwell Titan -> 980 Ti months later, and Pascal Titan -> 1080 Ti months later.

They kind of mangled it with Turing, releasing the Ti card first and at very high pricing.

I expect this time is a return to the previous launches, where some months later, 3080 Ti with 3090 performance for a much more reasonable price will arrive.

If NVidia feels they need more cards to fill in the gaps, they could just release 3070 or 3080 versions with more memory. No need to call them a SUPER. I'm also curious about 16XX replacement cards.

This time, I think NVidia put prices at a point where they won't need Super cards. AMD really doesn't have enough room to undercut NVidia to the point that NVidia would feel a need to respond.

The only wild card is VRAM capacity, and how they handle that. I expect that will just be an expensive extra cost option, not dropping in as a free upgrade on existing cards.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,841
3,189
126
Edit:

Sigh... anyone get a 3080?
I hear it's like trying to get hand sanitizer all over again, only in regards to a GPU which people will probably not get to use for what it's intended to be used for.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
A hypothetical 3080 Ti is unlikely to have much room to breathe just because NVidia is up against a hard power limit.

Why make a 3080 Ti that won't be as good as a 3090 and will draw comparisons against other Ti cards? The 3090 exists so that they don't have to call something a Ti or a Titan that would be a lackluster entry for what's supposed to be a premium series.

People forget, but the xx90 cards were the dual-GPU ones that we haven't seen since Kepler. They've repurposed the name here for the new high end, and it's because they're in a rough spot with this generation and don't have anything worthy of bearing the Ti or Titan designations.

Prices aren't where they are because NVidia wants to be generous. They're that way because they know on some level that pushing them higher could be a major PR disaster if AMD undercuts them with similar performance at a price difference of more than $100.

Even though there aren't a lot of cards in the market right now, NVidia launching first was still a good move because they get to dictate prices to AMD to an extent. If AMD had something to compete with the 3080 and were hoping to charge $900 for it those hopes are dashed. Even if they beat NVidia there, they probably can't charge more than $800 and if we're being honest then they're stuck at $700 even if they beat NVidia on average.

AMD has historically offered versions of their cards with more RAM. I think NVidia will be forced to match that regardless of what they call those cards. A close race in terms of performance with the main difference being 10 GB vs. 16 GB has an obvious answer to anyone without brand preference.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
Edit:

Sigh... anyone get a 3080?
I hear it's like trying to get hand sanitizer all over again, only in regards to a GPU which people will probably not get to use for what it's intended to be used for.

I have a preorder for one at Amazon; it sold out pretty quick, about an hour ago. It's a Zotac Trinity OC.
 

Chrhun

Junior Member
Sep 8, 2020
3
2
41
Edit:

Sigh... anyone get a 3080?
I hear it's like trying to get hand sanitizer all over again, only in regards to a GPU which people will probably not get to use for what it's intended to be used for.

I got lucky and was able to source one on launch day, yesterday. I just got one that was available, an MSI Ventus OC 3080. Got it installed, and now I have the power to run just about any game well on my 3440x1440/100Hz UW. My main gaming is simracing in VR though, and I'm waiting for the HP Reverb G2 to launch.

Ventus installed
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
As much as I want to point the finger at Nvidia fanboys and their hypocrisies, I think it's fair to have ragged on Vega and not been willing to upgrade a PSU for it, while being willing to upgrade one for the 3080, because RX Vega didn't even take the performance crown when it launched. The GTX 1080 Ti was the performance leader, and it consumed less power than Vega 64.

The RTX 3080, on the other hand, isn't a revolution in perf/W, but it is the de facto performance leader at the moment. If people need top-crown performance, they will do whatever it takes to get it, including upgrading their PSU.


I think the thing that made Vega particularly bad in this regard was the number of excuses that kept adding up.

You have to remember Vega was 16 to 17 months later than the competition, and during that period AMD fans did everything they could to stall people from buying a GTX 1080 or 1080 Ti.

During this time a lot was said about Pascal: how it was not impressive, how Vega was going to beat the 1080 Ti, and to just wait for Vega. On top of that, the wait kept getting extended and people were at wits' end. This bet on people waiting had a cost, and that cost was the goodwill and faith people had in AMD and the fan base that advertised for it.

Then the card came, underperformed, and was priced just like the competition, on top of being loud and hot. People had had enough and just wanted to get their video cards now, and at that point the card in question wasn't a Vega 64 anymore; it was likely the GTX 1080 Ti.

Then more bets on goodwill were made by AMD fans: still buy Vega and underclock it, Vega 64 could not be sold for less because of HBM2, just wait for drivers. What these fans did not realize is that the Vega 64 launch had bankrupted their goodwill. No one wanted to listen to these excuses anymore after waiting 16 months for a product so disappointing versus the hype.

This launch is nothing like that. People really got hyped for Ampere over the last 2 weeks, and although the claims were outlandish in some cases (the 1.9x performance-per-watt claim), most people knew what to expect based on the Digital Foundry preview (which turned out to be inflated by only ~10%). There wasn't a huge wait during which expectations grew, and the most impressive thing about the RTX 3080, the price, was still true. People knew it was going to be a 320W card prior to launch, so that wasn't an unpleasant surprise. Additionally, unlike AMD, Nvidia gave the cards sufficient coolers, so they did not run hot, throttle, or get loud, which avoided the flak Vega received.

The fact that AMD fans are comparing this launch to Vega is absurd. If Vega had captured the performance crown handily while healthily moving the price-to-performance bar forward, and without being late, its reception would have been great even with its power consumption numbers.
 

jpiniero

Lifer
Oct 1, 2010
14,591
5,214
136

3090 20% faster than 3080 in Timespy and Timespy Extreme.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136

3090 20% faster than 3080 in Timespy and Timespy Extreme.

Expected that it would have about 20% more performance, since it has ~20% more CUDA cores and ~20% more memory bandwidth.
 

Head1985

Golden Member
Jul 8, 2014
1,864
689
136
That's almost perfect scaling. I didn't expect that; more like 10-15% at best. This is bad news for the 3070, because the 3080 has 48% more SPs and 70% more bandwidth. And since the 3080 is only 30% faster than the 2080 Ti at 4K, I don't see how the 3070 can still be faster than the 2080 Ti. It looks more like 10% faster than the 2080 Super.
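The 48%/70% deltas quoted here can be verified against the published specs. This sketch assumes the commonly listed figures (8704 vs. 5888 CUDA cores; 760 vs. 448 GB/s, i.e. 19 Gbps on a 320-bit bus vs. 14 Gbps on a 256-bit bus):

```python
# Spec deltas between 3080 and 3070, using publicly listed figures
# (assumed, not from this thread).
sp_3080, sp_3070 = 8704, 5888   # CUDA cores
bw_3080, bw_3070 = 760, 448     # memory bandwidth, GB/s
print(f"SPs:       {sp_3080 / sp_3070 - 1:.0%}")  # 48%
print(f"Bandwidth: {bw_3080 / bw_3070 - 1:.0%}")  # 70%
```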
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
That's almost perfect scaling. I didn't expect that; more like 10-15% at best. This is bad news for the 3070, because the 3080 has 48% more SPs and 70% more bandwidth. And since the 3080 is only 30% faster than the 2080 Ti at 4K, I don't see how the 3070 can still be faster than the 2080 Ti. It looks more like 10% faster than the 2080 Super.

You can pretty much bet NV was saying the 3070 is faster when RTX and DLSS are enabled. I can believe that, if that's what they were trying to say.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
This is bad news for the 3070, because the 3080 has 48% more SPs and 70% more bandwidth. And since the 3080 is only 30% faster than the 2080 Ti at 4K, I don't see how the 3070 can still be faster than the 2080 Ti. It looks more like 10% faster than the 2080 Super.

100% agree.

After the Ampere kitchen reveal claimed the 3070 was "equal to or faster than the 2080 Ti", and I saw the specs, I thought there was no way that holds outside a few corner cases. I'm expecting a massive letdown when the 3070 gets reviewed.

Just look at the massive difference in memory BW between 3070 and 2080 Ti.