'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
What gains is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 4K60, or at least 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

Konan

Senior member
Jul 28, 2017
It's very likely that graph is fake. I would be very surprised if the RTX 3060 had only 6GB of VRAM; such a card wouldn't be future-proof.

Agreed, buddy. It'll be interesting to see if the other rumors pan out, with the 60 SKU coming with 12GB and the 70 SKU with 16GB, versus KatCorgi's chart:
[attached image: rumored spec chart]
 

Karnak

Senior member
Jan 5, 2017
That might be enough. Say the 3070 Ti FE actually hits that: same shader count as the 2080 Super, and 20% faster. A 9% frequency boost plus a 10% IPC gain gets you pretty close.
But don't read the frequency boost off the spec sheet, since Turing has the same 2100MHz core clock limit. That limit has nothing to do with real clock speeds.
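Quick sketch of that compounding, using the post's hypothetical numbers (none of these are confirmed specs):

```python
# Clock and IPC gains multiply rather than add.
freq_gain = 1.09   # assumed 9% higher clocks
ipc_gain = 1.10    # assumed 10% more per-clock throughput
combined = freq_gain * ipc_gain
print(f"combined uplift: {combined:.3f}x (~{(combined - 1) * 100:.0f}%)")
# -> combined uplift: 1.199x (~20%)
```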
 

TESKATLIPOKA

Platinum Member
May 1, 2020
Well, that is assuming there is no more room for improvement in terms of memory efficiency. While I don't think that graph is real, we would just need performance figures from the 3070 and 3080 next month to guess where the 3070 Ti will stand.
I didn't say there won't (or couldn't) be any improvement in terms of memory efficiency, I just don't think it would be 20-25% better compared to the previous generation.
 

CakeMonster

Golden Member
Nov 22, 2012
Quick math on the 4K figures: 20% over the 2080Ti for the 3080, 40% over the 2080Ti for the 3090.

Nope, I obviously don't believe it's real, but at least the numbers are plausible.
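Those two figures also pin down the rumored gap between the two new cards themselves (a quick sketch; the inputs are the unverified graph numbers above):

```python
# Rumored 4K uplifts over the 2080 Ti, taken from the (likely fake) graph.
rtx_3080 = 1.20
rtx_3090 = 1.40
gap = rtx_3090 / rtx_3080
print(f"3090 over 3080: {gap:.3f}x (~{(gap - 1) * 100:.0f}%)")
# -> 3090 over 3080: 1.167x (~17%)
```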
 
  • Like
Reactions: Konan

GodisanAtheist

Diamond Member
Nov 16, 2006
Any word on Samsung yields?

Hopefully Samsung gave NV a sweet enough deal, and yields are high enough, that they can price these parts really aggressively, at least the 3070 line, against the next-gen consoles and RDNA 2.

Consumers are desperately in need of a price war here, and so long as supply holds up and the competition shows up, we may get a good round of cards.
 
  • Like
Reactions: ozzy702

moonbogg

Lifer
Jan 8, 2011
I get the feeling that if 1080Ti people want to stay in a similar price bracket, they may be stuck with the 3060 cards or maybe a lower-model 3070 for about a 15% performance uplift. I'd think that after Turing, Nvidia would want to offer people an upgrade. I can't imagine them leaving so many people with a 1080Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know. If the only impressive upgrade comes in at $1000 or higher, then they can't be interested in making a sale to their traditional enthusiast crowd. They would clearly be moving on to a different customer altogether. It just seems odd to me. Even if the 2080Ti had been $700, I would have passed it up. It didn't bring enough performance to the table over what I already had. If I'm stuck with a 3070Ti at $800 that is basically on par with a 2080Ti, then I'll be skipping 2 entire generations in a row, which I don't recall ever doing before. I can't be alone here. Seems very odd.
 
  • Like
Reactions: Elfear and psolord

ozzy702

Golden Member
Nov 1, 2011
I get the feeling that if 1080Ti people want to stay in a similar price bracket, they may be stuck with the 3060 cards or maybe a lower-model 3070 for about a 15% performance uplift. I'd think that after Turing, Nvidia would want to offer people an upgrade. I can't imagine them leaving so many people with a 1080Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know. If the only impressive upgrade comes in at $1000 or higher, then they can't be interested in making a sale to their traditional enthusiast crowd. They would clearly be moving on to a different customer altogether. It just seems odd to me. Even if the 2080Ti had been $700, I would have passed it up. It didn't bring enough performance to the table over what I already had. If I'm stuck with a 3070Ti at $800 that is basically on par with a 2080Ti, then I'll be skipping 2 entire generations in a row, which I don't recall ever doing before. I can't be alone here. Seems very odd.

You may be correct, although I have to say that speaks more to how great the 1080TI was than anything else. I sold mine as the mining boom was winding down and bought a 2080TI. Just sold the 2080TI and am using an AMD 580 until the 3000 series drops. Random black screens and weirdness, along with a few friends' experiences with the 5700s, make me feel like I'll go green again, even though I've probably owned 3x as many AMD/ATI GPUs over the years as NVIDIA.

Enough rambling. I hope you're wrong and we see competition, better pricing, and some beastly GPUs that can finally push 1440p 144Hz and 4K properly.
 

jpiniero

Lifer
Oct 1, 2010
The RAM thing... perhaps they are saving the 2GB-per-chip modules for Super models next year, if more Ampere is what they are planning. The other thought is that the 'thing on the back' is some NAND, although that sounds stupid.

Any word on Samsung yields?

Hopefully Samsung gave NV a sweet enough deal, and yields are high enough, that they can price these parts really aggressively, at least the 3070 line, against the next-gen consoles and RDNA 2.

Yields should be very good, but they are probably putting a better cooler across the stack to help compensate for not having SS7.
 

DooKey

Golden Member
Nov 9, 2005
I get the feeling that if 1080Ti people want to stay in a similar price bracket, they may be stuck with the 3060 cards or maybe a lower-model 3070 for about a 15% performance uplift. I'd think that after Turing, Nvidia would want to offer people an upgrade. I can't imagine them leaving so many people with a 1080Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know. If the only impressive upgrade comes in at $1000 or higher, then they can't be interested in making a sale to their traditional enthusiast crowd. They would clearly be moving on to a different customer altogether. It just seems odd to me. Even if the 2080Ti had been $700, I would have passed it up. It didn't bring enough performance to the table over what I already had. If I'm stuck with a 3070Ti at $800 that is basically on par with a 2080Ti, then I'll be skipping 2 entire generations in a row, which I don't recall ever doing before. I can't be alone here. Seems very odd.

Times they are a-changin'. Prior history doesn't mean squat anymore when it comes to GPUs. If you are stuck on a budget of what the high end used to cost, then you aren't buying high end anymore. Raise your budget or don't buy for a while. Until NV stops making money, they aren't changing course.

Is what it is. At this point people like you aren't hurting their bottom line.
 
  • Like
Reactions: coercitiv

moonbogg

Lifer
Jan 8, 2011
Times they are a-changin'. Prior history doesn't mean squat anymore when it comes to GPUs. If you are stuck on a budget of what the high end used to cost, then you aren't buying high end anymore. Raise your budget or don't buy for a while. Until NV stops making money, they aren't changing course.

Is what it is. At this point people like you aren't hurting their bottom line.

What's the most you would spend on a GPU? Everyone has a limit. The 2080Ti was $1200. How high would you personally go? I'll guess and see how close I can get. I guess your limit is right around...$3500 for a single GPU. Am I close?
 

DooKey

Golden Member
Nov 9, 2005
What's the most you would spend on a GPU? Everyone has a limit. The 2080Ti was $1200. How high would you personally go? I'll guess and see how close I can get. I guess your limit is right around...$3500 for a single GPU. Am I close?

Nah, $1500 is my limit. I'll probably do that this generation, but I'm really into my C6 Corvette right now and will drop bank on a new stereo system later this fall. I'm slowly moving out of the gaming business on PC.

Anyway, I realize everyone has a limit and some have obviously reached that limit. However, pining for what once was in GPU pricing is a losing battle.
 

IntelUser2000

Elite Member
Oct 14, 2003
Is what it is. At this point people like you aren't hurting their bottom line.

But it did hurt their bottom line. Both Nvidia and Apple. Both had to reduce prices.

The 1080 Ti only looked good because the initial 1080 wasn't that good a deal either. People also forget that the 1080's price raise and the FE happened before the mining boom. The timing between the two was close enough that people later just got an excuse to blame the increase on mining.

Gone are the days where the smaller competitor offered much better perf/$. They are better, but they follow the pricing of the more expensive competitor. In that way, they wised up.
 

DooKey

Golden Member
Nov 9, 2005
But it did hurt their bottom line. Both Nvidia and Apple. Both had to reduce prices.

The 1080 Ti only looked good because the initial 1080 wasn't that good a deal either. People also forget that the 1080's price raise and the FE happened before the mining boom. The timing between the two was close enough that people later just got an excuse to blame the increase on mining.

Gone are the days where the smaller competitor offered much better perf/$. They are better, but they follow the pricing of the more expensive competitor. In that way, they wised up.


Hmmm, Nvidia seems to be doing well in net profit. The high end for GPUs isn't the money maker. The high end is all about mindshare, and Nvidia has that in spades. The low-to-mid end is where money is made. AMD/Nvidia compete well there and make bank with OEMs and consumers at the under-$350 mark.
 

IntelUser2000

Elite Member
Oct 14, 2003
Hmmm, Nvidia seems to be doing well in net profit. The high end for GPUs isn't the money maker. The high end is all about mindshare, and Nvidia has that in spades. The low-to-mid end is where money is made. AMD/Nvidia compete well there and make bank with OEMs and consumers at the under-$350 mark.

Yes, but they did have to drop prices, same as Apple. A month or two after launch, news started emerging of people complaining about Turing, and iPhone sales drastically declined. I had forgotten about Apple, but for Nvidia it affected their quarterly earnings.

Halo products affect the whole product line, so their effect is more than just their revenue contribution. This is why I believe low-end systems ship with dGPUs that are no better than the iGPUs in the CPUs they're paired with. All about that branding.

Also, for a small competitor like AMD, low-to-mid might make up the majority of revenue, but bigger companies focus on premium because that's where it really makes an impact: margin and revenue. Focusing on the low end and value is eventually a losing proposition, because price is the only thing you can use to stand out.

Maybe with Ampere, Nvidia thinks a price increase will be accepted because the premiere feature, ray tracing, will actually be usable, while on Turing it wasn't (so you had a useless feature on top of the high prices). I see Turing as the GeForce 256 or GeForce 3, and Ampere as the GeForce 2 or GeForce 4: the former introduced the technology, but the latter made it matter.

If the pricing is that high, though ($800 for RTX 2080 Ti performance), it could very well be that they miscalculated and they'll take a hit again.
 

moonbogg

Lifer
Jan 8, 2011
Nvidia "mindshare" is waning. It's being replaced with each generation with more and more frustration. We've seen this in other areas of PC tech recently As soon as a viable alternative was presented, people rushed to buy it and they still do, even though the other side still has faster products in many workloads for a similar price. Mindshare has its foundation in the forums, on social media, on youtube, and in games on-line. All I've heard lately is frustration with Nvidia pricing and lack of reasonable upgrade options in the upper end. I ask people in-game "who's getting an RTX 3000 series?" Immediately the replies are "people with money" or some other negative sounding response. They didn't even bother to consider there might be a cheaper option than the top models because they have been conditioned to see Nvidia as nothing but too expensive and highly exclusive. They often don't sound excited about the tech at all. They just feel like Nvidia isn't for them even though they used to be. Once that hate-coaster gets rolling, it's going to plow over Nvidia's "mindshare" as soon as there is a viable alternative. They've been letting the dam build pressure and all someone has to do is hit it hard enough to cause a crack. We've seen this happen just recently in other areas.
 

Stuka87

Diamond Member
Dec 10, 2010
You may be correct, although I have to say that speaks more to how great the 1080TI was than anything else. I sold mine as the mining boom was winding down and bought a 2080TI. Just sold the 2080TI and am using an AMD 580 until the 3000 series drops. Random black screens and weirdness, along with a few friends' experiences with the 5700s, make me feel like I'll go green again, even though I've probably owned 3x as many AMD/ATI GPUs over the years as NVIDIA.

Enough rambling. I hope you're wrong and we see competition, better pricing, and some beastly GPUs that can finally push 1440p 144Hz and 4K properly.

Second time you have posted this in this very thread, but I have not seen any thread started to work out whatever issue you are seeing. Polaris cards are in no way known for black screen issues, which means it could be driver corruption or a hardware issue. As for the 5700, there were plenty of driver issues early on, but those have all been fixed. It's not very often that a GPU company moves to an entirely new ISA. Prior to Navi, AMD had not done it since the 7000 series. nVidia has not done it since Kepler. Driver growing pains are to be expected, if not exactly wanted.
 

IntelUser2000

Elite Member
Oct 14, 2003
Nvidia "mindshare" is waning. It's being replaced with each generation with more and more frustration.

We're definitely in those times where lots of interesting things happen. GPU gains across many companies are going to be the largest in decades, we have real competition in the CPU space, and people are pushing back against established large companies.

I know it's about more than just tech, but for the purpose of not going off topic I'll not go further.

It's too bad, because other than the rumored pricing, Ampere sounds like a fantastic product. I get the hype around ray tracing too. If they improve it so the losses are small, then why the freak not?
 

ozzy702

Golden Member
Nov 1, 2011
Second time you have posted this in this very thread, but I have not seen any thread started to work out whatever issue you are seeing. Polaris cards are in no way known for black screen issues, which means it could be driver corruption or a hardware issue. As for the 5700, there were plenty of driver issues early on, but those have all been fixed. It's not very often that a GPU company moves to an entirely new ISA. Prior to Navi, AMD had not done it since the 7000 series. nVidia has not done it since Kepler. Driver growing pains are to be expected, if not exactly wanted.

This is a big thread derail, I guess I can't mention AMD in anything but a positive light despite the fact that I've used AMD GPUs in more systems than I can count... I'll make sure to only talk about NVIDIA from here on out. ;)

I ran DDU and did a clean install of drivers... twice; neither solved the problem. I'd come back to my computer and it would be on, but the screens weren't receiving any input and I had no ability to remote into it. The only temp fix that has worked is a custom fan profile via Afterburner that keeps the fans from spinning all the way down, which was a recommendation I found online. Could be hardware related for sure, although it's 100% stable when in use.

Suffice it to say I've also had increased issues with games crashing (especially older games), not just on my 580 but on the 480 and 570 in my kids' boxes. My experience is obviously anecdotal, and I don't think it's annoying enough to pay significantly more for NVIDIA, but if performance is in the same ballpark at a given price, NVIDIA is going to get my money for my personal box.
 

Konan

Senior member
Jul 28, 2017
So what are you guys' feelings on a paper launch with GDDR6X then?

When the GTX 1080 came out, Nvidia used the brand-new GDDR5X; however, they then used plain GDDR5 for the GTX 1070. Are we going to see a repeat?
Is GDDR6X just sampling, so supply is short?

Micron’s roadmap also highlights the potential for a 16Gb GDDR6X in 2021 with the ability to reach up to 24Gb/s.
I read this and think that if there are any 16GB cards, they'll come later on, in the new calendar year.
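For a rough sense of what those per-pin rates mean at the card level, here's a sketch; the bus widths are my assumptions for illustration, not confirmed specs:

```python
# Total bandwidth (GB/s) = per-pin rate (Gb/s) * bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(per_pin_gbps: float, bus_width_bits: int) -> float:
    return per_pin_gbps * bus_width_bits / 8

print(bandwidth_gb_s(19, 320))  # 760.0 GB/s: launch-rate GDDR6X on an assumed 320-bit bus
print(bandwidth_gb_s(24, 384))  # 1152.0 GB/s: the 24Gb/s roadmap rate on an assumed 384-bit bus
```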
 

Stuka87

Diamond Member
Dec 10, 2010
So what are you guys' feelings on a paper launch with GDDR6X then?

When the GTX 1080 came out, Nvidia used the brand-new GDDR5X; however, they then used plain GDDR5 for the GTX 1070. Are we going to see a repeat?
Is GDDR6X just sampling, so supply is short?


I read this and think that if there are any 16GB cards, they'll come later on, in the new calendar year.

I am willing to bet it won't be 6X across the board. Lower-end cards just don't need it, so why put more expensive memory on them? It is really strange that GDDR6X never actually had a launch. Maybe it's because Computex didn't happen, but a press release would have sufficed.
 

beginner99

Diamond Member
Jun 2, 2009
I can't imagine them leaving so many people with a 1080Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know.

I agree. With the insane pricing, sales will simply go down overall. People will have to choose slower models, which means it takes longer for an upgrade to be worth it. At this point I wonder if they even made any money from the TU102 die. Yes, Quadros sell at much higher prices, but that market is a lot smaller than the gaming market.

The high end is all about mindshare, and Nvidia has that in spades. The low-to-mid end is where money is made.

The only thing that makes sense of the insane pricing is that people are willing to pay $300-$400 for tiny entry-level dies (cards). Looking at you, Navi10. So the 102 die is mostly there to inflate all the other prices. If your top card is $1200, the next one can't just be $500; it must be $800 at least. And so forth. So all levels get inflated prices. The 102 die doesn't even need to be profitable in itself, because it fuels the profits made on overpriced tiny ~250mm² dies. If every die in the mass market sells for $50-$100 more, that's a huge win.
 

A///

Diamond Member
Feb 24, 2017
I got real lucky with a misprinted price sticker on an 8800GTX back when they'd just come out. I remember the store manager wasn't really happy with the stock guys who'd messed up. I didn't think GPUs would get much more expensive than the then-MSRP of the 8800GTX. The 2070 launched at that same price, but the Super came in a hundred bucks cheaper. The GPU landscape needs some serious reshuffling to get prices down again. It's a buyer's market, but some much-needed sanity is in order.
 

Glo.

Diamond Member
Apr 25, 2015
84 SM Ampere GPU - 40% rasterization perf. improvement over RTX 2080 Ti, 375W TGP - RTX 3090
68 SM GPU - 10-15% rasterization improvement over RTX 2080 Ti, 320W TGP - RTX 3080
48 SM GPU - 10% rasterization improvement over RTX 2080 Super - RTX 3070 Ti
40 SM GPU - RTX 2080 performance - RTX 3070
36 SM GPU - RTX 2070 Super performance - RTX 3060 Ti
30 SM GPU - RTX 2070 performance - RTX 3060
24 SM GPU - RTX 2060 performance - RTX 3050 Ti
20 SM GPU - GTX 1660 Ti performance - RTX 3050

RTX 3050 - $159
RTX 3050 Ti - $199
RTX 3060 - $249
RTX 3060 Ti - $299
RTX 3070 - $349
RTX 3070 Ti - $449
RTX 3080 - $599
RTX 3090 - $999

This is my prediction on price and performance targets for Ampere GPUs.
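A quick sanity check on the top entry, assuming performance scales linearly with SM count (it never quite does, so treat this as a sketch):

```python
# The RTX 2080 Ti (TU102) ships with 68 SMs enabled.
turing_sms = 68
ampere_sms = 84
predicted_uplift = 1.40  # the 40% figure above; pure speculation

sm_scaling = ampere_sms / turing_sms         # gain from SM count alone
per_sm_gain = predicted_uplift / sm_scaling  # what clocks + IPC would have to supply
print(f"SM scaling: {sm_scaling:.3f}x, implied per-SM gain: {per_sm_gain:.3f}x")
# -> SM scaling: 1.235x, implied per-SM gain: 1.133x
```

In other words, the 3090 target needs roughly a 13% per-SM improvement on top of the extra SMs, which is in the same ballpark as the clocks-plus-IPC math earlier in the thread.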
 
  • Like
Reactions: Tarkin77

jpiniero

Lifer
Oct 1, 2010
84 SM Ampere GPU - 40% rasterization perf. improvement over RTX 2080 Ti, 375W TGP - RTX 3090
68 SM GPU - 10-15% rasterization improvement over RTX 2080 Ti, 320W TGP - RTX 3080
48 SM GPU - 10% rasterization improvement over RTX 2080 Super - RTX 3070 Ti
40 SM GPU - RTX 2080 performance - RTX 3070
36 SM GPU - RTX 2070 Super performance - RTX 3060 Ti
30 SM GPU - RTX 2070 performance - RTX 3060
24 SM GPU - RTX 2060 performance - RTX 3050 Ti
20 SM GPU - GTX 1660 Ti performance - RTX 3050

RTX 3050 - $159
RTX 3050 Ti - $199
RTX 3060 - $249
RTX 3060 Ti - $299
RTX 3070 - $349
RTX 3070 Ti - $449
RTX 3080 - $599
RTX 3090 - $999

This is my prediction on price and performance targets for Ampere GPUs.

Prices and performance are too low.