'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much of a gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for; I'm just interested in the forum members' thoughts.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
Well, the chart from Chiphell could be completely user-made. That said, a 3060 with a proposed 6GB on the same level as a 1080TI 11GB is interesting.
It's very likely that graph is fake. I would be very surprised if the RTX 3060 had only 6GB of VRAM; such a card wouldn't be future-proof. Everyone remembers what happened with the GTX 1060 3GB.
 
  • Like
Reactions: Mopetar

JasonLD

Senior member
Aug 22, 2017
In my opinion, the 2080Ti isn't bottlenecked, and I don't believe the 3070Ti has only 512GB/s if its performance is on par with the 2080Ti.

Well, that is assuming there is no more room for improvement in terms of memory efficiency. While I don't think that graph is real, we would just need performance figures from the 3070 and 3080 next month to guess where the 3070 Ti will stand.
 

Karnak

Senior member
Jan 5, 2017
That might be enough. Say the 3070 Ti FE actually hits that... same shader count as 2080 Super, and it's 20% faster. 9% frequency boost plus 10% IPC gain gets you pretty close.
But it won't be a frequency boost since Turing has the same 2100MHz core clock limit. That limit has nothing to do with real clock speeds.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
Well, that is assuming there is no more room for improvement in terms of memory efficiency. While I don't think that graph is real, we would just need performance figures from the 3070 and 3080 next month to guess where the 3070 Ti will stand.
I didn't say there won't (or couldn't) be any improvement in terms of memory efficiency; I just don't think it would be 20-25% better compared to the previous generation.
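For a rough sense of where a 20-25% figure would even come from, here is a minimal sanity check. Assumptions: the 2080 Ti's actual 616GB/s of bandwidth, the rumored 512GB/s for the 3070 Ti from that chart, and the big simplification that performance scales directly with effective bandwidth:

# Rough bandwidth-efficiency sanity check.
# 616 GB/s = 2080 Ti (352-bit bus at 14 Gbps GDDR6); 512 GB/s = rumored 3070 Ti.
bw_2080ti = 616.0
bw_3070ti = 512.0

# Efficiency gain the 3070 Ti would need to feed 2080 Ti-level performance
# from less raw bandwidth, all else being equal.
needed_gain = (bw_2080ti / bw_3070ti - 1) * 100
print(f"Required memory-efficiency improvement: ~{needed_gain:.0f}%")  # ~20%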
 

CakeMonster

Golden Member
Nov 22, 2012
Quick math on the 4K figures: 20% over the 2080Ti for the 3080, 40% over the 2080Ti for the 3090.

Nope, I obviously don't believe it's real, but at least the numbers are plausible.
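For what it's worth, that "quick math" is just relative uplift over the 2080 Ti. A minimal sketch, using made-up 4K averages since the chart's actual figures aren't reproduced here:

# Hypothetical 4K average fps, purely for illustration (not from the chart).
fps_2080ti = 60.0
fps_3080 = 72.0
fps_3090 = 84.0

def uplift(new, baseline):
    # Percentage gain of 'new' over 'baseline'.
    return (new / baseline - 1) * 100

print(f"3080 over 2080 Ti: ~{uplift(fps_3080, fps_2080ti):.0f}%")  # ~20%
print(f"3090 over 2080 Ti: ~{uplift(fps_3090, fps_2080ti):.0f}%")  # ~40%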
 
  • Like
Reactions: Konan

GodisanAtheist

Diamond Member
Nov 16, 2006
Any word on Samsung yields?

Hopefully they gave NV a sweet enough deal, and yields are high enough, that they can really price these parts aggressively, at least the 3070 line, against the next-gen consoles and RDNA 2.

Consumers are desperately in need of a price war here, and as long as supply holds up and the competition shows up, we may get a good round of cards.
 
  • Like
Reactions: ozzy702

moonbogg

Lifer
Jan 8, 2011
I get the feeling that if 1080Ti people want to stay in a similar price bracket, they may be stuck with the 3060 cards or maybe a lower-model 3070 for about a 15% performance uplift. I'd think that after Turing, Nvidia would want to offer people an upgrade. I can't imagine them leaving so many people with a 1080Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know. If the only impressive upgrade comes in at $1000 or higher, then they can't be interested in making a sale to their traditional enthusiast crowd. They would clearly be moving on to a different customer altogether. It just seems odd to me. Even if the 2080Ti had been $700, I would have passed it up. It didn't bring enough performance to the table over what I already had. If I'm stuck with a 3070Ti at $800 that is basically on par with a 2080Ti, then I'll be skipping 2 entire generations in a row, which I don't recall ever doing before. I can't be alone here. Seems very odd.
 
  • Like
Reactions: Elfear and psolord

ozzy702

Golden Member
Nov 1, 2011
I get the feeling that if 1080Ti people want to stay in a similar price bracket, they may be stuck with the 3060 cards or maybe a lower-model 3070 for about a 15% performance uplift. I'd think that after Turing, Nvidia would want to offer people an upgrade. I can't imagine them leaving so many people with a 1080Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know. If the only impressive upgrade comes in at $1000 or higher, then they can't be interested in making a sale to their traditional enthusiast crowd. They would clearly be moving on to a different customer altogether. It just seems odd to me. Even if the 2080Ti had been $700, I would have passed it up. It didn't bring enough performance to the table over what I already had. If I'm stuck with a 3070Ti at $800 that is basically on par with a 2080Ti, then I'll be skipping 2 entire generations in a row, which I don't recall ever doing before. I can't be alone here. Seems very odd.

You may be correct, although I have to say that speaks more to how great the 1080TI was than anything else. I sold mine as the mining boom was winding down and bought a 2080TI. Just sold the 2080TI and am using an AMD 580 until the 3000 series drops. Random black screens and weirdness, along with a few friends' experiences with the 5700s, make me feel like I'll go green again, even though I've probably owned 3x as many AMD/ATI GPUs over the years as NVIDIA.

Enough rambling. I hope you're wrong and we see competition, better pricing, and some beastly GPUs that can finally push 1440p 144Hz and 4K properly.
 

jpiniero

Lifer
Oct 1, 2010
The RAM thing... perhaps they are saving the 2 GB per for Super models next year, if more Ampere is what they are planning. The other thought is that the 'thing on the back' is some NAND, although that sounds stupid.

Any word on Samsung yields?

Hopefully they gave NV a sweet enough deal, and yields are high enough, that they can really price these parts aggressively, at least the 3070 line, against the next-gen consoles and RDNA 2.

Yields should be very good, but they are probably putting a better cooler across the stack to help compensate for not having SS7.
 

DooKey

Golden Member
Nov 9, 2005
I get the feeling that if 1080Ti people want to stay in a similar price bracket, they may be stuck with the 3060 cards or maybe a lower-model 3070 for about a 15% performance uplift. I'd think that after Turing, Nvidia would want to offer people an upgrade. I can't imagine them leaving so many people with a 1080Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know. If the only impressive upgrade comes in at $1000 or higher, then they can't be interested in making a sale to their traditional enthusiast crowd. They would clearly be moving on to a different customer altogether. It just seems odd to me. Even if the 2080Ti had been $700, I would have passed it up. It didn't bring enough performance to the table over what I already had. If I'm stuck with a 3070Ti at $800 that is basically on par with a 2080Ti, then I'll be skipping 2 entire generations in a row, which I don't recall ever doing before. I can't be alone here. Seems very odd.

Times they are a-changing. Prior history doesn't mean squat anymore when it comes to GPUs. If you are stuck on a budget of what the high end used to cost, then you aren't buying high end anymore. Raise your budget or don't buy for a while. Until NV stops making money, they aren't changing course.

Is what it is. At this point people like you aren't hurting their bottom line.
 
  • Like
Reactions: coercitiv

moonbogg

Lifer
Jan 8, 2011
Times they are a-changing. Prior history doesn't mean squat anymore when it comes to GPUs. If you are stuck on a budget of what the high end used to cost, then you aren't buying high end anymore. Raise your budget or don't buy for a while. Until NV stops making money, they aren't changing course.

Is what it is. At this point people like you aren't hurting their bottom line.

What's the most you would spend on a GPU? Everyone has a limit. The 2080Ti was $1200. How high would you personally go? I'll guess and see how close I can get. I guess your limit is right around...$3500 for a single GPU. Am I close?
 

DooKey

Golden Member
Nov 9, 2005
What's the most you would spend on a GPU? Everyone has a limit. The 2080Ti was $1200. How high would you personally go? I'll guess and see how close I can get. I guess your limit is right around...$3500 for a single GPU. Am I close?

Nah, $1500 is my limit. I'll probably do that this generation, but I'm really into my C6 Corvette right now and will drop bank on a new stereo system later this fall. I'm slowly moving away from PC gaming.

Anyway, I realize everyone has a limit and some have obviously reached that limit. However, pining for what once was in GPU pricing is a losing battle.
 

IntelUser2000

Elite Member
Oct 14, 2003
Is what it is. At this point people like you aren't hurting their bottom line.

But it did hurt their bottom line. Both Nvidia and Apple. Both had to reduce prices.

The 1080 Ti only looked good because the initial 1080 wasn't that good either. People also forget that the 1080's price increase and FE happened before the mining boom. The timing between the two was close enough that people later got an excuse to blame it on the mining boom.

Gone are the days when the smaller competitor offered much better perf/$. They are better, but they follow the pricing of the more expensive competitor. In that way, they wised up.
 

DooKey

Golden Member
Nov 9, 2005
But it did hurt their bottom line. Both Nvidia and Apple. Both had to reduce prices.

The 1080 Ti only looked good because the initial 1080 wasn't that good either. People also forget that the 1080's price increase and FE happened before the mining boom. The timing between the two was close enough that people later got an excuse to blame it on the mining boom.

Gone are the days when the smaller competitor offered much better perf/$. They are better, but they follow the pricing of the more expensive competitor. In that way, they wised up.


Hmmm, Nvidia seems to be doing well in net profit. The high end for GPUs isn't the money maker; the high end is all about mindshare, and Nvidia has that in spades. The low-to-mid end is where money is made. AMD/Nvidia compete well there and make bank with OEMs and consumers at the under-$350 mark.
 

IntelUser2000

Elite Member
Oct 14, 2003
Hmmm, Nvidia seems to be doing well in net profit. The high end for GPUs isn't the money maker; the high end is all about mindshare, and Nvidia has that in spades. The low-to-mid end is where money is made. AMD/Nvidia compete well there and make bank with OEMs and consumers at the under-$350 mark.

Yes, but they did have to drop prices, same as Apple. A month or two after launch, news started emerging that people were complaining about Turing and that iPhone sales were drastically declining. I don't remember the specifics for Apple, but for Nvidia it affected their quarterly earnings.

Halo products affect the whole product line, so their effect is bigger than just their direct revenue contribution. This is why I believe low-end systems get dGPUs that are no better than the iGPUs in the CPUs they're paired with. It's all about branding.

Also, for a small competitor like AMD, low-to-mid might make up the majority of revenue, but bigger companies focus on the premium segment because that's where the real impact is: margin and revenue. Focusing on the low end and value is eventually a losing proposition, because price is the only thing you can use to stand out.

Maybe with Ampere, Nvidia thinks a price increase will be accepted because the premier feature, ray tracing, will actually be usable, whereas on Turing it wasn't (so you had a useless feature on top of the high prices). I see Turing as either the GeForce 256 or the GeForce 3, and Ampere as the GeForce 2 or the GeForce 4: the former introduced the technology, but the latter made it matter.

If the pricing really is that high, though ($800 for RTX 2080 Ti performance), it could very well be that they miscalculated and will take a hit again.
 

moonbogg

Lifer
Jan 8, 2011
Nvidia "mindshare" is waning. It's being replaced with each generation with more and more frustration. We've seen this in other areas of PC tech recently As soon as a viable alternative was presented, people rushed to buy it and they still do, even though the other side still has faster products in many workloads for a similar price. Mindshare has its foundation in the forums, on social media, on youtube, and in games on-line. All I've heard lately is frustration with Nvidia pricing and lack of reasonable upgrade options in the upper end. I ask people in-game "who's getting an RTX 3000 series?" Immediately the replies are "people with money" or some other negative sounding response. They didn't even bother to consider there might be a cheaper option than the top models because they have been conditioned to see Nvidia as nothing but too expensive and highly exclusive. They often don't sound excited about the tech at all. They just feel like Nvidia isn't for them even though they used to be. Once that hate-coaster gets rolling, it's going to plow over Nvidia's "mindshare" as soon as there is a viable alternative. They've been letting the dam build pressure and all someone has to do is hit it hard enough to cause a crack. We've seen this happen just recently in other areas.
 

Stuka87

Diamond Member
Dec 10, 2010
You may be correct, although I have to say that speaks more to how great the 1080TI was than anything else. I sold mine as the mining boom was winding down and bought a 2080TI. Just sold the 2080TI and am using an AMD 580 until the 3000 series drops. Random black screens and weirdness, along with a few friends' experiences with the 5700s, make me feel like I'll go green again, even though I've probably owned 3x as many AMD/ATI GPUs over the years as NVIDIA.

Enough rambling. I hope you're wrong and we see competition, better pricing, and some beastly GPUs that can finally push 1440p 144Hz and 4K properly.

Second time you have posted this in this very thread, but I have not seen any thread started to work out whatever issue you are seeing. Polaris cards are in no way known for black-screen issues, which means it could be driver corruption or a hardware issue. As for the 5700, there were plenty of driver issues early on, but those have all been fixed. It's not very often that a GPU company moves to an entirely new ISA: prior to Navi, AMD had not done it since the 7000 series, and Nvidia has not done it since Kepler. Driver growing pains are to be expected, if not exactly wanted.
 

IntelUser2000

Elite Member
Oct 14, 2003
Nvidia "mindshare" is waning. It's being replaced with each generation with more and more frustration.

We're definitely living in times where lots of interesting things are happening. GPU gains across multiple companies are going to be the largest in decades, we have multiple competitors in the CPU space, and people are pushing back against established large companies.

I know it's about more than just tech, but for the sake of not going off topic I'll not go further.

It's too bad, because other than the rumored pricing, Ampere sounds like a fantastic product. I get the hype around ray tracing too. If they improve it so the performance losses are small, then why the freak not?