Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much of a gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper price points while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136

Jensen is also doing a GTC Keynote on October 5th. Probably Ampere Quadros is the focus but they could also talk about the second tier Ampere Gaming.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It'd have to be either 2 GB chips or two 1 GB chips sharing the same bus.

It does make sense from a financial standpoint; they can charge something in the range of the Ti.

Micron stated the cards would all have a base of 12GB. If this is true (and it should be, given the source), doubling the memory would bring you to 24GB, not 20. The only way to end up with 20 is to gimp the bus in some way. And while nVidia has done this before (GTX 970), I'm not sure they want to go down that road again.

Now, if the non-Founders cards do have 10GB, that would mean they're running a different memory bus than the FE cards, which would be really weird and something I don't think we've ever seen, as it would mean AIB cards would have less memory bandwidth than the FE cards.
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136
Micron stated the cards would all have a base of 12GB. If this is true (and it should be, given the source), doubling the memory would bring you to 24GB, not 20. The only way to end up with 20 is to gimp the bus in some way. And while nVidia has done this before (GTX 970), I'm not sure they want to go down that road again.

The 3080 is 320 bit, so 10 (or 20). The 3090 is 384, so 12 or 24.
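
For anyone following the bus-width arithmetic in the last couple of posts, here's a rough sketch, assuming 32-bit-wide GDDR6X packages in 1 GB or 2 GB densities (the capacities everyone is quoting fall straight out of it):

```python
# Rough sketch: each GDDR6/6X package presents a 32-bit interface, so
# chip count = bus width / 32 and capacity = chip count * per-chip density.
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32
    return {density: chips * density for density in densities_gb}

print(vram_options(320))  # {1: 10, 2: 20} -> a 320-bit 3080: 10 GB or 20 GB
print(vram_options(384))  # {1: 12, 2: 24} -> a 384-bit 3090: 12 GB or 24 GB
```

So if the 3080 really is a cut-down 320-bit part rather than the full 384-bit bus, 10 GB and 20 GB both fall out naturally, with no GTX 970-style asymmetry needed.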
 
  • Like
Reactions: Stuka87 and Krteq

Tup3x

Golden Member
Dec 31, 2016
1,086
1,084
136
Ahh, that tweet did specifically state the 3080 at the bottom. It would still be very strange to put that much memory on that card.
I'd imagine they'll launch a 20GB version if AMD has cards with more VRAM. For marketing, a bigger number is better.
 

FaaR

Golden Member
Dec 28, 2007
1,056
412
136
The only way to end up with 20 is to gimp the bus in some way. And while nVidia has done this before (GTX 970), I'm not sure they want to go down that road again.
They've done it with the 1080 Ti and 2080 Ti as well; 11GB instead of 12GB...

The 1070 gimp was more serious, because accessing the last GB of RAM on that GPU led to a huge bandwidth loss due to the way the memory controller hardware was designed internally. That's what caused all the online ruckus, with subsequent egg on NV's face that time around; they surely wouldn't repeat that mistake years later... :)
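
For reference, the card actually being described (the GTX 970, as gets pointed out a few posts down) exposed its 4 GB as a 3.5 GB segment on a 224-bit path plus a 0.5 GB segment on a 32-bit path; a back-of-the-envelope sketch of the bandwidth gap, assuming its 7 Gbps GDDR5:

```python
# Back-of-the-envelope bandwidth for the GTX 970's two memory segments,
# assuming 7 Gbps GDDR5: bandwidth (GB/s) = data rate * bus width / 8.
def segment_bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
    return data_rate_gbps * bus_width_bits / 8

print(segment_bandwidth_gbs(224))  # ~196 GB/s for the 3.5 GB fast segment
print(segment_bandwidth_gbs(32))   # ~28 GB/s for the 0.5 GB slow segment
```

Touching that last half-gigabyte meant dropping to roughly a seventh of the bandwidth, which is what set off the ruckus.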
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
If that slide proves accurate, I'm going to skip this release and wait for the 3080Ti. I'm not paying $800 for a non-Ti card and no way I'd ever pay $1400 for any GPU. I don't think this release is going to work for me. Also, if the slide is accurate, look at what they've done to x70 buyers! $500 for a 70-class card? If I said what I really thought about that I'd get banned from Anandtech forever.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
If that slide proves accurate, I'm going to skip this release and wait for the 3080Ti. I'm not paying $800 for a non-Ti card and no way I'd ever pay $1400 for any GPU. I don't think this release is going to work for me. Also, if the slide is accurate, look at what they've done to x70 buyers! $500 for a 70-class card? If I said what I really thought about that I'd get banned from Anandtech forever.

From the sounds of it, there won't be a 3080 Ti. The 3090 fills that spot.
 

MrTeal

Diamond Member
Dec 7, 2003
3,614
1,816
136
$800 would be tough to swallow for the 3080 if that's MSRP and not FE pricing. It wouldn't be surprising, but I wouldn't be rushing out the door to replace my 1080 Ti at that price if it is a small uplift over the 2080 Ti.
 
  • Like
Reactions: Elfear and moonbogg

CastleBravo

Member
Dec 6, 2019
120
271
136
If that slide proves accurate, I'm going to skip this release and wait for the 3080Ti. I'm not paying $800 for a non-Ti card and no way I'd ever pay $1400 for any GPU. I don't think this release is going to work for me. Also, if the slide is accurate, look at what they've done to x70 buyers! $500 for a 70-class card? If I said what I really thought about that I'd get banned from Anandtech forever.

No idea if those prices are real, but I guarantee you that if AMD is late to the party, NV are going to fleece the early adopter fanboy edition customers as usual. Unless RDNA2 is a huge flop, I would expect those prices to come down around holiday season.
 
  • Like
Reactions: moonbogg

JasonLD

Senior member
Aug 22, 2017
487
447
136
The 1070 gimp was more serious, because accessing the last GB of RAM on that GPU led to a huge bandwidth loss due to the way the memory controller hardware was designed internally. That's what caused all the online ruckus, with subsequent egg on NV's face that time around; they surely wouldn't repeat that mistake years later... :)

Uh... are you talking about the 970?
 
  • Like
Reactions: psolord and ozzy702

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
No idea if those prices are real, but I guarantee you that if AMD is late to the party, NV are going to fleece the early adopter fanboy edition customers as usual. Unless RDNA2 is a huge flop, I would expect those prices to come down around holiday season.

Then I got no problem waiting for RDNA2 and the holiday season. Truly, at those prices, waiting is more than fine with me. I've had it with this ridiculous crap. Also, if my 1080Ti replacement now costs $1400, then a console it really is. They are going to be awesome this time and I'd have a blast playing with my kids. OMG imagine spending half as much on a console as a single GPU and having a blast, lol. Actually, I could just about buy a console AND a new TV for not much more than $1400. They have to be kidding. The value proposition is filthy.
 
  • Like
Reactions: raghu78 and ozzy702

Konan

Senior member
Jul 28, 2017
360
291
106
No idea if those prices are real, but I guarantee you that if AMD is late to the party, NV are going to fleece the early adopter fanboy edition customers as usual. Unless RDNA2 is a huge flop, I would expect those prices to come down around holiday season.

I suspect a paper launch because of the short supply of GDDR6X and the time it will take to ramp that up properly. The new memory is also likely expensive. Put it all together and short supply will keep prices high for a while...
 

Bouowmx

Golden Member
Nov 13, 2016
1,147
551
146
I guess GeForce RTX 3070 [Ti] for 500 USD is alright. Almost 2080 Ti performance, more memory, a lot lower price. This tier may again be the "balanced" upper-end tier, like the 2070 S.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Then I got no problem waiting for RDNA2 and the holiday season. Truly, at those prices, waiting is more than fine with me. I've had it with this ridiculous crap. Also, if my 1080Ti replacement now costs $1400, then a console it really is. They are going to be awesome this time and I'd have a blast playing with my kids. OMG imagine spending half as much on a console as a single GPU and having a blast, lol. Actually, I could just about buy a console AND a new TV for not much more than $1400. They have to be kidding. The value proposition is filthy.

If Nvidia thinks they can continue to increase GPU prices with Ampere, they are going to be in for a rude awakening. A 12 TF Xbox Series X with 8 Zen2 cores at 3.66 GHz (with SMT) is going to provide better gaming perf than an RTX 3060 and maybe even match an RTX 3070, if we assume the current Ampere rumours are true. At a rumoured price of $500, the Series X is going to provide a level of CPU and GPU performance unseen in a game console. Then comes the actual competition from RDNA2 on PC, which is going to be aggressive given what we know about RDNA2 area and power efficiency (from the claimed 1x GPU power draw vs the Xbox One X).
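
For reference, the 12 TF figure above comes from the usual peak-FP32 arithmetic; a quick sketch using the published Series X numbers (52 CUs at 1.825 GHz, with an FMA counted as two ops per clock):

```python
# Peak FP32 throughput: CUs * 64 shaders/CU * 2 ops per clock (FMA) * clock.
def peak_tflops(compute_units, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    return compute_units * shaders_per_cu * ops_per_clock * clock_ghz / 1000

print(peak_tflops(52, 1.825))  # ~12.15 TFLOPS for the Xbox Series X
```

Whether that peak number actually translates into RTX 3060/3070-class gaming performance is of course the speculative part.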
 

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136
If Nvidia thinks they can continue to increase GPU prices with Ampere, they are going to be in for a rude awakening. A 12 TF Xbox Series X with 8 Zen2 cores at 3.66 GHz (with SMT) is going to provide better gaming perf than an RTX 3060 and maybe even match an RTX 3070, if we assume the current Ampere rumours are true. At a rumoured price of $500, the Series X is going to provide a level of CPU and GPU performance unseen in a game console. Then comes the actual competition from RDNA2 on PC, which is going to be aggressive given what we know about RDNA2 area and power efficiency (from the claimed 1x GPU power draw vs the Xbox One X).

At $500, Microsoft would be losing money, probably a decent amount. AMD is obviously not going to do that with its dGPUs. People don't seem to understand that part of the console business, I guess.

I would expect high prices from AMD.
 
Last edited:

CastleBravo

Member
Dec 6, 2019
120
271
136
At $500, Microsoft would be losing money, probably a decent amount. AMD is obviously not going to do that with its dGPUs. People don't seem to understand that part of the console business, I guess.

I would expect high prices from AMD.

True, but people aren't going to shell out for an NV card instead of buying an Xbox just because MS is selling them at zero margin or a slight loss, so his point kinda stands.