Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 84 - AnandTech Forums

Ottonomous

Senior member
May 15, 2014
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

n0x1ous

Platinum Member
Sep 9, 2010
SLI is dead. Take it from someone who had two 2080 Tis for two years, up until last week when I sold one. Maybe 4 or 5 new releases supported it in that time frame that I can think of, one of which was Quake 2 RTX.

With the responsibility for multi-GPU falling on developers under the new APIs, it's over. Devs just don't add support for it, outside of whoever develops the new Tomb Raider games.
 

Konan

Senior member
Jul 28, 2017
That rumor is complete BS.

Want to know why? Look at the GA100 chip. Gaming cards are exactly the same architecture, with specific tweaks for gaming and the GEMM stuff left out.


Many months ago I wrote here that the next-gen gaming cards from Nvidia are not great from a rasterization perspective, but they are vastly better from a ray tracing perspective.

Ray tracing is what has consumed die area in the next-gen gaming cards. People seem to have forgotten that fact.

How do you account for the TimeSpy Extreme results from KatCorgi and KopiteKimi then? Did you see the % differences? The thing is, you can extrapolate the TSE Turing scores (from that synthetic benchmark) and correlate them to gaming performance quite well, with a decent variance. Do the same for the Ampere numbers coming from the Twitter leakers, include a massive variance/buffer, and you still get way more than what you are speculating.

That said, would love the pricing you are forecasting ;)
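Roughly, the extrapolation being described works like this. A minimal sketch with made-up placeholder numbers (not the actual leaked scores); the linear TSE-to-gaming mapping and the 10% buffer are my assumptions, not anything from the leaks:

```python
def extrapolate_gaming_score(turing_tse, turing_gaming, ampere_tse, buffer=0.10):
    """Scale a known TSE-to-gaming ratio onto a new TSE score, minus a safety buffer."""
    ratio = turing_gaming / turing_tse   # how gaming perf tracked TSE on Turing
    estimate = ampere_tse * ratio        # naive linear extrapolation to Ampere
    return estimate * (1.0 - buffer)     # pad for synthetic-vs-game variance

# Placeholder numbers only: a Turing card at 6000 TSE / 100 fps, and a leaked
# Ampere TSE of 9000, would suggest roughly 135 fps even after the 10% buffer.
print(extrapolate_gaming_score(6000, 100, 9000))  # ~135
```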
 

Krteq

Senior member
May 22, 2015
Aren't there some rumors that the 3090 might have two dies? Could we be seeing a return to the traditional xx90 models, aka SLI on one card?

A die on either side of the PCB, with separate heatpipes and a single fan for each?
Wait, where on the leaked RTX 3090 PCB pictures can you see 2 GPUs?
 

moonbogg

Lifer
Jan 8, 2011
I don't think a $2000 card will even get much comparison time on YouTube channels or anywhere else. It will be completely irrelevant to the world of gaming, lol. It would get reviewed once, and some channels probably won't even include it in their benchmarks after that. No one will pretend like people care about a $2000 card. Come on now, they went from the 980 Ti @ $650 to the 1080 Ti @ $700 to the 2080 Ti @ $1200. No way they go to a 3080 @ $1500 and a 3090 @ $2000, lol. I kind of hope they do, just so I can enjoy the backlash and drama.
 

jpiniero

Lifer
Oct 1, 2010
I don't think a $2000 card will even get much comparison time on YouTube channels or anywhere else. It will be completely irrelevant to the world of gaming, lol. It would get reviewed once, and some channels probably won't even include it in their benchmarks after that. No one will pretend like people care about a $2000 card. Come on now, they went from the 980 Ti @ $650 to the 1080 Ti @ $700 to the 2080 Ti @ $1200. No way they go to a 3080 @ $1500 and a 3090 @ $2000, lol. I kind of hope they do, just so I can enjoy the backlash and drama.

I think the $2000 for the 3090 is wrong. I think it'll be $1500.
 

Konan

Senior member
Jul 28, 2017
I think the $2000 for the 3090 is wrong. I think it'll be $1500.

I hope it will be close to that, if not exactly that. I think they are looking at the higher-end stack something like Threadripper pricing, with clear pricing segmentation, and evolving the structure that way.

The current Titan RTX is $2500 and the RTX 2080 Ti $1200. I suspect they want to fill the segment in between with the 3090.

Edit: One point though: those Colorful cards are the Vulcan brand, which is higher end, so I would expect maybe $1799...?
 

maddie

Diamond Member
Jul 18, 2010
Wait, where on leaked RTX 3090 PCB pictures you can see 2 GPUs?
I also saw this speculation elsewhere. What if there really are two dies: not a co-processor, but another full die on the back, for a traditional xx90 type of product?


"Shots of NVIDIA’s RTX 3090 PCB (or perhaps the 3080) have surfaced on the Bilibili forums (now taken down), showing an interesting design. The VRAM dies are on the back-side of the PCB along with a second processor (co-processor?). Furthermore, the NVLink connector is different from the one on the Turing GPUs."
https://www.hardwaretimes.com/nvidi...eportedly-surfaces-double-sided-with-2-chips/
 

MrTeal

Diamond Member
Dec 7, 2003
I also saw this speculation elsewhere. What if there really are two dies: not a co-processor, but another full die on the back, for a traditional xx90 type of product?


"Shots of NVIDIA’s RTX 3090 PCB (or perhaps the 3080) have surfaced on the Bilibili forums (now taken down), showing an interesting design. The VRAM dies are on the back-side of the PCB along with a second processor (co-processor?). Furthermore, the NVLink connector is different from the one on the Turing GPUs."
https://www.hardwaretimes.com/nvidi...eportedly-surfaces-double-sided-with-2-chips/
All of that is basically just speculation based on the leaked image, no different from what people have been doing here. There's nothing to say anything is under that CPU.
If it were a traditional xx90 product, they'd lay out the GPUs on the front side like they traditionally do. Cooling and trace routing for that backside GPU would be a nightmare, for no good reason. You'd almost need a 4-slot card with coolers above and below, or a 3-slot card with hybrid or liquid cooling.
 

Krteq

Senior member
May 22, 2015
The FE has the cooler on both sides. Might be four slots.
:D ...nope, it still takes 2 slots

I also saw this speculation elsewhere. What if there really are two dies: not a co-processor, but another full die on the back, for a traditional xx90 type of product?


"Shots of NVIDIA’s RTX 3090 PCB (or perhaps the 3080) have surfaced on the Bilibili forums (now taken down), showing an interesting design. The VRAM dies are on the back-side of the PCB along with a second processor (co-processor?). Furthermore, the NVLink connector is different from the one on the Turing GPUs."
https://www.hardwaretimes.com/nvidi...eportedly-surfaces-double-sided-with-2-chips/
VRAM is on both sides because those are 8Gbit/1GB per chip -> 24GB = 24 GDDR chips, 12 on each side. And there is NO second GPU/FPGA on the back side of the PCB; there are just SMDs (and one SMD supercap ;) )
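The capacity arithmetic there checks out; a quick sanity check, using only the chip density and count from the post:

```python
# 8 Gbit per chip = 1 GB per chip; 12 chips per side, on both sides of the PCB
gb_per_chip = 8 // 8        # 8 Gbit / 8 bits-per-byte = 1 GB
chips = 12 * 2              # 12 front + 12 back
print(chips * gb_per_chip)  # 24 (GB total)
```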
 

n0x1ous

Platinum Member
Sep 9, 2010
:D ...nope, it still takes 2 slots

VRAM is on both sides because those are 8Gbit/1GB per chip -> 24GB = 24 GDDR chips, 12 on each side. And there is NO second GPU/FPGA on the back side of the PCB; there are just SMDs (and one SMD supercap ;) )
Is the 3090 the new Titan, or is there going to be a 48GB Titan?
 

Elfear

Diamond Member
May 30, 2004
I get the feeling that if 1080 Ti people want to stay in a similar price bracket, they may be stuck with the 3060 cards, or maybe a lower-model 3070, for about a 15% performance uplift. I'd think that after Turing, Nvidia would want to offer people an upgrade. I can't imagine them leaving so many people with a 1080 Ti without a viable upgrade path yet again. Does it make sense for them to do that? I honestly don't know. If the only impressive upgrade comes in at $1000 or higher, then they can't be interested in making a sale to their traditional enthusiast crowd. They would clearly be moving on to a different customer altogether. It just seems odd to me. Even if the 2080 Ti had been $700, I would have passed it up. It didn't bring enough performance to the table over what I already had. If I'm stuck with a 3070 Ti at $800 that is basically on par with a 2080 Ti, then I'll be skipping two entire generations in a row, which I don't recall ever doing before. I can't be alone here. Seems very odd.

I'm in the same boat as you. I'd have a really hard time spending $800 on a new GPU for a 35-40% performance improvement. If 50-60% more performance is going to cost me $1,200, I'll just carry on with my 1080Ti.
 

jpiniero

Lifer
Oct 1, 2010
:D ...nope, it still takes 2 slots

VRAM is on both sides because those are 8Gbit/1GB per chip -> 24GB = 24 GDDR chips, 12 on each side. And there is NO second GPU/FPGA on the back side of the PCB; there are just SMDs (and one SMD supercap ;) )

Ian said the power draw of GDDR6X is 25% more at the higher speed. 24 chips would draw a ton of power, to the point where there might not be room for it.
 

Glo.

Diamond Member
Apr 25, 2015
Ian said the power draw of GDDR6X is 25% more at the higher speed. 24 chips would be a ton of power to the point where there might not be room.
Interesting. That basically means the memory subsystem alone draws 60W per 12 memory chips.

If those GPUs have 24 memory chips, that means 120W of power draw from the memory subsystem alone.

o_O
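Spelled out, that arithmetic looks like this. Note the 5W-per-chip figure is just the "60W per 12 chips" claim divided down, not a published spec:

```python
watts_per_chip = 60 / 12  # implied ~5 W per GDDR6X chip at the higher speed
for chips in (12, 24):
    print(chips, "chips ->", chips * watts_per_chip, "W")
# 12 chips -> 60.0 W
# 24 chips -> 120.0 W
```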
 

Krteq

Senior member
May 22, 2015
Ian said the power draw of GDDR6X is 25% more at the higher speed. 24 chips would be a ton of power to the point where there might not be room.
So, check that leaked Colorful PCB picture again. There are 12 GDDR6(X) chips on the back side plus another 12 on the front side... or do you think nV will place memory chips on the back side only with consumer Ampere cards?
 

jpiniero

Lifer
Oct 1, 2010
So, check that leaked Colorful PCB picture again. There is 12 GDDR6(X) chips on back side + another 12 on front side... or do you thing that nV will place memory chips on back side only with consumer Ampere cards?

I guess I should say that it's not impossible they do 24 GB, just that something would have to give. Perhaps the 24 GB models would use better-binned chips.

It's been a while since Nvidia offered a memory choice at the high end, but perhaps this is the case.
 

Tup3x

Golden Member
Dec 31, 2016
The 3090 appears to be replacing the Titan, which currently sells for $3000. So you could look at it as a 33% price cut :p
I can imagine Mr. Leather Jacket preaching what an awesome development that is: "Titan for $1000 less than before."

That being said, I kind of doubt it will cost that much. €1,399 would be my bet. (But in practice... and custom cards...)
 

maddie

Diamond Member
Jul 18, 2010
Interesting. That basically means the memory subsystem alone draws 60W per 12 memory chips.

If those GPUs have 24 memory chips, that means 120W of power draw from the memory subsystem alone.

o_O
Assuming 300-350W total, is 120W for memory so bad? That's 35-40%. I see nothing outrageous here.
 

ozzy702

Golden Member
Nov 1, 2011
Interesting. That basically means the memory subsystem alone draws 60W per 12 memory chips.

If those GPUs have 24 memory chips, that means 120W of power draw from the memory subsystem alone.

o_O

I wonder what the power difference would be with slower memory on a 512-bit bus.
 

MrTeal

Diamond Member
Dec 7, 2003
Memory power scaling doesn't work like that; just having twice as much VRAM won't double power usage when you're keeping bandwidth constant. You're still running 19Gbps on a 384-bit bus. It'll use more power than an 11/12GB card, but not twice as much.
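A toy model of that point: treat I/O power as proportional to total bandwidth (bus width times data rate), with only a smaller static share per chip. All coefficients here are invented for illustration; only the shape of the result matters:

```python
def mem_power_w(chips, bus_bits=384, gbps=19,
                io_mw_per_gbit=7.0, static_w_per_chip=1.0):
    """Rough memory power model: shared I/O term + per-chip static term."""
    io_w = bus_bits * gbps * io_mw_per_gbit / 1000  # scales with bandwidth
    static_w = chips * static_w_per_chip            # scales with chip count
    return io_w + static_w

single = mem_power_w(12)   # 12-chip config, 384-bit, 19 Gbps
double = mem_power_w(24)   # 24-chip clamshell config, same bus and data rate
print(double / single)     # well under 2.0: doubling chips != doubling power
```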
 