Question 'Ampere'/Next-gen gaming uarch speculation thread

Ottonomous

Senior member
May 15, 2014
559
292
136
How much gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three cards?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of more than 4K60, at least 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, just interested in the forum members' thoughts.
 

marcUK2

Member
Sep 23, 2019
81
68
91
I've just realized what is going on with this strange oversized 3090 and its odd cooling setup...

There is an Ampere chip on BOTH sides of the card in some kind of DUAL GPU SLI/NVLink one card setup. :)

Otherwise, it suggests Nvidia is running scared of RDNA2 and has OCed the bejesus out of this.

Hoping for the 1st scenario; I have cash waiting for it!
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
I've just realized what is going on with this strange oversized 3090 and its odd cooling setup...

Nope... Probably Jen-Hsun Huang's e-peen couldn't hang with all the "Big Navi" talk spreading all over the internet. They did the BIG Huge Ampere to satisfy his ego. Depending on the outcome he can still at least claim his is bigger...
 
  • Haha
Reactions: Saylick

Krteq

Senior member
May 22, 2015
993
672
136
I've just realized what is going on with this strange oversized 3090 and its odd cooling setup...

There is an Ampere chip on BOTH sides of the card in some kind of DUAL GPU SLI/NVLink one card setup. :)

Otherwise, it suggests Nvidia is running scared of RDNA2 and has OCed the bejesus out of this.

Hoping for the 1st scenario; I have cash waiting for it!
There is no dual-GPU card in the Ampere consumer lineup.
 
  • Like
Reactions: lobz

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
What if it's both, caused by Nvidia's huge mistake: underestimating AMD's silicon physical design capabilities? ;)

Sure, "underestimated". Makes no sense. Even a GP107 produced on Samsung's 14nm LPP is as efficient as Navi.
And yet we should believe that nVidia can't produce something better than Pascal.

Ampere will have >6 billion more transistors than Navi. It will be hard for AMD to even come close to GA102. The competition is the chip below it.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Sure, "underestimated". Makes no sense. Even a GP107 produced on Samsung's 14nm LPP is as efficient as Navi.
And yet we should believe that nVidia can't produce something better than Pascal.

Ampere will have >6 billion more transistors than Navi. It will be hard for AMD to even come close to GA102. The competition is the chip below it.
If your post is any indication of how Nvidia's engineers thought, then yes, it makes perfect sense: they underestimated AMD's capabilities and talent ;).

Arrogance is the fastest way to make mistakes ;).
 

CakeMonster

Golden Member
Nov 22, 2012
1,502
659
136
I'm a bit torn. I love that the products are pushing the envelope, but if the rumors are true about the 3080 being 20-30% faster than the 2080 Ti, and the 3090 being 50-60% faster than the 2080 Ti, I think I'd have preferred to buy the nonexistent 3090 'lite' that was normal-sized and averaged 45-50% over the 2080 Ti.
 

CP5670

Diamond Member
Jun 24, 2004
5,539
613
126
I wonder if the 3080 will have more reasonable power/heat requirements. That card seems more interesting at this point. My 850W power supply doesn't even have enough connectors for both this new plug and the extra EPS 4-pin CPU plug on current motherboards, although the extra CPU plug is not really necessary and can be left unplugged.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
The 3080 uses the leaked two-slot cooler. It will use less power. And going by the Time Spy Extreme score, it should be >30% faster than a 2080 Ti and >70% faster than a 2080.
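A quick sanity check that those two figures hang together (a back-of-the-envelope sketch; the 2080 Ti-to-2080 ratio below is an assumed value from typical review aggregates, not an official number):

```python
# If the 3080 is ~1.30x a 2080 Ti, and a 2080 Ti is ~1.30x a 2080,
# the two rumored claims are mutually consistent.
ti_over_2080 = 1.30           # assumed: 2080 Ti vs 2080 ratio
claimed_3080_over_ti = 1.30   # the ">30% faster than a 2080 Ti" claim

implied_3080_over_2080 = claimed_3080_over_ti * ti_over_2080
print(f"Implied 3080 vs 2080: {implied_3080_over_2080:.2f}x")  # ~1.69x, i.e. ~70% faster
```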
 

Konan

Senior member
Jul 28, 2017
360
291
106
I'm a bit torn. I love that the products are pushing the envelope, but if the rumors are true about the 3080 being 20-30% faster than the 2080 Ti, and the 3090 being 50-60% faster than the 2080 Ti, I think I'd have preferred to buy the nonexistent 3090 'lite' that was normal-sized and averaged 45-50% over the 2080 Ti.

I think I'm with you on this one. There is definitely a pricing-segment gap between the top two cards announced so far. To me it's obvious there's a slot reserved in between the two for a later release, but it will depend on what happens with Big Navi (apparently one of the competitor's new cards will roughly equal a 2080 Ti, and another will beat it by just a little; two SKUs).
If NV needs to use that space, it's probably a new-calendar-year launch.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
I wonder if the 3080 will have more reasonable power/heat requirements. That card seems more interesting at this point. My 850W power supply doesn't even have enough connectors for both this new plug and the extra EPS 4-pin CPU plug on current motherboards, although the extra CPU plug is not really necessary and can be left unplugged.
According to rumors, which have so far turned out to be true, the RTX 3080 is going to have a TDP of around 320W.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,166
7,666
136
At this point we don't just have leaks but an outright gusher of info. Makes me think NV's marketing team is softening the landscape before their ultimate assault on the 1st.

I suspect a lot of the new info we've gotten over the last few days (especially the engineering-sample 3090) will show up in a more refined form at a not-$2000 price tag, so it looks like a better product/steal.

As for the 12-pin power connector: get it out there and get people used to the idea so it doesn't come as a huge shock and go full meme on day one.
 
  • Like
Reactions: FaaR

MrTeal

Diamond Member
Dec 7, 2003
3,614
1,816
136
The nice thing about how Nvidia is approaching it is that they're using a standard 12-pin Micro-Fit connector and not a custom one like the PCIe power connectors. If you want to roll your own cables, the housing and pins would be $2 even in tiny quantities.
 
  • Like
Reactions: Konan and xpea

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,894
3,247
126

The 12-pin is not bad; it looks like PSU makers are on board already.
The other question is what Ampere will use all that power for.

Sigh, this is totally not needed...
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pins?

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.
 
  • Like
Reactions: blckgrffn

MrTeal

Diamond Member
Dec 7, 2003
3,614
1,816
136
Sigh, this is totally not needed...
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pins?

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.
The whole 6+2-pin cable mess is a PITA, though. If you set aside the pain of the transition, having one solid 12-pin connector will be so much nicer for a high-power GPU than two 8-pins, especially if you're trying to keep the 6+2 pins together as you bend the cable around and try to plug it into a tight space.

The ATX spec's 6- and 8-pin PCIe connectors are just a legacy waste of space and should have been revisited years ago.
 
  • Like
Reactions: ozzy702 and Tup3x

alcoholbob

Diamond Member
May 24, 2005
6,317
366
126
I'm a bit torn. I love that the products are pushing the envelope, but if the rumors are true about the 3080 being 20-30% faster than the 2080 Ti, and the 3090 being 50-60% faster than the 2080 Ti, I think I'd have preferred to buy the nonexistent 3090 'lite' that was normal-sized and averaged 45-50% over the 2080 Ti.

Assuming you are OK with a 400W GPU and not running an SFF PC...
 

Konan

Senior member
Jul 28, 2017
360
291
106
Seems like some future planning and legacy de-tanglement. The 12-pin interface can replace the previous 6+6-pin, 6+8-pin, 8+8-pin, 6+6+8-pin, 6+8+8-pin, etc. combinations, and any mix thereof (see the sketch below). It would totally simplify wiring and save space in your case.
 
  • Like
Reactions: ozzy702
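To put rough numbers on the combinations Konan lists above (a sketch using the PCIe spec ratings of 75 W per 6-pin and 150 W per 8-pin, plus up to 75 W through the slot; the single 12-pin's exact rating is unconfirmed, so it's left as a comment):

```python
# Rated cable power under the PCIe spec: 6-pin = 75 W, 8-pin = 150 W.
# A card can also draw up to 75 W through the PCIe slot itself.
PIN6, PIN8, SLOT = 75, 150, 75

combos = {
    "6+6":   2 * PIN6,
    "6+8":   PIN6 + PIN8,
    "8+8":   2 * PIN8,
    "6+6+8": 2 * PIN6 + PIN8,
    "6+8+8": PIN6 + 2 * PIN8,
}

for name, cable_w in combos.items():
    print(f"{name:>6}: {cable_w} W from cables, {cable_w + SLOT} W with slot power")

# A single 12-pin would replace every row above with one cable; its exact
# rating is unconfirmed, but it's rumored to exceed the 8+8 aggregate.
```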

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Sigh, this is totally not needed...
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pins?

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.

This is partially being driven by the new 12VO power supply standard. And I, for one, would much prefer to just have a one-size-fits-all GPU connector. No single 6-pin, dual 6-pin, 6+8, single 8, dual 8; not to mention that currently you have to worry about which rail each connector is coming from on some lower-end PSUs.

With this connector you just plug it in, and then you only need to worry about the wattage of the PSU. If the card requires a 650W unit, make sure you have that and you're good.
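In other words, the buying decision collapses to a single check; a trivial illustrative sketch (the function name and the 650W figure are hypothetical, not from any spec):

```python
def psu_is_sufficient(psu_watts: int, required_watts: int) -> bool:
    """One-size-fits-all rule: ignore connector permutations and rails,
    compare total wattage only."""
    return psu_watts >= required_watts

# Hypothetical card whose box says "650W PSU required":
print(psu_is_sufficient(psu_watts=850, required_watts=650))  # True
print(psu_is_sufficient(psu_watts=550, required_watts=650))  # False
```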
 

Konan

Senior member
Jul 28, 2017
360
291
106
Who cares, guys?

It's only a connector, after all. Nothing really important.
Well, it would suck if you're after an FE card and you don't have a modular PSU.

At this point, for me personally, I'm only interested in an FE card depending on the outcome of the cooling solution. Other than that, I don't really want to spend $100 extra for the FE tax.
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
Sigh, this is totally not needed...
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pins?

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.
The wrong assumption is that this equals 2x 8-pin in power delivery. This connector is meant to connect directly to the PSU and can deliver more than the 2x 8-pin aggregate. The fact that they recommend an 850W or greater PSU is a clue.

It's a good thing that we now have mostly single-rail 12V PSUs. I can see the damage that would be caused by plugging into the wrong 8-pin sockets on a dual-rail PSU and overloading one of the rails.
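As rough arithmetic, the overload scenario being described (a sketch; the 20 A per-rail limit is an assumed figure for an older dual-rail unit, and the 350 W cable draw is hypothetical):

```python
# Dual-rail PSUs put a separate over-current protection (OCP) limit on
# each 12V rail. If both of a card's 8-pin plugs land on sockets fed by
# the same rail, the entire cable load hits one limit.
RAIL_LIMIT_A = 20.0   # assumed OCP limit per 12V rail on an older unit
card_cable_w = 350    # hypothetical draw through the power cables alone

amps_on_one_rail = card_cable_w / 12
print(f"{amps_on_one_rail:.1f} A on one rail")  # ~29.2 A
print("OCP trip likely" if amps_on_one_rail > RAIL_LIMIT_A else "within limit")
# A single-rail PSU, or the 12-pin running straight to the PSU, removes
# the guesswork about which socket feeds which rail.
```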
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
If there's an adapter included (which I think Nvidia will do), it doesn't really matter if your PSU is modular or not.
It's not what you think. If it were, the 8-pin ends would be female. As can be seen in the product photos, they're male, meaning the cable is meant to be inserted directly into the PSU.