
'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed-up RTX' options at the top?)
Will the top card be capable of more than 60 fps at 4K, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 
I'm a bit torn. I love that the products are pushing the envelope, but if the rumors are true about the 3080 being 20-30% faster than the 2080 Ti, and the 3090 being 50-60% faster, I think I'd have preferred to buy the non-existent 3090 'lite' that was normal sized and averaged 45-50% over the 2080 Ti.
 
I wonder if the 3080 will have more reasonable power/heat requirements. That card seems more interesting at this point. My 850W power supply doesn't even have enough connectors for both this new plug and the extra 4-pin EPS CPU plug on current motherboards, although the extra CPU plug isn't really necessary and can be left unplugged.
 
The 3080 uses the leaked two-slot cooler. It will use less power. And going by the Time Spy Extreme score it should be >30% faster than a 2080 Ti and >70% faster than a 2080.
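Rough napkin math for anyone who wants to sanity-check those percentages. The scores in this little Python sketch are placeholders I picked to roughly match the rumor, not the actual leaked Time Spy Extreme results, so swap in whatever figures you trust:

Code:
# Quick relative-performance calculator. The scores are made-up placeholders
# chosen to roughly match the rumored gaps, not real benchmark results.
scores = {
    "RTX 2080": 5200,
    "RTX 2080 Ti": 6700,
    "RTX 3080 (rumored)": 8900,
}

baseline = "RTX 2080 Ti"
for card, score in scores.items():
    gain = (score / scores[baseline] - 1) * 100
    print(f"{card}: {score} -> {gain:+.0f}% vs {baseline}")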
 
I'm a bit torn. I love that the products are pushing the envelope, but if the rumors are true about the 3080 being 20-30% faster than the 2080 Ti, and the 3090 being 50-60% faster, I think I'd have preferred to buy the non-existent 3090 'lite' that was normal sized and averaged 45-50% over the 2080 Ti.

I think I'm with you on this one. There is definitely a pricing-segment gap between the top two cards that have been announced so far. To me it's obvious a slot is being reserved between the two for a later release, but it will depend on what happens with Big Navi (apparently one of the competitor's new cards will roughly equal a 2080 Ti and another will beat it by a little, i.e. two SKUs).
If NV needs to fill that space, it's probably a new-calendar-year launch.
 
I wonder if the 3080 will have more reasonable power/heat requirements. That card seems more interesting at this point. My 850W power supply doesn't even have enough connectors for both this new plug and the extra 4-pin EPS CPU plug on current motherboards, although the extra CPU plug isn't really necessary and can be left unplugged.
According to rumors, which have so far turned out to be true, the RTX 3080 is going to be around 320W TDP.
 
At this point we don't just have leaking but outright spurting of info. Makes me think NV's marketing team is softening the landscape before their ultimate assault on the 1st.

I suspect a lot of the new info we've gotten over the last few days (especially the engineering-sample 3090) will show up in a more refined form at a price tag that isn't $2000, so it looks like a better product/steal.

The 12-pin power connector: get it out there and get people used to the idea, so it doesn't come as a huge shock and turn into a meme on day one.
 
The nice thing about how Nvidia is approaching it is that they're using a standard 12-pin Micro-Fit and not a custom one like the PCIe power connectors. If you want to roll your own cables, the housing and pins would be about $2 even in tiny quantities.
 

The 12-pin is not bad; it looks like PSU makers are on board already.
The other question is what Ampere will use all that power for.

Sigh, this is totally not needed.
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pin.

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.
 
Sigh, this is totally not needed.
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pin.

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.
The whole 6+2-pin cable mess is a PITA though. If you set aside the pain of the transition, having one solid 12-pin connector will be so much nicer for a high-power GPU than two 8-pins, especially if you're trying to keep the 6+2 pins together as you bend the cable around and try to plug it into a tight space.

The ATX spec for 6 and 8 pin PCIe connectors is just a legacy waste of space and should have been revisited years ago.
 
I'm a bit torn, I love that the products are pushing the envelope, but if the rumors are true about the 3080 being 20-30% on the 2080Ti, and the 3090 being 50-60% on the 2080Ti, I think I'd have preferred to buy the non existent 3090 'lite' that was normal sized and averaged 45-50% on the 2080Ti.

Assuming you are OK with a 400W GPU and not running an SFF PC...
 
Seems like some future planning and legacy de-tanglement. The 12-pin interface can replace the previous 6+6-pin, 6+8-pin, 8+8-pin, 6+6+8-pin, 6+8+8-pin, etc. combinations. It would totally simplify wiring and save space in your case.
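To put rough numbers on that, here's a quick Python sketch comparing the spec'd wattage of those legacy combos against a ballpark figure for the 12-pin. The 12-pin number is just the often-quoted ~8.5 A-per-pin Micro-Fit rating multiplied out, not an official spec, and the variable names are my own:

Code:
# Ballpark power budgets for GPU aux-power connector combos.
# 6-pin and 8-pin figures are the PCIe spec limits; the 12-pin figure is a
# rough estimate from the commonly quoted ~8.5 A per-pin Micro-Fit rating
# (6 x 12 V pins), not an official spec number.
PCIE_6PIN_W = 75
PCIE_8PIN_W = 150
MICROFIT_12PIN_W = int(6 * 8.5 * 12)   # ~612 W theoretical

legacy_combos = {
    "6+6": 2 * PCIE_6PIN_W,
    "6+8": PCIE_6PIN_W + PCIE_8PIN_W,
    "8+8": 2 * PCIE_8PIN_W,
    "6+6+8": 2 * PCIE_6PIN_W + PCIE_8PIN_W,
    "6+8+8": PCIE_6PIN_W + 2 * PCIE_8PIN_W,
}

for combo, watts in legacy_combos.items():
    print(f"{combo:>6} pin: {watts:>4} W (spec limit)")
print(f"12-pin Micro-Fit: ~{MICROFIT_12PIN_W} W (per-pin estimate)")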
 
Sigh, this is totally not needed.
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pin.

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.

This is partially being driven by the new 12VO power supply standard. And I for one would much prefer to just have a one-size-fits-all GPU connector. No single 6-pin, dual 6-pin, 6+8, single 8, dual 8; not to mention that currently you have to worry about which rail each connector is coming from on some lower-end PSUs.

With this connector you just plug it in, and then you only need to worry about the wattage of the PSU. If the card requires a 650W unit, make sure you have that and you're good.
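That's really all the "compatibility check" would boil down to. A rough sketch of the idea below; the 320W GPU TDP is the rumored figure from this thread, while the CPU TDP, rest-of-system draw, headroom margin and the function name are just my own assumptions:

Code:
# Rough PSU-sizing sketch. The 320 W GPU TDP is the rumored figure from the
# thread; CPU TDP, "rest of system" draw and the 20% headroom are assumptions.
def psu_is_enough(psu_watts, gpu_tdp, cpu_tdp=125, rest_of_system=75,
                  headroom=0.20):
    """Very rough check: estimated peak draw plus a margin vs the PSU rating."""
    estimated_peak = gpu_tdp + cpu_tdp + rest_of_system
    return psu_watts >= estimated_peak * (1 + headroom)

for psu in (550, 650, 750, 850):
    print(f"{psu} W PSU with a 320 W GPU: {psu_is_enough(psu, gpu_tdp=320)}")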
 
Who cares, guys?

It's only a connector, after all. Nothing really important.
Well it would suck if you’re after an FE card and you don’t have a modular PSU.

At this point, personally, I would only be interested in an FE card depending on the outcome of the cooling solution. Other than that, I don't really want to spend $100 extra for the FE tax.
 
Sigh, this is totally not needed.
What is wrong with dual 8-pins?
Or are you telling me this will have dual 12-pin.

This just seems like Nvidia partnered with power supply companies so they could also make money.
It's bad enough that power supplies are hard as hell to find in today's market; adding a special part you need to get just makes things even more of a hassle.
The wrong assumption is that this equals 2x 8-pin in power delivery. This connector is meant to connect directly to the PSU and can deliver more than the 2x 8-pin aggregate. The fact that they recommend an 850W or greater PSU is a clue.

It's a good thing that we have mostly single-rail 12V PSUs now, as I can see the damage that could be caused by plugging into the wrong 8-pin sockets on the PSU and overloading one of the rails on a multi-rail unit.
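For anyone still on an older multi-rail unit, this is essentially the failure mode in a nutshell; the rail limits and loads in this little sketch are made-up illustration numbers, not any particular PSU's spec:

Code:
# Toy illustration of the multi-rail pitfall: both GPU cables on one 12 V rail.
# Rail limits and loads are made-up numbers, purely for illustration.
rail_limit_amps = {"12V1": 25, "12V2": 25}        # per-rail OCP limits
loads_amps = {
    "12V1": [13, 13],   # both 8-pin GPU cables plugged into 12V1 sockets
    "12V2": [5],        # CPU / motherboard on 12V2
}

for rail, loads in loads_amps.items():
    total = sum(loads)
    status = "OK" if total <= rail_limit_amps[rail] else "OVERLOAD (OCP trip)"
    print(f"{rail}: {total} A of {rail_limit_amps[rail]} A -> {status}")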
 
If there's an adapter included (which I think Nvidia will do) it doesn't really matter if your PSU is modular or not.
It's not what you think. If it were, the 8-pin ends would be female. As can be seen in the product photos, they're male, meaning they're meant to be inserted directly into the PSU.
 
Current modular PSUs typically have 1x 8-pin at the PSU going to 2x 8-pin PCIe at the video card. This Nvidia cable has 2x 8-pin at the PSU, suggesting it's comparable to 4x 8-pin PCIe. Some PSUs even support the latter, but not 2x 8-pin for the CPU at the same time.
 
It's not what you think. If it were, the 8-pin ends would be female. As can be seen in the product photos, they're male, meaning they're meant to be inserted directly into the PSU.
I was talking about an adapter for existing PSUs, not that one from Seasonic, which is a separate cable and not an adapter. Or let's call it an extender rather than an adapter, which I think will be included.
 
Well it would suck if you’re after an FE card and you don’t have a modular PSU.

At this point, personally, I would only be interested in an FE card depending on the outcome of the cooling solution. Other than that, I don't really want to spend $100 extra for the FE tax.

It would be hard to imagine that people shopping for this card wouldn't have a modular PSU.
 
They did the BIG Huge Ampere to satisfy his ego. Depending on the outcome he can still at least claim his is bigger....
The Founders Edition boxes even contain a little black leather jacket to dress up your graphics card in!

With this connector you just plug it in, and then you only need to worry about the wattage of the PSU.
What you say makes a lot of sense; the current system is a bit of a mess. Like, who enjoys fiddling with those finicky 6+2-pin GPU power connectors and trying to insert two separate connectors at the same time? Ugh. (Also: writing that sentence without it sounding too sexual in nature... lol)

That said, it should be a standardized thing, not an Nvidia exclusive, because we've had enough proprietary Nvidia crap by now. I still remember Huang whining about the proprietary nature of Glide, back when 3dfx was still the big dog in the 3D graphics card market; the taste of NV hypocrisy is beyond bitter at this point.
 