
'Ampere'/Next-gen gaming uarch speculation thread

Page 117

Ottonomous

Senior member
How much gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 
Still feels a bit lacking, though, for a brand-new, extremely expensive flagship chip. Yeah, HDMI 2.1 will probably carry over and support the same variable frame rate technologies etc., but IIRC DP 2.0 has higher bandwidth.

I do have a feeling that the monitor producers will unleash a torrent of new models once DP 2.0 and HDMI 2.1 are ready, because right now they're stuck in a loop of meaninglessly refreshing 1440p 144Hz monitors, which is quickly drying up excitement. And in the lifetime of the 3xxx chips, 4K to 8K, probably some creative options in between, and different aspect ratios will likely be introduced. Being stuck on DP 1.4a for all that is still disappointing.
 
Yes, DP 2.0 would be good. But unlike with DP 1.4 and HDMI 2.0b, most use cases are now covered by DP 1.4a (with DSC) and HDMI 2.1. And I don't really think that 4K@120Hz with HDR is a problem DP needs to solve...
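For what it's worth, the back-of-envelope numbers support this. A quick sketch comparing the uncompressed data rate of 4K@120Hz 10-bit RGB against the usable payload of each link (payload figures are the standard post-line-coding rates; the 1.1x blanking factor is my rough assumption for a reduced-blanking timing):

```python
# Does 4K@120Hz 10-bit RGB fit each link's usable payload?
# Payload rates: DP 1.4a HBR3 after 8b/10b, HDMI 2.1 FRL after 16b/18b,
# DP 2.0 UHBR20 after 128b/132b. Blanking overhead of ~10% is an assumption.
def video_gbps(w, h, hz, bpc, blanking=1.10):
    """Uncompressed RGB data rate in Gb/s, including blanking overhead."""
    return w * h * hz * bpc * 3 * blanking / 1e9

links_gbps = {
    "DP 1.4a (HBR3)":  25.92,
    "HDMI 2.1 (FRL)":  42.67,
    "DP 2.0 (UHBR20)": 77.37,
}

need = video_gbps(3840, 2160, 120, 10)  # ~32.8 Gb/s
for name, cap in links_gbps.items():
    verdict = "OK uncompressed" if cap >= need else "needs DSC"
    print(f"{name}: need {need:.1f} Gb/s, have {cap:.2f} Gb/s -> {verdict}")
```

So 4K120 HDR overshoots DP 1.4a's ~25.9 Gb/s payload (hence DSC), while HDMI 2.1 carries it uncompressed, which is why the use cases feel covered even without DP 2.0.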
 
Not if nVidia decided to call SS8LPU "7 nm".
I'd say that's unlikely unless there was some merit to it. However, didn't Huang say TSMC will be producing all the 7nm orders? They fabbed the A100 there, so I would imagine that is also the 7nm process they are using for GA.

Using TSMC could also partly explain the price increase, it being such a large chip.

 
I'd say that's unlikely unless there was some merit to it. However didn't Huang say TSMC will be producing all the 7nm orders?

He said most. Which would still be accurate at the time he spoke if marketing decided later to call the process "7 nm".

Edit: I would expect it to be even denser & use much less power if it was on an actual 7 nm node from either foundry.
 
He said most. Which would still be accurate at the time he spoke if marketing decided later to call the process "7 nm".
Hmm, ok, well I guess another possibility could be that they helped develop a 7nm non-EUV process with Samsung, like they did with the 12nm process at TSMC. So a further refinement of SS8.

 
Still feels a bit lacking, though, for a brand-new, extremely expensive flagship chip. Yeah, HDMI 2.1 will probably carry over and support the same variable frame rate technologies etc., but IIRC DP 2.0 has higher bandwidth.

I do have a feeling that the monitor producers will unleash a torrent of new models once DP 2.0 and HDMI 2.1 are ready, because right now they're stuck in a loop of meaninglessly refreshing 1440p 144Hz monitors, which is quickly drying up excitement. And in the lifetime of the 3xxx chips, 4K to 8K, probably some creative options in between, and different aspect ratios will likely be introduced. Being stuck on DP 1.4a for all that is still disappointing.

The LG OLED TVs already support HDMI 2.1 with 4K/120Hz/VRR. Best gaming monitors out there in my opinion, although mine is a slightly older model that doesn't have HDMI 2.1. I want to upgrade both the TV and the video card mainly for this. I think some of the other TVs have it now too.
 
He said most. Which would still be accurate at the time he spoke if marketing decided later to call the process "7 nm".

Edit: I would expect it to be even denser & use much less power if it was on an actual 7 nm node from either foundry.

If the rumored 34.5B transistor count is true for the 3090, then it would be far denser than Navi 10, and the likelihood of it using any 8nm variant would be nil.
 
If the rumored 34.5B transistor count is true for the 3090, then it would be far denser than Navi 10, and the likelihood of it using any 8nm variant would be nil.
If ~627mm² and 34.5B are true, then we have ~55MTr/mm² for GA102 and that is not a problem for 8nm Samsung.

[Attached image: samsung-density-14nm-10nm-8nm.png]
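For what it's worth, the arithmetic on those rumors checks out. A quick sketch using the figures quoted above (34.5e9 transistors on ~627 mm², both rumors from this thread) against Navi 10's published numbers (10.3e9 transistors, 251 mm²):

```python
# Transistor-density sanity check. GA102 inputs are thread rumors,
# not confirmed specs; Navi 10 figures are the published ones.
def density_mtr_mm2(transistors, area_mm2):
    """Density in millions of transistors per mm^2."""
    return transistors / area_mm2 / 1e6

ga102  = density_mtr_mm2(34.5e9, 627)  # ~55 MTr/mm^2
navi10 = density_mtr_mm2(10.3e9, 251)  # ~41 MTr/mm^2
print(f"GA102 (rumored): {ga102:.1f} MTr/mm^2")
print(f"Navi 10 (7nm TSMC): {navi10:.1f} MTr/mm^2")
```

So the rumored GA102 would indeed come out denser than Navi 10 on TSMC 7nm, which is the crux of the node debate here.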

 
If ~627mm² and 34.5B are true, then we have ~55MTr/mm² for GA102 and that is not a problem for 8nm Samsung.

samsung-density-14nm-10nm-8nm.png


I doubt it. That would only be achievable for smartphone SoCs, not 600mm²+ GPUs. High-end GPUs don't get close to the density limit of the node they use.
 
If the rumored 34.5B transistor count is true for the 3090, then it would be far denser than Navi 10, and the likelihood of it using any 8nm variant would be nil.

In the end, only the performance matters to those shopping for 3090s. It could use 1000W and sizzle bacon, fry eggs, etc., and they'd still justify the purchase. They'll buy a bigger case, replace a 6-month-old power supply, whatever it takes. When you've got more dollars than (common) cents, silly node names don't matter.

Most mere mortals shop the $300-$500 price point anyway.
 
Again, so much denial in this thread it's almost funny. Whether it's 7nm TSMC or Samsung, the fact that matters is that they are within striking distance of each other.
 
Again, so much denial in this thread it's almost funny. Whether it's 7nm TSMC or Samsung, the fact that matters is that they are within striking distance of each other.

We will find out in less than 3 days. But the fake 7nm theory is the funniest one I've seen so far.
 
I wonder if this is "real" HDMI 2.1? I mean, wouldn't Nvidia have to support VRR over HDMI? Does that mean my old 4K60 FreeSync monitors should work over HDMI with Ampere as well as over DP? I hate using DP cables, since they are more expensive and, in my experience with LG and Samsung monitors, have to be of a heavier gauge than HDMI for the same run at 4K60.

If I could switch from my DP port to HDMI on my LG 32UD99 and still keep G-Sync compatibility, that would help a lot, since I could finally run my PC from my equipment rack.
 
I wonder if this is "real" HDMI 2.1? I mean, wouldn't Nvidia have to support VRR over HDMI? Does that mean my old 4K60 FreeSync monitors should work over HDMI with Ampere as well as over DP? I hate using DP cables, since they are more expensive and, in my experience with LG and Samsung monitors, have to be of a heavier gauge than HDMI for the same run at 4K60.

If I could switch from my DP port to HDMI on my LG 32UD99 and still keep G-Sync compatibility, that would help a lot, since I could finally run my PC from my equipment rack.
HDMI FreeSync is AMD's own stuff. It will obviously support the HDMI VRR standard. Actually, Turing supports it already.
 
HDMI FreeSync is AMD's own stuff. It will obviously support the HDMI VRR standard. Actually, Turing supports it already.

Turing has only enabled it on LG C9 displays. There are TVs with HDMI 2.1 VRR chipsets (including other LGs) where VRR does not work with Turing. Right now Nvidia is only enabling it when they want to; it's not universal.
 
I don't think any of the 7 or 8 models support it. My E8 does not. The 9 models (C9 and up) do support it, and probably also the X ones from this year.
 
Oh I see your confusion. Can you find me more than a dozen people who got it for less than $500 in the first 6 months of the card's existence?
I did... € price with 24% VAT, so not directly comparable, but it was less than €500. It didn't cost that much back then. Mining fever did raise prices across the board later, though.
 