
Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 186

Mopetar

Diamond Member
Jan 31, 2011
4,850
1,141
136
Meanwhile, the 4K/Ultra specs have taken a major hit, upping the CPU requirements from Intel Core i7 9700K to i9 9900K and the GPU requirements from NVIDIA's Turing-based GeForce RTX 2080Ti to the brand new, almost-top-of-the-line Ampere-based GeForce RTX 3080 graphics card. Additionally, Ubisoft recommends setting DLSS to 'Performance Mode', which means Watch Dogs Legion will be effectively upscaled from 1080p to 2160p.
That's with RT turned on though. If you don't care about it, the 9700K is still apparently good enough, as is the 2080 Ti.

I think it just shows that ray tracing is still in its infancy, and that it will take another decade before it has matured to the point where it can be used in mainstream cards and titles without requiring workarounds like DLSS to mitigate the performance hit.
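For context on what "Performance Mode" means in practice: DLSS renders the game internally at a fraction of the output resolution and upscales the result. The per-axis scale factors below are the commonly cited ones for Nvidia's presets (Performance is exactly half per axis, i.e. a quarter of the pixels); a minimal sketch, treating the non-Performance ratios as approximate:

```python
# Approximate per-axis render-scale factors for DLSS presets.
# Performance is exactly 0.5; the others are commonly cited values.
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_resolution(width, height, mode):
    """Return the internal resolution DLSS renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# 4K output in Performance Mode renders internally at 1080p:
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

So "upscaled from 1080p to 2160p" follows directly from the 0.5 per-axis factor: half the width, half the height, one quarter of the pixels shaded.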
 

guidryp

Senior member
Apr 3, 2006
444
292
136
TechPowerUp's opinion injected into this news report is pretty top-tier cancer.

A fluent Chinese reader would be helpful. I read the original source with machine translation, and the phrase used was "It has also been reported", which may be a restatement of previous reports, and not an affirmation.
Digitimes (original source) itself has an extremely spotty record.

There really is no logic to this rumor.

NVidia didn't blindly choose Samsung. They knew what they were getting into. They made an extremely considered business decision, based on some combination of performance and cost, to build these parts on Samsung's process, with a full understanding of the outcomes of using either process.

Once that decision is made and you are shipping parts, it would be Keystone Cops-level incompetence to spend tens of millions of dollars moving the designs to another fab. Some may like to believe in that incompetence, but there is scant evidence that it exists.

There won't be a quick respin of these parts on a process they already decided against.
 

GodisanAtheist

Platinum Member
Nov 16, 2006
2,418
846
136
Digitimes (original source) itself has an extremely spotty record.

There really is no logic to this rumor.

NVidia didn't blindly choose Samsung. They knew what they were getting into. They made an extremely considered business decision, based on some combination of performance and cost, to build these parts on Samsung's process, with a full understanding of the outcomes of using either process.

Once that decision is made and you are shipping parts, it would be Keystone Cops-level incompetence to spend tens of millions of dollars moving the designs to another fab. Some may like to believe in that incompetence, but there is scant evidence that it exists.

There won't be a quick respin of these parts on a process they already decided against.

- I see your reasoning, but to play devil's advocate, NV likely decided against the process (N7) for reasons other than engineering concerns. A lot of the hearsay around NV's move to Samsung was that they were getting an extremely sweet deal on the cost of wafers and that the per-chip cost was incredible.

It's possible Samsung, in a quest to land an AAA-tier silicon design firm at its foundry, made a lot of promises about volume production and scaling up design goals that it ultimately missed, and for NV that meant launching on 8nm or not launching at all. Looking at the pickle Intel has found itself in with 10nm, I wouldn't immediately discount roadmap/production issues showing up much later in the game than anticipated.

NV could be in a position at the moment where Samsung is costing them money not at the per-wafer or per-die level, but as an opportunity cost for literally not being able to stock the channel with enough functional dies to meet demand. When AMD launches, even if they're not perfectly competitive with NV, gamers may end up going with their second choice because their first is nowhere to be seen.

If that cost is high enough, I can see NV porting their dies over to TSMC. They've already worked with N7 on their top-tier Ampere part, which must on some level reduce the engineering challenge of porting other parts over.

Perhaps the truth lies somewhere in between, and NV ends up dual-sourcing N7 and 8nm parts to meet demand, with buyers just playing the silicon lottery on what they get.
 

guidryp

Senior member
Apr 3, 2006
444
292
136
It's possible Samsung, in a quest to land an AAA-tier silicon design firm at its foundry, made a lot of promises about volume production and scaling up design goals that it ultimately missed, and for NV that meant launching on 8nm or not launching at all. Looking at the pickle Intel has found itself in with 10nm, I wouldn't immediately discount roadmap/production issues showing up much later in the game than anticipated.
Sure, it's "possible", but it's not very likely. It's a lot more likely that the rumor is just nonsense as most are.

Jensen recently stated yields and production were very good on 3000 cards. As CEO, he really can't lie about this stuff without risking lawsuits, and/or SEC investigations.

Edit:
Right after Turing launched, people were touting a story that NVidia would be releasing TSMC 7nm GPUs in 2019, based on info from Digitimes:
 
Last edited:

GodisanAtheist

Platinum Member
Nov 16, 2006
2,418
846
136
Sure, it's "possible", but it's not very likely. It's a lot more likely that the rumor is just nonsense as most are.

Jensen recently stated yields and production were very good on 3000 cards. As CEO, he really can't lie about this stuff without risking lawsuits, and/or SEC investigations.

Edit:
Right after Turing launched, people were touting a story that NVidia would be releasing TSMC 7nm GPUs in 2019, based on info from Digitimes:
- The rumor is almost certainly nonsense; NV would just accelerate into Hopper, or whatever their next-gen release is, rather than piddle about with Ampere.

But as to the bolded statement above, I've got a healthy level of skepticism about any qualitative statement like "very good". It's like saying you look "very good" compared to a decaying pile of elephant dung; that's still not really a good thing :p
 
  • Like
Reactions: uzzi38 and Mopetar

CakeMonster

Senior member
Nov 22, 2012
955
68
91
I wonder what kind of performance uplift TSMC's 7nm would bring. 10-15% better clocks with lower power consumption? Assuming 20GB versions, I'd absolutely be willing to hold off for that, I wonder how far out we're talking though.
All bets are off 12+ months into the future, and history would suggest new models by then. 2019 is the anomaly: it went without a new top card, which before then had arrived every year with a 20-30% performance increase.

But we could also see a two-year wait, like with Turing, with only minor releases down the stack, like bumping VRAM but little else.

However, I am quite puzzled by everyone who appears so convinced by every rumor of better NV cards just after the 3xxx release, or even before it, that they are willing to wait for something they have no idea is even real. I needed the performance now, and I'm currently playing games on a 3xxx card at proper performance that I didn't want to hold off on for 12 months or possibly more. I'm not telling anyone to buy a card, but sometimes, with incomplete info, the best bet is what you know will get you your target performance right now.
 
  • Like
Reactions: VirtualLarry

Midwayman

Diamond Member
Jan 28, 2000
5,614
248
106
However, I am quite puzzled by everyone who appears so convinced by every rumor of better NV cards just after the 3xxx release, or even before it, that they are willing to wait for something they have no idea is even real.
Some people just can't accept team green might not be the leader this gen.
 
  • Like
Reactions: Martimus

sze5003

Lifer
Aug 18, 2012
13,025
251
126
All bets are off 12+ months into the future, and history would suggest new models by then. 2019 is the anomaly: it went without a new top card, which before then had arrived every year with a 20-30% performance increase.

But we could also see a two-year wait, like with Turing, with only minor releases down the stack, like bumping VRAM but little else.

However, I am quite puzzled by everyone who appears so convinced by every rumor of better NV cards just after the 3xxx release, or even before it, that they are willing to wait for something they have no idea is even real. I needed the performance now, and I'm currently playing games on a 3xxx card at proper performance that I didn't want to hold off on for 12 months or possibly more. I'm not telling anyone to buy a card, but sometimes, with incomplete info, the best bet is what you know will get you your target performance right now.
Sure, buy what you want now if you aren't happy with your performance. Most people, though, I'm willing to bet, have a card they are happy with for the time being.

I loved the ability to upgrade safely every 1.5 years or so, one component or two here and there, but lately the advice has always been to wait and see what happens until the full picture is revealed.

I just want to see what AMD comes out with and what response, if any, Nvidia will provide. Nothing wrong with waiting for that, is there? Chances are, if you have a G-Sync monitor you have to go with Nvidia, but like I said, it's easier to make a decision when you have a better picture.
 

ozzy702

Golden Member
Nov 1, 2011
1,037
427
136
Some people just can't accept team green might not be the leader this gen.
Some people have higher-end G-Sync monitors and don't want to buy a new monitor until high-refresh, quality 4K+ monitors are more reasonably priced. Several people in this thread, myself included, fall into this category. I was considering AMD until I remembered that fact, shook my fist at the sky, yelled "damn you NVIDIA", and decided to sit tight and see what comes. I have zero brand loyalty and have purchased far more AMD GPUs than NVIDIA over the years. It's not as simple as fanboyism. I may still pick up a 3080 if they ever become available, but if waiting for spring gets me more memory and maybe even a faster GPU...
 

guidryp

Senior member
Apr 3, 2006
444
292
136
All bets are off for 12+ months in the future, and history would suggest new models then. 2019 is the anomaly that went without a new top card, which had arrived every year with 20-30% performance increase before then.

But we could also see a 2 year wait like with Turing, with only minor releases down the stack like bumping VRAM but little else.
Pascal lasted more than 2 years. Two years is more the norm these days.
 

Ajay

Diamond Member
Jan 8, 2001
7,408
2,590
136
Sure, it's "possible", but it's not very likely. It's a lot more likely that the rumor is just nonsense as most are.

Jensen recently stated yields and production were very good on 3000 cards. As CEO, he really can't lie about this stuff without risking lawsuits, and/or SEC investigations.

Edit:
Right after Turing launched, people were touting a story that NVidia would be releasing TSMC 7nm GPUs in 2019, based on info from Digitimes:
LOL, and that article was based on a Digitimes 'report' as well. Personally, I think Nvidia would be better off trying to accelerate their work on Hopper @ 5nm.
Switching to 7nm TSMC would be almost as expensive as a new design (although, 5nm wafers are going to be more expensive).

The only possible 'shortcut' would be if SS 7N EUV uses design rules compatible with SS 8N. But that decision would have to have been made already, and I think 7N yields are still too low for NV.
 

Kuiva maa

Member
May 1, 2014
141
120
116
I wonder what kind of performance uplift TSMC's 7nm would bring. 10-15% better clocks with lower power consumption? Assuming 20GB versions, I'd absolutely be willing to hold off for that, I wonder how far out we're talking though.
Whatever it is, it will be good enough to refresh the whole range, so I expect Nvidia (if the timeline allows it) to bring these to market around September next year. They may name them 3000 Super or even 4000.
 

Midwayman

Diamond Member
Jan 28, 2000
5,614
248
106
Some people have higher-end G-Sync monitors and don't want to buy a new monitor until high-refresh, quality 4K+ monitors are more reasonably priced. Several people in this thread, myself included, fall into this category. I was considering AMD until I remembered that fact, shook my fist at the sky, yelled "damn you NVIDIA", and decided to sit tight and see what comes. I have zero brand loyalty and have purchased far more AMD GPUs than NVIDIA over the years. It's not as simple as fanboyism. I may still pick up a 3080 if they ever become available, but if waiting for spring gets me more memory and maybe even a faster GPU...
Sure, there are reasons to choose Nvidia; CUDA alone is a big one. I'm not trying to bag on Nvidia by any means. That was about people having what seems to be an unreasonable assumption that Nvidia has some trump card they are waiting to play until after AMD releases their cards. Who knows what will happen 6 months out; all we know is there appears to be zero headroom to respond right now. Dropping prices looks to be it. I'm just happy there is competition, no matter which card people ultimately pick.
 

PhoBoChai

Member
Oct 10, 2017
64
158
76
- The rumor is almost certainly nonsense; NV would just accelerate into Hopper, or whatever their next-gen release is, rather than piddle about with Ampere.

But as to the bolded statement above, I've got a healthy level of skepticism about any qualitative statement like "very good". It's like saying you look "very good" compared to a decaying pile of elephant dung; that's still not really a good thing :p
What if Hopper isn't for gaming?
 

Stuka87

Diamond Member
Dec 10, 2010
5,115
848
126
Some people have higher end G-sync monitors and don't want to buy a new monitor until high hz quality 4k+ monitors are more reasonably priced. Several people in this thread, including myself fall into this category. I was considering AMD until I remembered that fact, shook my fist at the sky, yelled "damn you NVIDIA" and decided to sit tight and see what comes. I have zero brand loyalty and have purchased far more AMD GPUs than NVIDIA over the years. It's not as simple as fanboism. I may still pickup a 3080 if they ever become available, but if waiting for spring gets me more memory and maybe even a faster GPU...
G-Sync really only comes into play if you are falling below whatever your refresh rate is, so an AMD card would still function, just without the better frame handling. Unless the display is actually a G-Sync-labeled FreeSync display; then you are still good.

But yeah, vendor lock-in sucks for the exact reasons you note.
 

ozzy702

Golden Member
Nov 1, 2011
1,037
427
136
G-Sync really only comes into play if you are falling below whatever your refresh rate is, so an AMD card would still function, just without the better frame handling. Unless the display is actually a G-Sync-labeled FreeSync display; then you are still good.

But yeah, vendor lock-in sucks for the exact reasons you note.
Yeah, four-year-old 1440p 144Hz IPS G-Sync monitor here, and yeah, I suppose if the AMD GPU had enough grunt and I adjusted settings properly I might not miss adaptive sync, but I'm not sure I want to deal with it.

Thankfully the market has moved past the proprietary nonsense, and I'm excited to hopefully see AMD back in the game.
 

gorobei

Diamond Member
Jan 7, 2007
3,089
236
106
However, I am quite puzzled by everyone who appears so convinced by every rumor of better NV cards just after the 3xxx release, or even before it, that they are willing to wait for something they have no idea is even real. I needed the performance now, and I'm currently playing games on a 3xxx card at proper performance that I didn't want to hold off on for 12 months or possibly more. I'm not telling anyone to buy a card, but sometimes, with incomplete info, the best bet is what you know will get you your target performance right now.
Sure, there are reasons to choose Nvidia. I mean CUDA alone is a big reason. I'm not trying to bag on nvidia by any means. That was about people having what seems to be an unreasonable assumption that Nvidia has some trump card they are waiting to play until after AMD releases their cards. Who knows what will happen 6 months out, all we know is there appears to be zero headroom to respond right now. Dropping prices looks to be it. I'm just happy there is competition no matter which card people ultimately pick.
Daniel Nenni of SemiWiki was on the Moore's Law Is Dead podcast a few weeks ago. He says that due to the nature of TSMC advanced-node contracts, any project members who work on a node can't work on the same design with another fab. So the Ampere team that worked with Samsung can't move over to TSMC 7nm. The datacenter Ampere design team is probably on TSMC, given the rumors that were going around, but NV would have to assemble a new team to fab consumer Ampere at TSMC. A new team, new masks, and no massive Samsung discount on wafers would make the costs of a TSMC Ampere too expensive.
The Samsung node is still rough, so there should be some performance improvements coming, but nothing massive.
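The wafer-cost argument running through this page can be made concrete with a back-of-the-envelope per-good-die estimate. All inputs below (wafer prices, defect densities) are hypothetical placeholders, not reported figures; the die size is roughly GA102-class (~628 mm²), and the yield model is the standard Poisson approximation:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross-die estimate; ignores scribe lines and edge-exclusion detail."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost, die_area_mm2, defect_density_per_cm2):
    """Poisson yield model: Y = exp(-D0 * A), with A converted to cm^2."""
    yield_frac = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)
    good_dies = dies_per_wafer(die_area_mm2) * yield_frac
    return wafer_cost / good_dies

# Entirely hypothetical inputs: a discounted Samsung 8nm wafer
# versus a pricier TSMC N7 wafer, for a ~628 mm^2 die.
print(cost_per_good_die(6000, 628, 0.10))  # Samsung 8nm guess
print(cost_per_good_die(9000, 628, 0.09))  # TSMC N7 guess
```

With these made-up numbers the cheaper wafer wins per die even at a worse defect density, which is the shape of the argument above: the discount has to be large, and the port costs (new team, new masks) come on top of the per-wafer difference.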
 

ModEl4

Member
Oct 14, 2019
37
20
41
Another Digitimes rumor?🙄
I want to start a new one: 3080 20GB will be a 1730MHz 288TC part with 16Gbps GDDR6 at $799🤪
 

n0x1ous

Platinum Member
Sep 9, 2010
2,518
174
106
Really no need for HDR1000 for a display you sit 2-3ft from. That level of brightness is really only required when using a TV and you are 10-15ft back from it.
Need? Perhaps not. But I can attest from experience that it's awesome.
 
