Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices while offering 'beefed-up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in forum members' thoughts.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance

Cheaper (?) node with apparently poor characteristics.
Unbalanced design with too many compute units relative to rest.
Supply side disaster.
Terrible memory options. Too much or too little, choose.

You forgot OEMs not actually being able to test any cards before launch, which led to crashing issues that had to be resolved with a driver change.

But many of these can be attributed to a VERY rushed launch. These cards were going to launch in spring according to everything we know. This rushed launch resulted in stability issues due to drivers, almost no supply, and too little memory (2GB modules aren't yet available).

One of the reasons nVidia's launches normally go so well is that they can basically release at their leisure. Except this year, when they have real competition. But it's looking like they were just way too early.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Quite a few of those aren't really things they've messed up, either. The supply side is, granted, rather 'amusing', but everything else is mostly fine/normal.

They got a normal/large generational improvement - and a worse process leaves more scope for future upticks - and the extra compute in the huge die is entirely rational given how many of them they'll sell for compute.

The memory situation is frankly a bit overblown and is mostly just a reflection of how things stand absent a caching solution like AMD's. There could certainly be nontrivial drawbacks to that too; we'll see over time and with testing.

AMD always used to compete very well; no surprise they're managing to again now that they've got the funding.
 
  • Like
Reactions: Leeea

Ajay

Lifer
Jan 8, 2001
15,406
7,833
136
As long as gamers still buy their GPUs in record numbers, why should they improve? It's a winning formula for NV to grow profits.
It was a winning formula, but will it continue? I think NV knew that they had a problem once SS 7EUV got botched. They needed to get something out and regroup; trading on their great mind share. Like Intel, they can be on their back foot this generation - so long as the follow up (Hopper) delivers, they won't lose much momentum. It took three years of stagnation before Intel's bottom line started to be affected. No doubt the more nimble culture at NV will adjust quickly. Interesting times for gamers anyway, and little short term downside for NV's financials.
 
  • Like
Reactions: maddie

reb0rn

Senior member
Dec 31, 2009
221
58
101
Under heavy memory load, ETH miners report the GDDR6X on the 3080 and 3090 overheating, with most cards going over 105C. The product is flawed: the cooler makes no direct contact with the memory, just a poor-quality 2mm thermal pad, and it's well known that GDDR6X draws a lot of power.

With nvidia-smi.exe -q (in Program Files/NVIDIA/..) you can see the memory temperature.

Nvidia made a bad product again: even the reference design has no direct cooler-to-memory contact, despite knowing from the spec that these memory chips draw far more power!
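If you want to script that check, here's a minimal Python sketch that parses the report printed by `nvidia-smi -q -d TEMPERATURE`. The `Memory Current Temp` field is an assumption - many GeForce drivers don't expose a memory sensor at all, in which case the function just returns None:

```python
import re

# Sample report in the format printed by `nvidia-smi -q -d TEMPERATURE`.
SAMPLE = """\
    Temperature
        GPU Current Temp                  : 55 C
        Memory Current Temp               : 104 C
"""

def memory_temp_c(report):
    """Return the 'Memory Current Temp' value in degrees C from an
    nvidia-smi temperature report, or None if the sensor isn't exposed."""
    m = re.search(r"Memory Current Temp\s*:\s*(\d+)\s*C", report)
    return int(m.group(1)) if m else None

print(memory_temp_c(SAMPLE))  # 104
```

To run it against live hardware, feed the function the stdout of `subprocess.run(["nvidia-smi", "-q", "-d", "TEMPERATURE"], capture_output=True, text=True)` instead of the sample string.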
 
  • Like
Reactions: lightmanek

Saylick

Diamond Member
Sep 10, 2012
3,121
6,280
136
How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance

Cheaper (?) node with apparently poor characteristics.
Unbalanced design with too many compute units relative to rest.
Supply side disaster.
Terrible memory options. Too much or too little, choose.
I guess Nvidia just isn't used to having this much competition at the high end. AMD threw a wrench into their system with RDNA 2 and now they are scrambling to figure out a revised line-up to stay competitive while also trying to buoy their profit margins. JHH would like another Ferrari but something has to give when AMD is squeezing his plums with a GPU that likely has better yields, is likely easier to supply, has better perf/W, and comes with more RAM. That "something" is AIB profit margins along with their relationship with AIBs, and it's not exactly a wise move since they are screwing over their distributors. As the saying goes, don't crap where you eat.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
One of the reasons nVidia's launches normally go so well is that they can basically release at their leisure. Except this year, when they have real competition. But it's looking like they were just way too early.

I think waiting would actually have been better. What's the point of an early launch if you have no inventory to sell? Anyone in the market for a new GPU will soon see the new AMD cards on the same charts and go from there; it's not like NV will avoid comparisons with AMD's cards. Being a bit later with a mediocre product is better than pissing off your customer base: the ones who could buy will be annoyed if a refresh with more VRAM happens, and the ones who couldn't got pissed off by the lack of inventory.
 
  • Like
Reactions: Stuka87 and Leeea

PhoBoChai

Member
Oct 10, 2017
119
389
106
It was a winning formula, but will it continue? I think NV knew that they had a problem once SS 7EUV got botched. They needed to get something out and regroup; trading on their great mind share. Like Intel, they can be on their back foot this generation - so long as the follow up (Hopper) delivers, they won't lose much momentum. It took three years of stagnation before Intel's bottom line started to be affected. No doubt the more nimble culture at NV will adjust quickly. Interesting times for gamers anyway, and little short term downside for NV's financials.

Unfortunately for NV, it does seem like RTG has got their act together and been given the funding they needed to excel. Early leaks for RDNA3 have it another >50% perf/w leap vs RDNA2. If NV didn't wake up when they saw RDNA1's potential, they will be in a world of hurt when it's RDNA3 vs Hopper. I.e., for all we know, Hopper could be Volta-esque, destined for datacenters only, with NV relying on Ampere to carry them, expecting no competition from RTG.

Intel is toast until '22, when Keller-influenced new CPU designs materialize as products.
 
  • Like
Reactions: KompuKare

Ajay

Lifer
Jan 8, 2001
15,406
7,833
136
Unfortunately for NV, it does seem like RTG has got their act together and been given the funding they needed to excel. Early leaks for RDNA3 have it another >50% perf/w leap vs RDNA2. If NV didn't wake up when they saw RDNA1's potential, they will be in a world of hurt when it's RDNA3 vs Hopper. I.e., for all we know, Hopper could be Volta-esque, destined for datacenters only, with NV relying on Ampere to carry them, expecting no competition from RTG.

Intel is toast until '22, when Keller-influenced new CPU designs materialize as products.

Well, RDNA3's efficiency will be helped by the move to TSMC N5. Nvidia is likely doing the same for their next gen (Hopper, or whatever it's called); NV can't stay on the same architecture. I think NV was expecting competition from AMD, just not the perf/watt improvement that AMD achieved. Supposedly, NV engineers were surprised at the efficiency of RDNA1 on TSMC N7. Unlike Intel, I believe Nvidia is able to adapt much more quickly and will rebound next gen. My guess is that they have accelerated the timetable for their next-gen GPU - that will hurt their profits, but not as badly as losing to AMD two generations in a row would.

Intel is likely toast till HVM of their 7nm products hits in 2023 - and they won't catch up till 5nm, if that node doesn't get delayed also.
 

Leeea

Diamond Member
Apr 3, 2020
3,617
5,362
136
Early leaks for RDNA3 have it another >50% perf/w leap vs RDNA2.

And here I was hoping my* launch day 6800xt would be the bees knees for at least two years! :)

*the odds of me even seeing a 6800xt this month or next seem to be depressingly low. The hype train seems to have chuffed right into the clouds of fantasy for me.
 
  • Haha
Reactions: lightmanek

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Intel is likely toast till HVM of their 7nm products hits in 2023 - and they won't catch up till 5nm, if that node doesn't get delayed also.

In GPUs, Xe might still be behind RDNA2/Ampere in uarch, never mind RDNA3 and Hopper.

They are going to need another substantial leap in perf/watt, and possibly perf/mm2, before having any chance of competing with the two - and that's assuming process parity!
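For concreteness, here's what those two metrics look like with purely made-up numbers (both cards and all figures below are hypothetical):

```python
def perf_per_watt(fps, watts):
    """Performance per watt: frames per second per watt of board power."""
    return fps / watts

def perf_per_mm2(fps, die_mm2):
    """Performance per unit die area: fps per square millimetre."""
    return fps / die_mm2

# Hypothetical: card A does 100 fps at 300 W on a 520 mm^2 die;
# card B does the same 100 fps at 250 W on a 320 mm^2 die.
a_ppw, b_ppw = perf_per_watt(100, 300), perf_per_watt(100, 250)
print(f"B leads perf/W by {b_ppw / a_ppw - 1:.0%}")  # B leads perf/W by 20%
```

The same arithmetic applies to perf/mm2: the smaller die at equal performance is cheaper to make, which is exactly the gap a third vendor has to close.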
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
In GPUs, Xe might still be behind RDNA2/Ampere in uarch, never mind RDNA3 and Hopper.

They are going to need another substantial leap in perf/watt, and possibly perf/mm2, before having any chance of competing with the two - and that's assuming process parity!

Intel doesn't need to compete at the high end, or even the upper mid-range. The VAST majority of GPUs sold are low- to mid-range cards. That's where Intel needs to compete. And they don't even have to compete with cards sold at retail. Most GPUs are sold via OEMs, which Intel already has strong ties with.
 
  • Like
Reactions: Mopetar and ozzy702

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Intel doesn't need to compete at the high end, or even the upper mid-range. The VAST majority of GPUs sold are low- to mid-range cards. That's where Intel needs to compete. And they don't even have to compete with cards sold at retail. Most GPUs are sold via OEMs, which Intel already has strong ties with.

Competing on price with an inferior uarch is a losing battle.

And as the third vendor, they have to be above everyone else just to get everyone's attention and have a shot at taking any marketshare.
 

coercitiv

Diamond Member
Jan 24, 2014
6,176
11,808
136
And they don't even have to compete with cards sold at retail. Most GPUs are sold via OEMs, which Intel already has strong ties with.
The last time Intel tried entering a new market, competing on cost through their strong OEM ties, it ended with billions in losses and a complete retreat. And that was during the last stage of their golden era, when their brand image was intact. "Intel inside" meant a lot more back then than it does now.

Just as @IntelUser2000 said, they need the upper hand in perf/watt or perf/mm2 to establish themselves, otherwise they will burn money like they did with Atom on mobile and give up sooner rather than later.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,226
136
Competing on price with an inferior uarch is a losing battle.

And as the third vendor, they have to be above everyone else just to get everyone's attention and have a shot at taking any marketshare.

Especially for Intel, who has grown to expect fat margins, fatter than its leaner competitors'.

Inferior tech means higher production costs to deliver the same result, and then you have to charge less than competitors to sell - a double hit to margins. Factor in the leaner margins the competitors started with, and it's a triple hit.
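To put rough numbers on that compounding (all figures below are hypothetical): gross margin is (price - cost) / price, so a higher cost and a lower price hit it from both ends.

```python
def gross_margin(price, cost):
    """Gross margin as a fraction of the selling price."""
    return (price - cost) / price

# Hypothetical: incumbent sells at $300 with a $150 build cost.
incumbent = gross_margin(300, 150)   # 0.50

# Third vendor on inferior tech: a bigger die for the same performance
# raises cost to $180, and they must undercut to $260 to win the sale.
newcomer = gross_margin(260, 180)    # ~0.31

print(f"incumbent {incumbent:.0%}, newcomer {newcomer:.0%}")
```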

How long will Intel suffer lean margins before giving up?

I am skeptical Intel GPUs will ever really matter to game playing consumers. But I would be happy if they surprised to the contrary. More competition would be nice and actually it's most needed in the mid range, where prices are slow to move.
 
  • Like
Reactions: Elfear

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Competing on price with an inferior uarch is a losing battle.

And as the third vendor, they have to be above everyone else just to get everyone's attention and have a shot at taking any marketshare.

I didn't say competing on price. I meant competing on performance, just in low- to mid-range GPUs, not the high-end market. Trying to shovel an inferior product on price alone won't work. But if they come up with a GPU that competes with, say, an RTX 3050 on equal performance, they could push AMD and nVidia out of some OEM deals, specifically in the mobile space.
 

Mopetar

Diamond Member
Jan 31, 2011
7,826
5,969
136
Especially for Intel, who has grown to expect fat margins, fatter than its leaner competitors'.

Inferior tech means higher production costs to deliver the same result, and then you have to charge less than competitors to sell - a double hit to margins. Factor in the leaner margins the competitors started with, and it's a triple hit.

How long will Intel suffer lean margins before giving up?

I am skeptical Intel GPUs will ever really matter to game playing consumers. But I would be happy if they surprised to the contrary. More competition would be nice and actually it's most needed in the mid range, where prices are slow to move.

Intel doesn't sell any discrete cards right now, so they don't have any margins. While I'm sure they'd like them to be similar to their CPUs', as long as they make a profit and can sustain the cost of developing their next generation of GPUs, they're making more money than they would be by not doing so.

If their product is only good as a bargain-basement option, then it's better for them to take that on slim margins than to not sell anything at all. If they had the capability to release a killer product that takes the crown right out of the gate, they would have had far better APU graphics for years. As long as they can cover their costs until they can get better, it's fine for them to not be the best.
 

Hitman928

Diamond Member
Apr 15, 2012
5,232
7,773
136
Intel doesn't sell any discrete cards right now, so they don't have any margins. While I'm sure they'd like them to be similar to their CPUs', as long as they make a profit and can sustain the cost of developing their next generation of GPUs, they're making more money than they would be by not doing so.

If their product is only good as a bargain-basement option, then it's better for them to take that on slim margins than to not sell anything at all. If they had the capability to release a killer product that takes the crown right out of the gate, they would have had far better APU graphics for years. As long as they can cover their costs until they can get better, it's fine for them to not be the best.

What you say makes sense, but it won't go over well in the boardroom or at the quarterly earnings report, I guarantee you that.
 
  • Like
Reactions: KompuKare and lobz

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
Where Intel really needs to compete is on perf/watt. A top-end halo card really doesn't mean a lot, but if they can manage better integration with their CPUs and give laptops better battery life, that will be a win.
 

Mopetar

Diamond Member
Jan 31, 2011
7,826
5,969
136
What you say makes sense, but it won't go over well in the board room or at quarterly earnings report, I guarantee you that.

Are the board members so stupid that they'd turn down an increase in net profit? I don't doubt they have some idiots given the current state of Intel, but if Intel can make their graphics division self-sustaining then they'd be utter fools to kill it for no other reason than it can't hit margin levels for their long-standing core business area that didn't have any real competition for half a decade.
 

Hitman928

Diamond Member
Apr 15, 2012
5,232
7,773
136
Are the board members so stupid that they'd turn down an increase in net profit? I don't doubt they have some idiots given the current state of Intel, but if Intel can make their graphics division self-sustaining then they'd be utter fools to kill it for no other reason than it can't hit margin levels for their long-standing core business area that didn't have any real competition for half a decade.

Making money at a place like Intel doesn't mean anything; they make more money than they know what to do with. What the boardroom / stockholders want to see is them making that money at high profitability, so they can share in the profits through dividends and share price.
 

Bouowmx

Golden Member
Nov 13, 2016
1,138
550
146
NVIDIA to enable in Ampere a feature similar to AMD Smart Access Memory*, but for any CPU.
*A feature that increases gaming performance on Radeon RX 6000 series cards when paired with a Ryzen 5000 series CPU.
Something to do with PCI configuration space and the BAR (base address register). I didn't even know this was a thing.
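For the curious: each PCI BAR advertises a window of device memory to the CPU, and resizing the BAR lets the CPU map all of VRAM at once instead of the traditional 256 MiB slice. A rough illustration - the sysfs `resource` file format is real Linux behaviour, but the addresses and window sizes below are made up, and this is not NVIDIA's implementation:

```python
def bar_sizes(resource_text):
    """Parse a Linux /sys/bus/pci/devices/<dev>/resource file:
    one 'start end flags' hex triple per BAR; size = end - start + 1."""
    sizes = []
    for line in resource_text.splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        sizes.append(end - start + 1 if end else 0)
    return sizes

# Hypothetical GPU: a 16 MiB register BAR plus a 256 MiB VRAM aperture;
# with a resized BAR, the second window could instead span all of VRAM.
text = ("0x00000000a0000000 0x00000000a0ffffff 0x0000000000140204\n"
        "0x0000006000000000 0x000000600fffffff 0x000000000014220c")
print([s // 2**20 for s in bar_sizes(text)])  # [16, 256]
```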

 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,776
7,102
136
How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance

Cheaper (?) node with apparently poor characteristics.
Unbalanced design with too many compute units relative to rest.
Supply side disaster.
Terrible memory options. Too much or too little, choose.

-Yeah, it's like the directive inside NV was to defy expectations and everyone in the company interpreted it the wrong way.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Has anyone heard of an Nvidia response to the 6800 XT? I haven't. The 3080 is already effectively discontinued, as expected, for not offering enough memory for the price. The 3070 Ti is coming for $600 with 10GB. It's basically a 3080 for $100 cheaper and will try to compete with the 6800, but will be more expensive and have only 10GB compared to 16. The next card up is the rumored 3080 Ti with 20GB for $1000 to compete with the 6900 XT. That makes sense, but what will Nvidia do about the 6800 XT? The 6800 XT is the best card that makes the most sense of all of them and Nvidia has nothing to combat it. Their cards don't have enough ram or performance unless you spend $1000 or more. That doesn't do anything to address the 6800 XT with 3080+ performance and 16GB for only $650.
 
  • Like
Reactions: krumme