Discussion: Ada/'Lovelace'? Next-gen Nvidia gaming architecture speculation


maddie

Diamond Member
Jul 18, 2010
They will probably have around two years to do it though. And Intel is only competing on the low end and I would suspect they will need 1-2 years or more to get volume on mid and high end. Given the backlog of people wanting to upgrade, and AMD being on TSMC as well, I don't suspect NV is in all that much trouble.

However, I can imagine they might suffer on some quarterly results because sales might be 'slow', but I don't think selling the chips at acceptable prices over the lifetime of the 4xxx series will be an issue.
I don't see them in trouble. I'm just arguing against the assumption that prices are inevitably headed even higher, because... Even with lower margins, Nvidia will still be profitable. The share price might correct, but the company is very sound.
 

gdansk

Platinum Member
Feb 8, 2011
Nvidia has both products on N5 derived processes. I doubt data center demand will go down.

Consumer demand will be high if you can get a cut-down 102 for $700 again. It was shown there's enough disposable income to easily support a $1200 MSRP. If they target a low MSRP they're stuck giving profits to their partners again. They will price high and fix it with a refresh if sales aren't brisk enough.

Demand for Hopper is enough to eat most of the $10 billion contract over two years.
 

amenx

Diamond Member
Dec 17, 2004
I think consumers will simply change their buying habits if prices are too high. Upgrading every gen is no longer a viable option for me. I can easily skip a gen without breaking a sweat, and the overall cost remains the same as buying every gen when prices were cheaper. Console owners don't seem to mind several-year stretches between upgrades, and I think GPU owners may slowly develop a similar buying pattern.
 

Frenetic Pony

Senior member
May 1, 2012

See, this is why GPU prices are probably going to come down. Crypto appears to be a dead horse for the foreseeable future, so the only demand left is from gamers, and they need convincing that their hobby is within reach again.

You can already get a 6900 XT appreciably under MSRP, and the 6750 XT is in stock at MSRP. That Nvidia cards still sell above MSRP shows how well their PR works: they're remarkably effective at convincing people that functionally near-equivalent products somehow aren't, thanks to whatever they're spinning now. It really is amazing how price-insensitive they can make people, convincing them that entirely arbitrary driver labels or something will ruin their lives. Though I do think half the "upgrade all the time" market is buying purely for the fantasy of prestige rather than actually running games at all. But hell, that's why 99% of fast cars are sold too: no, you're never going to drive it fast, and no, you wouldn't be good at it if you did, and you'd probably die. But the fantasy that you could, and are a cooler, more fulfilled person for it, is powerful.

And thus the point remains that it's gamers buying these things up, not crypto miners, who don't care about anything but hash rate versus price.

With the chip shortage slowly ending for high-end nodes (we can see Sony ramping up PS5 production "significantly" this year, etc.), we'll be getting good GPU prices. Heck, maybe games that actually make use of such GPUs will be available soon as well :eek::p
 

Dribble

Platinum Member
Aug 9, 2005
I think Nvidia/AMD must be very aware of all those on older cards (e.g. 1060/1070) who haven't upgraded through the whole mining craziness, and simply won't spend $500+ on a GPU, particularly in these harder times. That's the major upgrade market. They are going to be looking for something new, and they aren't as easy a sell as the enthusiasts for whom GPUs are one of their top disposable purchases. I also doubt the much-quoted monster mining sell-off will happen - it didn't really last time.
 

jpiniero

Lifer
Oct 1, 2010

There's always Ampere, and they could do rebrands if necessary.
 

Mopetar

Diamond Member
Jan 31, 2011
Nvidia makes sure you get less for your money every generation.

Nonsense. Point to a generation where performance per dollar regressed, or even came close to doing so. If you believe your dollar is more and more worthless, take it up with the people running the printing presses.

Even if you wanted to point to the recent shortages caused by miners and the pandemic, those resulted in across-the-board increases, and even 5-year-old used cards were selling for more than their original MSRP.

Once you go back and adjust for inflation it's pretty obvious that even the top-end cards with their massive prices represent better value for money considering they deliver exceptional performance without the need for multiple cards. The capabilities have advanced enough that most people now consider 60 FPS to be a bare minimum acceptable level of performance and some people don't want to dip below 120 if they can help it.
 

maddie

Diamond Member
Jul 18, 2010
moonbogg has been having fun trolling us for the past several weeks. I keep waiting to see him "out-outrageous" his previous outrageous post, aaaand repeat.
 

gdansk

Platinum Member
Feb 8, 2011
The 2080 Ti offered less performance per dollar than the 1080 Ti, didn't it? About 30% faster for about 70% more money. Turing as a whole was pretty poorly priced until the Super refresh, and that was after the first coin crash.
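The arithmetic behind that comparison is easy to sketch. A rough check, assuming the launch MSRPs of $699 (1080 Ti) and $1,199 (2080 Ti) and the ballpark "30% faster" figure quoted above:

```python
# Rough performance-per-dollar comparison between two GPU generations.
# Relative performance and prices are the approximations from the post,
# not measured benchmark data.
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

gtx_1080_ti = perf_per_dollar(1.00, 699)   # baseline card
rtx_2080_ti = perf_per_dollar(1.30, 1199)  # ~30% faster, ~70% pricier

# Fraction of perf/$ lost moving from the 1080 Ti to the 2080 Ti
regression = 1 - (rtx_2080_ti / gtx_1080_ti)
print(f"perf/$ regression: {regression:.0%}")  # roughly a quarter worse
```

Under those assumptions the 2080 Ti comes out around 24% worse in performance per dollar, which is the kind of generational regression being argued about here.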

Personally I appreciate @moonbogg's level-headed analysis of the GPU market. It's some of the most predictable.
 

Mopetar

Diamond Member
Jan 31, 2011

I think it's Poe's law (though I'm not certain) that says at some point it becomes impossible to distinguish between extremist viewpoints and satire of them.

Even if he were being satirical, he'll eventually attract a crowd that echoes the same message, only with complete sincerity. Maybe he's gone on that warpath himself. Someone ought to propose a wager involving cat food just to see if he's serious.
 

jpiniero

Lifer
Oct 1, 2010

Speculation that the 4070 will have its bus cut to 160-bit, which means 10 GB with 2 GB chips. That L2 cache is going to have to be pretty magical for this to approach 3080 Ti performance.
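The capacity math follows directly from the bus width. A small sketch, assuming the standard 32-bit interface per GDDR6 chip and 2 GB (16 Gb) per chip:

```python
# VRAM capacity implied by a memory bus width, assuming each GDDR6
# chip has a 32-bit interface and a 2 GB density (both typical values).
def vram_gb(bus_width_bits, gb_per_chip=2, bits_per_chip=32):
    chips = bus_width_bits // bits_per_chip
    return chips * gb_per_chip

print(vram_gb(160))  # rumored 4070: 5 chips -> 10 GB
print(vram_gb(192))  # 3060-style 192-bit bus: 6 chips -> 12 GB
print(vram_gb(320))  # the "real 4080" guess below: 10 chips -> 20 GB
```

This is why a 160-bit rumor locks the card to 10 GB with current 2 GB chips; only clamshell mounting or higher-density chips would change that.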
 

MrTeal

Diamond Member
Dec 7, 2003
It would really be shocking if the 4080 launched with GDDR6 and only 76% of the memory bandwidth of the 3080. A 4070 on GDDR6 is less surprising, but having the same bandwidth as a 3060 with twice the cores...
You're right, that would need to be some impressive cache.
 

amenx

Diamond Member
Dec 17, 2004
Has anyone considered the possibility Nvidia may be leaking false info to throw off the competition?
 

jpiniero

Lifer
Oct 1, 2010

I assume the model configurations are real but the model naming is pure speculation. Even so, it's probably pretty close to the end result.
 

GodisanAtheist

Diamond Member
Nov 16, 2006

- All rumors should really be considered false until they're not. We've seen an impressive display of guesswork for both RDNA 3 and Ada to date.

The fun comes from assuming they're true.
 

amenx

Diamond Member
Dec 17, 2004
OK, more guesswork... :D
If the 4070 is a 160-bit/10 GB card, it may turn out to be a 4060 in the end. Similarly, the rumored "4080" may really be a 4070, and the "real" 4080 may be a 320-bit/20 GB card.
 

Mopetar

Diamond Member
Jan 31, 2011

Navi 21 has 66% of the bus width of GA102 (and closer to 55% of the memory bandwidth, due to the slower VRAM), and AMD is able to hang with Nvidia.

It's hardly unreasonable for NVidia to be able to see similar results from adding more cache of their own. My suspicion is that they have some versions of the card with faster memory to upsell consumers.
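For anyone checking those percentages, the bandwidth arithmetic works out as follows, assuming 256-bit/16 Gbps GDDR6 on Navi 21 and 384-bit/19.5 Gbps GDDR6X on GA102:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.
# Bus widths and data rates here are the commonly cited figures for
# the reference 6900 XT (Navi 21) and 3090 (GA102).
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps  # result in GB/s

navi21 = bandwidth_gb_s(256, 16.0)   # 512 GB/s
ga102 = bandwidth_gb_s(384, 19.5)    # 936 GB/s

print(f"bus width ratio: {256 / 384:.0%}")       # about two-thirds
print(f"bandwidth ratio: {navi21 / ga102:.0%}")  # just over half
```

The gap between the ~67% bus-width ratio and the ~55% bandwidth ratio is exactly the slower-VRAM effect mentioned above, and it's the deficit the Infinity Cache has to cover.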
 

VirtualLarry

No Lifer
Aug 25, 2001
I also doubt the much quoted monster mining sell off will happen - it didn't really last time.
I'm starting to wonder. WCCFTech had an article about Asian internet cafes / miners selling off their worn-out inventory of cards really cheap, and how some of them actually had defective RAM chips! (Which apparently can be mapped out if you're mining; I don't know about gaming.)

I think 1660-family cards, which are low-power and efficient, might be worth getting in a big sell-off for perhaps $100; I doubt they will go cheaper... but they might. I think RX 580 8GB cards might hit $50, more so if they have a busted fan or two.

Of course, everybody wants an RTX 3080 / 12GB / Ti, for not mucho dinero.
 

swilli89

Golden Member
Mar 23, 2010

If crypto never existed, I kid you not, a 3090 would be $500-600 by now. Samsung provided Nvidia with a huge discount on its wafers, and while the Samsung 8nm process isn't very energy efficient or high-clocking, it IS fairly dense, and they can fit a lot on one wafer.
 

MrTeal

Diamond Member
Dec 7, 2003
That seems unlikely, considering I just had to spend $150 buying two Raspberry Pi Zero Ws yesterday. Even without mining there's still huge pressure on chip inventories, and prices are high across the board. Beyond that, mining revenue was down and there was no global chip shortage when Nvidia released the 2080 Ti in 2018 for $1,200, and outside of a couple of people panic-selling before the 3070 launch, even used prices for it never went below $500.

Who knows, though. You might actually be able to get a 3090 for that price by Christmas if the 4000 series is much better than expected and there's nothing at all profitable to switch to once Eth goes PoS, simply because miners will need to clear out inventory.