Question RTX 4000/RX 7000 price speculation thread


moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
My prediction: The entire generation will be 2-3X MSRP on eBay and at retailers. The RTX 3000 series will be sold alongside the 4000 series, because only the few willing to pay $1500 for what should be a $300 RTX 4060 will be buying RTX 4000 cards. Not enough supply to meet demand by a long shot; pricing will be through the Oort cloud. PC gaming is dead. Your thoughts?
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,189
126
If Nvidia thinks they're going to be able to charge higher, or probably even equal, prices for the next generation, they're smoking some serious stuff.

But the MORE you buy, the MORE you save... remember?
He made it so obvious... the MORE you buy, the MORE you save... I mean look at it... the MORE we buy, the MORE we save.

- Church of Nvidia
Book of Jensen Chapter 1 verse 1.


On a serious note....
I think it's gonna come down to AMD's true pricing.
Right now people are still buying overpriced video cards.
AMD has lowered some of their prices, but they're not pre-covid prices, and I don't think we'll ever see those again.
AMD seems to want to clear out the inventory they have from TSMC, and the RX 7000 series will probably come in limited quantities, so it will also see artificial inflation from lack of supply until all the RX 6000 series cards are gone from said inventory.
 
  • Like
Reactions: ZGR and Makaveli

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
I'll just state it outright. If Nvidia thinks they're going to be able to charge higher, or probably even equal, prices for the next generation, they're smoking some serious stuff. All indicators are pointing to an economic decline. Important to note is that this is happening in addition to the potential avalanche of used mining cards, which will only add to the downward pricing pressure. What they want and what they'll get are going to be very different.

Yeah, I also expect them to have to eat humble pie. The best bet for them to actually be able to ask these prices is hyperinflation.
 

pakotlar

Senior member
Aug 22, 2003
731
187
116
Yeah, I also expect them to have to eat humble pie. The best bet for them to actually be able to ask these prices is hyperinflation.

And yet Intel is raising prices. If nvidia has evidence that the elasticity of demand supports higher prices (that demand for nvidia GPUs is relatively inelastic), they’ll raise prices to offset rising supply chain costs.
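For anyone who wants the textbook definition behind that (this is standard econ notation, not anything Nvidia has published): price elasticity of demand is

E_d = \frac{\%\Delta Q_d}{\%\Delta P}

and demand is called inelastic when |E_d| < 1, meaning a price increase loses proportionally fewer sales than it gains per unit, so total revenue goes up. That's the bet a price hike would be making; whether GPU demand actually behaves that way is exactly what's in dispute.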
 

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
And yet Intel is raising prices. If nvidia has evidence that the elasticity of demand supports higher prices (that demand for nvidia GPUs is relatively inelastic), they’ll raise prices to offset rising supply chain costs.
The one market that should be insulated from economic troubles might be Server. It might even see greater growth as companies slash costs. Client, however, is going to see the biggest downward price pressure, and that's exactly where Intel has the best position. How many companies and individuals will upgrade as often when they have other monetary worries? Intel needs to keep the fabs running or else. Check back in a few months and see if the price increase sticks. The fabless companies are much better positioned for a downturn.

It seems to me that you, and some others, are using a couple of extremely distorted years of buying and pricing data to predict a rapidly changing future.

edit:
Just wanted to add that our European members, unless very old (and maybe not even then), will be facing the worst economic scenario of their lifetimes, sorry to say. Remember this post for the future.
 
  • Like
Reactions: Ranulf

Frenetic Pony

Senior member
May 1, 2012
218
179
116
And yet Intel is raising prices. If nvidia has evidence that the elasticity of demand supports higher prices (that demand for nvidia GPUs is relatively inelastic), they’ll raise prices to offset rising supply chain costs.

The demand for nvidia GPUs, and for GPUs in general, has already proven to be incredibly elastic; there's no point in stating otherwise.
 
  • Like
Reactions: GodisanAtheist

jpiniero

Lifer
Oct 1, 2010
14,591
5,214
136
I think nVidia believes it's a supply issue, mainly because they thought they would be able to continue to sell GA102 past Ada's launch. That's why the low end and mid-range aren't getting any price cuts, since there won't be a refresh there any time soon.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
I think nVidia believes it's a supply issue, mainly because they thought they would be able to continue to sell GA102 past Ada's launch.

That makes little sense. The production costs are simply too high to discount it to a low enough level. Nvidia also doesn't like discounting products too much and exposing, to people who only look at the new market (and ignore the second-hand one), how much those products lose value.

Keep in mind that the time between production and delivery to the shops is quite long, so once they noticed that the Ethereum downturn was permanent, there were already a huge number of cards in the pipeline.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
I think they figured they would continue to sell Ampere at MSRP to miners, while selling Ada to gamers.

Miners would also prefer the newer cards for their resale value, so this would only work if there was still a shortage and miners couldn't get anything else. But Nvidia predicted that the shortages would ease considerably.
 

Golgatha

Lifer
Jul 18, 2003
12,651
1,514
126
I'll just state it outright. If Nvidia thinks they're going to be able to charge prices higher or probably even equal for the next generation, they're smoking some serious stuff. All indicators are pointing to an economic decline. Important to note is that this is happening in addition to the potential avalanche of used mining cards, which will only add to the downward pricing pressure. What they want and what they'll get are going to be very different.

For real. Even right now I see several 12GB 3080 cards at my local Micro Center listed for $799. $1200 for a 3080 Ti is a pipe dream.
 
  • Like
Reactions: Ranulf

KompuKare

Golden Member
Jul 28, 2009
1,015
930
136
Miners would also prefer the newer cards for their resale value, so this would only work if there was still a shortage and miners couldn't get anything else. But Nvidia predicted that the shortages would ease considerably.
Maybe, but if all Ada cards were severely limited at mining, through a combination of purposefully blowing some fuses etc. and through relying on caches like RDNA2 does, then would miners want Ampere or Ada?

Them wanting the newer cards for resale value only makes sense if the new cards are actually able to mine well.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
Maybe, but if all Ada cards were severely limited at mining, through a combination of purposefully blowing some fuses etc. and through relying on caches like RDNA2 does, then would miners want Ampere or Ada?

There is no fuse that you can blow to merely stop mining. That's not how it works. The only things that work are to reduce the bus size (which impacts gaming), or to come up with a better variant of LHR and/or add LHR to the 4090.

Of course, that assumes that miners are still interested at that point.

Them wanting the newer cards for resale value only makes sense if the new cards are actually able to mine well.

Not if they want to sell them to gamers when they are done. This is especially relevant due to Proof of Stake.
 

KompuKare

Golden Member
Jul 28, 2009
1,015
930
136
Well I wasn't really thinking of anything as simple as LHR which Nvidia only added after Ampere was all finished. So presumably software/firmware.

However they have now had enough time to think about how to implement a true hardware lockout. So if it can be done - without affecting gaming loads - we may see it next gen. If for no other reason than they can then charge miners extra.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
Well I wasn't really thinking of anything as simple as LHR which Nvidia only added after Ampere was all finished. So presumably software/firmware.

However they have now had enough time to think about how to implement a true hardware lockout. So if it can be done - without affecting gaming loads - we may see it next gen. If for no other reason than they can then charge miners extra.

It can't be done. A videocard is specialized in certain kinds of calculations that happen to be needed both for rendering and for mining. You can't just make the videocard slow when doing a calculation for mining, but fast when doing that same calculation for rendering.

With LHR, the driver watches the calculations being run and reduces the memory bandwidth if it detects a pattern in those calculations that is typical of Ethereum mining. Mining a coin like Ravencoin produces a different pattern, which is why LHR doesn't affect it.

A common way to defeat LHR is to do a bit of mining and then do something else, then mine a bit and do something else, etc. Then the pattern is broken up so much that the LHR software in the driver no longer recognizes it as mining. Using this technique, you normally see a substantially lower hashrate because the mining software can't mine constantly, but has to put useless instructions in there too.
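To make that trade-off concrete, here's a minimal back-of-the-envelope sketch in Python. The numbers are made up purely for illustration (Nvidia never published the actual LHR thresholds); the point is just that the interleaving trick only wins if the mining duty cycle stays above whatever fraction LHR throttles you to, and it never gets you back to the card's full hashrate.

# Hypothetical numbers, purely for illustration of the trade-off described above.
FULL_HASHRATE_MH = 100.0    # assumed unrestricted hashrate in MH/s
LHR_CAP = 0.50              # assumed fraction of full speed once the driver detects mining
MINING_DUTY_CYCLE = 0.72    # assumed fraction of time spent mining when interleaving filler work

throttled_rate = FULL_HASHRATE_MH * LHR_CAP               # mine openly, get throttled by the driver
interleaved_rate = FULL_HASHRATE_MH * MINING_DUTY_CYCLE   # break up the pattern, lose the filler time

print(f"Throttled by LHR:        {throttled_rate:.0f} MH/s")
print(f"Interleaving workaround: {interleaved_rate:.0f} MH/s")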
 
  • Like
Reactions: igor_kavinski

sze5003

Lifer
Aug 18, 2012
14,182
625
126
I'm curious how much I'll be able to sell my EVGA FTW 3080 Ti for soon after the new series comes out. If stock is as bad as it was last time, I may come out alright, assuming EVGA does another queue shortly before release.
 

Timorous

Golden Member
Oct 27, 2008
1,613
2,766
136
True, but if you don't program for it, you get a hit on the extra cache about 60% of the time. I'll take 50% more bandwidth all the time over that. Also, it doesn't scale well to higher resolutions because it improves performance by sharing data across multiple frames. So, as I understand it (and it seems to show in benchmarks), you get a nice boost when you're at higher FPS, i.e. lower resolutions, but it levels off as you get to higher resolutions and lower FPS (fewer frames able to share data from the L2 cache). My issue here is I'm not buying a 3070+ or 6800+ class GPU to play anything under 1440p, personally. Now, a 4080 Ti with a 384-bit memory bus, some sort of L2 cache, and 16GB+ of VRAM is pretty exciting to me, and I think it would be a good 5+ year gaming card; it might even be worthy of 1080 Ti-type longevity comparisons later down the line.

Perfect example of the infinity cache helping AMD 6800 keep up with a nVidia 3080 at lower resolution, then get absolutely trounced as the resolution scales up.
View attachment 63909

Relatedly, the 3070, with the same 256-bit bus, is essentially neck and neck with the 6800 by the time you get to 4K resolution. Neither one of them has the bandwidth to really perform at that resolution. I think having 50% more memory bandwidth is why the 1080 Ti aged better than the regular 1080 as well; more resources are always more, after all. :)

This chart does not show what you think it does.

The 6800 and 2080 Ti show a similar performance drop-off going from 1080p to 4K, whereas the drop-off for the 3070 is lower (and at 4K, with just 8GB of RAM, there might be VRAM-limiting issues there). The 3080 is either really underutilised at 1080p (and it would be good to have a 6800XT in there to see) or is slightly bottlenecked. If it is just shader utilisation, though, then as you can see it shows very little performance drop for increased resolution.

This shows that at higher resolutions the 6800 (and 6800XT / 6900XT) is actually short of compute performance rather than memory bandwidth. This is why the 6950XT did not gain 12% or so performance from its 12.5% increased memory clock. TPU did a good test, and there the 6950XT has 6.4% more performance at 4K, achieved with a 7% higher average core clock and the stated 12.5% increased memory clock. As you can see, performance scaled nearly 1:1 with core clock.
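Rough arithmetic on those quoted figures (treating each knob in isolation, which is a simplification, and assuming the baseline is the 6900XT):

# 6950XT figures quoted above (TPU, 4K average).
perf_gain = 0.064   # +6.4% performance
core_gain = 0.070   # +7% average core clock
mem_gain  = 0.125   # +12.5% memory clock

print(f"Scaling vs core clock:   {perf_gain / core_gain:.2f}")  # ~0.91, close to 1:1
print(f"Scaling vs memory clock: {perf_gain / mem_gain:.2f}")   # ~0.51, nowhere near 1:1
# If the card were mainly bandwidth-limited at 4K, the second ratio would be the one near 1.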
 

biostud

Lifer
Feb 27, 2003
18,249
4,760
136
So any guesses on performance/$ improvements over current generation on the RTX 4080 vs 3080 and 7800XT vs 6800XT at current prices?
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,189
126
So any guesses on performance/$ improvements over current generation on the RTX 4080 vs 3080 and 7800XT vs 6800XT at current prices?

We talking about scalper prices?
Because I find it really hard to believe you will be able to get one without having to go through a scalper or some queue 4.0.

And scalpers have just about given up on the 30 series and 6000 series, so those prices are dropping like flies.
 

biostud

Lifer
Feb 27, 2003
18,249
4,760
136
We talking about scalper prices?
Because I find it really hard to believe you will be able to get one without having to go through a scalper or some queue 4.0.

And scalpers have just about given up on the 30 series and 6000 series, so those prices are dropping like flies.
Just queue 4.0 :p
 

Frenetic Pony

Senior member
May 1, 2012
218
179
116
We talking about scalper prices?
Because I find it really hard to believe you will be able to get one without having to go through a scalper or some queue 4.0.

And scalpers have just about given up on the 30 series and 6000 series, so those prices are dropping like flies.

C'mon man, what is this constant stuff here? Scalpers are gone, you can go buy stuff on eBay for a good deal below MSRP, and the world has changed again; it doesn't look like it's going back to the bad days anytime soon. We could all use the good news and be happy about it after two years of bad.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
C'mon man, what is this constant stuff here? Scalpers are gone, you can go buy stuff on eBay for a good deal below MSRP, and the world has changed again; it doesn't look like it's going back to the bad days anytime soon. We could all use the good news and be happy about it after two years of bad.

The instant Jensen announces a 4000 series product, that same product will magically appear on eBay at 2-3x MSRP, and retail stock will be nonexistent. It will stay that way for weeks, if not months. Price doesn't matter either. If it's a 4090 at a $2000 MSRP, it will be an eBay exclusive at $4500.