Question RTX 4000/RX 7000 price speculation thread

moonbogg

Lifer
Jan 8, 2011
10,574
2,943
136
My prediction: The entire generation will be 2-3X MSRP on eBay and at retailers. The RTX 3000 series will be sold alongside the 4000 series, because only the few willing to pay $1500 for what should be a $300 RTX 4060 will be buying RTX 4000. Not enough supply to meet demand by a long shot; pricing will be through the Oort cloud. PC gaming is dead. Your thoughts?
 

Frenetic Pony

Member
May 1, 2012
172
111
116
Economics 101: When demand greatly exceeds supply, those with the supply can charge more than they normally could. Thus profits will increase.

Now this could be because of collusion and conspiracy; it's happened before with memory, around what, the mid-to-late 2000s? Or because supply is constrained by other factors. So either the entire chip industry, multiple competitors across high end chips and car manufacturing and otherwise, dozens upon dozens of companies, have all gotten together at once to collude and artificially limit supply. Or supply of various things is simply low versus demand, and it takes a long, long time for supply to catch up.

Now here's an easy way to weed the truth out of most situations: which scenario sounds easier to pull off? Because all those billions and billions of people in the world aren't a whole lot different from you; given two different choices, they'd damn well rather take the much easier route.
 
  • Like
Reactions: Tlh97 and Mopetar

Timorous

Golden Member
Oct 27, 2008
1,154
1,777
136
True, but if you don't program for it, you get a hit on the extra cache about 60% of the time. I'll take 50% more bandwidth all the time over that. Also, it doesn't scale well to higher resolutions because it improves performance by reusing data across multiple frames. So, as I understand it (and it seems to show in benchmarks), you get a nice boost when you're at higher FPS, i.e. lower resolutions, but it levels off as you get to higher resolutions and lower FPS (fewer frames able to share data from the L2 cache). My issue here is I'm not buying a 3070+ or 6800+ class GPU to play anything under 1440p personally. Now a 4080 Ti with a 384 bit memory bus, some sort of L2 cache, and 16GB+ VRAM is pretty exciting to me, and I think it would be a good 5+ year gaming card; it might even be worthy of 1080 Ti type longevity comparisons later down the line.

Perfect example of the infinity cache helping the AMD 6800 keep up with an nVidia 3080 at lower resolutions, then get absolutely trounced as the resolution scales up.
[attached benchmark chart: 6800 vs 3080 scaling with resolution]

Relatedly, the 3070, with the same 256 bit memory bus, is essentially neck and neck with the 6800 by the time you get to 4K. Neither one of them has the bandwidth to really perform at that resolution. I think having 50% more memory bandwidth is why the 1080 Ti aged better than the regular 1080 as well; more resources are always more, after all. :)
This chart does not show what you think it does.

The 6800 vs 2080 Ti shows a similar performance drop-off going from 1080p to 4K, whereas the drop-off for the 3070 is lower (and at 4K, with just 8GB of VRAM, there might be VRAM limitations at play). The 3080 is either really underutilised at 1080p (it would be good to have a 6800XT in there to see) or is slightly bottlenecked. If it is just shader utilisation, though, then as you can see it shows very little performance drop as resolution increases.

This shows that at higher resolutions the 6800 (and 6800XT / 6900XT) is actually short of compute performance rather than memory bandwidth. This is why the 6950XT did not gain 12% or so in performance from its 12.5% higher memory clock. TPU did a good test, and there the 6950XT has 6.4% more performance at 4K, achieved with a 7% higher average core clock on top of the stated 12.5% higher memory clock. As you can see, performance scaled nearly 1:1 with core clock.
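
(A quick back-of-envelope version of that scaling check in Python, for anyone who wants to rerun it; the percentages are the ones quoted from the TPU test above, and the "scales with whichever resource it's bound by" framing is a deliberate simplification.)

```python
# Back-of-envelope check of the scaling argument above. The percentages are
# the ones quoted from the TPU 6950XT test at 4K.
core_clock_gain = 0.07   # ~7% higher average core clock vs the 6900XT
mem_clock_gain = 0.125   # 12.5% higher memory clock
perf_gain_4k = 0.064     # ~6.4% more measured performance at 4K

# A purely compute-bound card would scale with the core clock,
# a purely bandwidth-bound one with the memory clock.
print(f"perf gain / core clock gain: {perf_gain_4k / core_clock_gain:.2f}")  # ~0.91, nearly 1:1
print(f"perf gain / mem clock gain:  {perf_gain_4k / mem_clock_gain:.2f}")   # ~0.51
```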
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,407
2,105
136

$1,599 on a 4090..... those are miner prices.... sigh...
$1,199 on a 4080 16GB. lolz... again, miner 3080 Ti prices.
$899 on a 4080 12GB.

WTF did the Ethereum merge wiping out all the GPU miners do to prices?
Absolutely nothing.

I'm with many others and hope Ethereum bottoms out and evaporates like Terra did, and VB goes to jail or at least faces a massive lawsuit on an epic level.
ETH is no longer a factor in GPU pricing as of 6 days ago. This is purely nvidia greed. They're trying to clear out 30 series by artificially inflating 40 series MSRP. I hope their sales are mediocre.
 

xpea

Senior member
Feb 14, 2014
428
131
116
The die sizes are smaller, overall, for AMD chiplets, which means you get more dies per wafer. That is the whole point of going chiplet. It's easier to get plenty of 350 mm2 dies from one wafer than it is to get one large 700 mm2 die that you then have to sell to customers.

That's the main difference.
I did a quick check and you are wrong.
AD102 die 600mm2 (24x25):
[attachment: AD102 die-per-wafer calculator screenshot]

You get 82 good dies per wafer for RTX4090

Navi31 GCD die 350mm2 (18x19):
[attachment: Navi31 GCD die-per-wafer calculator screenshot]

You get 156 good dies, which is 78 RX 7900 XTs, and you must still add the IO die + interconnect.

In this particular case, monolithic is better.
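
(For anyone without the calculator handy, here is a rough Python sketch of the standard gross-die approximation on a 300mm wafer. It deliberately ignores scribe lines, edge exclusion and yield, which the calculator screenshots above account for, so it lands a little above the "good die" counts quoted.)

```python
import math

def gross_dies_per_wafer(die_w_mm, die_h_mm, wafer_d_mm=300):
    """Candidate (gross) dies per wafer via the common approximation
       N ~ pi*(d/2)^2 / A - pi*d / sqrt(2*A).
    Scribe lines, edge exclusion and defect yield are ignored here."""
    area = die_w_mm * die_h_mm
    return int(math.pi * (wafer_d_mm / 2) ** 2 / area
               - math.pi * wafer_d_mm / math.sqrt(2 * area))

print(gross_dies_per_wafer(24, 25))  # ~90 candidates for a 600mm2 AD102-class die
print(gross_dies_per_wafer(18, 19))  # ~170 candidates for a ~350mm2 Navi31-GCD-class die
```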
 
  • Wow
Reactions: Grazick and psolord

maddie

Diamond Member
Jul 18, 2010
4,492
4,204
136
And yet Intel is raising prices. If nvidia has evidence that the elasticity of demand supports higher prices (that demand for nvidia GPUs is relatively inelastic), they’ll raise prices to offset rising supply chain costs.
The one market that should be insulated from economic troubles might be Server. It might even see greater growth as companies slash costs. Client, however, is going to see the biggest downward cost pressure, exactly where Intel has the best position. Exactly how many companies and individuals will upgrade as often with other monetary worries? Intel needs to keep the fabs running or else. Check back in a few months and see if the price increase sticks. The fabless companies are much better positioned for a downturn.

It seems to me that you and some others are using a couple of extremely distorted years of buying and pricing data to predict a rapidly changing future.

edit:
Just wanted to add that our European members, unless very old (and maybe not even then), will be facing the worst economic scenario of their lifetimes, sorry to say. Remember this post for the future.
 
  • Like
Reactions: Ranulf

GodisanAtheist

Diamond Member
Nov 16, 2006
4,782
4,109
136
I am just curious. Are some of the Ampere deals being posted for the 3080 and up selling out? Before the crazy crypto pricing, the 3080ti was $599 right around the time the 2080ti was released. I am on the fence with graphics cards. I need one but my need is not urgent.

There is supposedly a 3080 that is a rebranded 3080 Ti/3090 Ti variant with 20GB of VRAM. The chip will not qualify for the top line GPUs but does just fine as a 3080. The 20GB of video RAM is Micron stuff. I cannot remember why it's 20GB of VRAM or if the GPU is actually out in the wild; something about the PCB having extra VRAM lanes/slots. Otherwise they would have to junk the chips. So Nvidia is supposedly making a 3080 with 20GB of VRAM.

TSMC would not let Nvidia out of their massive silicon allotment, which means they really cannot delay the 4000 series cards. In addition, they have a ton of unsold GPUs looking for new homes.

Definitely interesting times for Nvidia, moving from Samsung silicon to TSMC 4nm. The rumors are the 4000 series cards have massive core clock speed increases over Ampere, and the performance gains are supposed to be huge. I have also heard the RDNA 3 GPUs are supposed to offer huge performance gains as well.

So if I go the 4000 series route, it would be the 4070 for me. I need 16GB or more of VRAM because I have a triple monitor setup.
- Just. Wait. Nothing in the current gen is going to be worth more than $500 after the next gen releases, no matter how much it "sells" for currently. IMO, if you don't want to wait another 3 months post launch for supplies to stabilize, primo buying time is likely going to be October/November. I anticipate Christmas shopping season firesales on current gen to make room for mainstream next gen.
 

Hans Gruber

Golden Member
Dec 23, 2006
1,709
798
136
The 2080Ti dropped like a rock once the 3080 was released as well. It would be nice to see something similar with 30 series when 40 series launches, although I wouldn't be surprised if 4090 is the only one launched this year so they don't have to cut prices too much or too soon. Also wouldn't be surprised if the price of a 4090 was $2k.
Straight from AnandTech, circa 2017.

I am trying to add a little context of what the graphics card market was like before the crypto craziness. The MSRP was $599 for the 1080 and then dropped to $499 when the 1080Ti was released.

I guess Nvidia can say what they want about inflation. If people are not buying because they are burned out from the crazy GPU prices, they can explain it to their shareholders at the meeting. The economy is in the tank.

I plan on watching the GPU market for the next 3 or 4 months. I am 6-8 months out from Zen 4 or another upgrade. The GPU comes first.
 

Justinus

Diamond Member
Oct 10, 2005
3,031
1,343
136
I don't know if that's going to be the case. We're already seeing a massive influx of GPU stock, and GPUs staying in stock at rapidly decreasing prices. The market may not be back to normal by the time the new GPUs launch, but I think it will have changed significantly from the landscape of the previous launch in late 2020.
 

Shmee

Memory and Storage, Graphics Cards
Super Moderator
Sep 13, 2008
6,568
1,804
136
I don't know for sure, but I do kinda think prices should be decent, given the supposed performance increase for both series. My concern is the rumors of crazy power draw and new PSU connector types on some cards, which would be a non-starter for me. The 3090's power draw is bad enough; I feel that should be the max for a single card out of the box.

Also, I saw a deal today, an MSI custom cooled 6900XT for $1049 at Microcenter. That actually sounds really good.
 
  • Like
Reactions: CP5670 and Tlh97

Ajay

Lifer
Jan 8, 2001
12,343
5,891
136
Welp, it'll depend on launch volume and mining performance. I'd expect a ~20% price hike just due to economic conditions. But if the new cards mine at twice the MH/s per watt - doomsday :(.
 

moonbogg

Lifer
Jan 8, 2011
10,574
2,943
136
I expect launch volume to be a joke, and the cards will definitely make mining profitable again. The hash rate will be insane and will make up for the current loss in profits. If a 4060 performs anywhere near a 3080 and has an MSRP close to $400, can you imagine the scalpers and miners gouging each other's eyes out in front of Best Buy during the drops? I can. That 4060 will sell on eBay for more than a 3080 does right now. These cards won't be available for at least 6 months after release, assuming Ethereum mining dies from proof of juicy steak.
 

biostud

Lifer
Feb 27, 2003
17,299
3,404
126
I do see the top cards as an alternative to SLI/CF, and priced accordingly. But as long as the price/performance/power consumption ratio is reasonable, I don't have a problem with that, especially as it will work in every game without microstutter problems.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,407
2,105
136
I expect launch volume to be a joke, and the cards will definitely make mining profitable again. The hash rate will be insane and will make up for the current loss in profits. If a 4060 performs anywhere near a 3080 and has an MSRP close to $400, can you imagine the scalpers and miners gouging each other's eyes out in front of Best Buy during the drops? I can. That 4060 will sell on eBay for more than a 3080 does right now. These cards won't be available for at least 6 months after release, assuming Ethereum mining dies from proof of juicy steak.

ETH successfully merged to PoS on the final testnet. I know PoS has been delayed for years, but it finally looks to be on track to be implemented this year. This looks to happen before the AMD 7xxx and nvidia 40xx releases, so no matter what hashrate the 40 series has, there will be no coin to mine; mining will no longer be an issue.
 

CP5670

Diamond Member
Jun 24, 2004
5,308
482
126
I don't know for sure, but I do kinda think prices should be decent, given the supposed performance increase for both series. My concern is the rumors of crazy power draw and new PSU connector types on some cards, which would be a non-starter for me. The 3090's power draw is bad enough; I feel that should be the max for a single card out of the box.
I wonder about this too. It seems like the 4090 will essentially need a new platform to support its power use, including a new PSU and maybe an AIO cooler as standard.

To be honest, the existing cards already run games very well. I'm not sure what I would even play on a new card that I can't currently play. Maybe VR simracing or RT without DLSS.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,407
2,105
136
I wonder about this too. It seems like the 4090 will essentially need a new platform to support its power use, including a new PSU and maybe an AIO cooler as standard.

To be honest, the existing cards already run games very well. I'm not sure what I would even play on a new card that I can't currently play. Maybe VR simracing or RT without DLSS.
Yeah, the power rumors at the top end are concerning. It's like being back in the Fermi days. However, midrange cards typically tend to have much more reasonable power consumption since they're not clocked/overvolted to the hilt. The 3070 had roughly 2080 Ti performance while drawing 60W less. There is a good chance the 4060 Ti/4070 will have better power efficiency numbers than the 30 series.
 

biostud

Lifer
Feb 27, 2003
17,299
3,404
126
Remember, there were once also tri and quad CrossFire/SLI setups. They also consumed quite a lot of power.
 

Saylick

Platinum Member
Sep 10, 2012
2,363
4,439
136
Remember, there were once also tri and quad CrossFire/SLI setups. They also consumed quite a lot of power.
Lol, I remember the days of dual-GPU/ROG Ares cards. They just threw power consumption out the window for that extra 50% fps at like double the power. On the bright side, next gen cards appear to be better than 2x power for 1.5x performance: top Lovelace appears to offer the same perf/W as Ampere with 2x perf at 2x power, and top RDNA3 appears to be a little more power efficient with 2.5x perf at 1.5-2x power.
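
(Putting those multipliers into perf/W terms with a trivial Python sketch; the multipliers are the rumored figures from the post above, not measurements.)

```python
# Relative perf/W implied by the rumored figures above.
def rel_perf_per_watt(perf_mult, power_mult):
    return perf_mult / power_mult

print(rel_perf_per_watt(1.5, 2.0))  # old dual-GPU cards: ~0.75x single-GPU perf/W
print(rel_perf_per_watt(2.0, 2.0))  # top Lovelace rumor: ~1.0x Ampere perf/W
print(rel_perf_per_watt(2.5, 2.0))  # top RDNA3 rumor at 2x power: ~1.25x
print(rel_perf_per_watt(2.5, 1.5))  # top RDNA3 rumor at 1.5x power: ~1.67x
```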
 

moonbogg

Lifer
Jan 8, 2011
10,574
2,943
136

ETH successfully merged to PoS on the final testnet. I know PoS has been delayed for years, but it finally looks to be on track to be implemented this year. This looks to happen before the AMD 7xxx and nvidia 40xx releases, so no matter what hashrate the 40 series has, there will be no coin to mine; mining will no longer be an issue.
What's stopping someone from just making another mineable coin? They will call it Derpthereum and everyone will completely lose their minds and start selling their houses, harvesting their children's organs, robbing banks etc just to get GPU money so they can mine. Then scalpers and miners will engage in hand to hand combat in front of Best Buy again, ripping each other's throats out and tea bagging the fallen all for a 2% chance at getting a card at MSRP so they can flip it for 3X the money or mine on it. You can't see that happening? It's definitely happening. They can just keep making another coin. Why not?
 
  • Haha
Reactions: Tlh97 and psolord

Justinus

Diamond Member
Oct 10, 2005
3,031
1,343
136
What's stopping someone from just making another mineable coin? They will call it Derpthereum and everyone will completely lose their minds and start selling their houses, harvesting their children's organs, robbing banks etc just to get GPU money so they can mine. Then scalpers and miners will engage in hand to hand combat in front of Best Buy again, ripping each other's throats out and tea bagging the fallen all for a 2% chance at getting a card at MSRP so they can flip it for 3X the money or mine on it. You can't see that happening? It's definitely happening. They can just keep making another coin. Why not?
I won't claim to be a cryptocurrency expert by any means, but there's a darn lot more going on than "here's a new GPU-mineable currency, everyone jump aboard the profit train, choo-choo".
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,438
126
Even if the prices are high initially, they won't stay there for long.

Intel will be releasing their GPUs this summer, and Ethereum miners should be unloading their stockpiles around the same time, once that crypto migrates to Proof of Stake.

I'm hoping that the confluence of those two events will cause a supply glut and much cheaper prices.
 
