Question RTX 4000/RX 7000 price speculation thread


moonbogg

Lifer
Jan 8, 2011
My prediction: the entire generation will be 2-3X MSRP on eBay and at retailers. The RTX 3000 series will be sold alongside the 4000 series, because the only people buying RTX 4000 cards will be the few willing to pay $1500 for what should be a $300 RTX 4060. Supply won't come close to meeting demand, and pricing will be through the Oort cloud. PC gaming is dead. Your thoughts?
 

ultimatebob

Lifer
Jul 1, 2001
I think with how tumultuous SKU designations have been across the last few generations, it will be extremely hard to predict where the actual designations will lie, both performance- and price-wise. Given that the 2080 was $800 and the 3080 was $700 (FE MSRPs), I think it would be a stretch to think Nvidia will take that designation and price it at more than double historical standards.

That being said, I don't have a better guess at what is going to happen. IMO all we can do is wait and see.

I based my price estimate off of what RTX 3080s are going for on eBay now, plus a 10% premium to account for the performance boost over the prior generation.
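That's nothing fancy; as a back-of-envelope sketch in Python (the eBay figure here is a made-up placeholder, not an actual quote), it's just one multiply:

```python
# Back-of-envelope: next-gen street price ~= current 3080 eBay price + 10%.
ebay_price_3080 = 1300        # hypothetical current eBay going rate, USD
performance_premium = 0.10    # +10% for the generational performance gain

estimate = ebay_price_3080 * (1 + performance_premium)
print(f"Estimated 4080-class street price: ${estimate:,.0f}")  # -> $1,430
```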

I don't think MSRPs are going to come down from "Scalpocalypse" levels any time soon. Not until GPU crypto mining is a thing of the past, anyway.
 

moonbogg

Lifer
Jan 8, 2011
I'm pretty sure miners and scalpers everywhere will see the hashrate and completely lose their minds and just panic buy all the cards. It's going to be a launch disaster. I don't think mining going away will even be enough. The demand for GPUs is so insane it's a genuine mystery to me. How are so many people tripping over each other to buy a gaming card for $1600?
 

Aapje

Golden Member
Mar 21, 2022
Pricing is extremely dependent on supply. For example, let's imagine a simplified situation where each company sells only one GPU. AMD would like to sell its GPU for $1000, while Nvidia would like to sell a GPU that performs the same for $1200. And suppose that around that price point there is demand for 20 million units, while AMD and Nvidia can each produce at most 12 million units.

If both companies then sold their GPUs at their desired prices, the AMD GPU would be much better value and everyone should prefer to buy the one from AMD (assuming no fanboys or other irrational behavior). Yet AMD can only produce 12 million GPUs, so they would actually have to turn away 8 million customers, and would get bad press because of it!

So they are much better off raising their prices to a point where at least 8 million people buy the Nvidia GPU. That means more profit for AMD, without any lost sales, since they couldn't produce those 8 million units anyway.

Nvidia has the opposite problem: if they price their GPU at $1200, everyone would prefer the AMD GPU. Even if AMD can't satisfy that demand, the people forced to buy an Nvidia card because they couldn't get an AMD one would be unhappy.

So the end result is that AMD is encouraged to raise their price and Nvidia to lower their price, until the price/performance is somewhat similar. However, the more units AMD can produce, the more they are encouraged to lower their price further, to steal more market share from Nvidia.

So the price depends not just on production costs, but also on the production capacity of the manufacturers, and that is quite uncertain.
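To make the incentive concrete, here's a toy simulation of that scenario in Python. It's a sketch only: the numbers are the hypothetical ones above, and the "buyers take the better value first, overflow spills to the other vendor" rule is a simplifying assumption, not a real market model:

```python
# Toy model of the hypothetical above: equal performance, 20M total demand,
# 12M units of capacity per vendor, Nvidia holding at $1200.
def split_demand(demand, amd_price, nv_price, amd_cap, nv_cap):
    """Buyers take the cheaper card first (ties go to AMD here);
    the overflow spills to the other vendor up to its capacity."""
    if amd_price <= nv_price:
        amd_sold = min(demand, amd_cap)
        nv_sold = min(demand - amd_sold, nv_cap)
    else:
        nv_sold = min(demand, nv_cap)
        amd_sold = min(demand - nv_sold, amd_cap)
    return amd_sold, nv_sold

demand, amd_cap, nv_cap = 20_000_000, 12_000_000, 12_000_000
for amd_price in (1000, 1100, 1199):
    amd_sold, nv_sold = split_demand(demand, amd_price, 1200, amd_cap, nv_cap)
    print(f"AMD @ ${amd_price}: sells {amd_sold / 1e6:.0f}M units "
          f"(${amd_sold * amd_price / 1e9:.1f}B revenue); "
          f"Nvidia sells {nv_sold / 1e6:.0f}M")
```

AMD moves the same 12 million units at every price below Nvidia's $1200, so raising the price from $1000 to $1199 adds about $2.4 billion of revenue without losing a single sale.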

Also, in reality there is a range of GPUs, and competing GPUs don't perform exactly the same. At the top end, there are a lot of people who want the best of the best, so the manufacturer that makes the best GPU can ask for a lot more than would otherwise be feasible. And with Intel entering the market, we may see more supply at the low end than at the top end, resulting in cheaper low-end prices.

Of course, all of this is assuming that the shortages end, because otherwise the prices depend mostly on how bad the shortages are and the relative performance of the cards.
 

lakedude

Platinum Member
Mar 14, 2009
I'm revising my opinion. At launch, the 4000 series will be vaporware, like a lot of products have been. The 4000 series should eventually cause prices for the 3000 series to settle down, but yeah, in the short term, prices on the 4000 series will be high with limited availability.

But PC gaming is not dead, just a bit more expensive than we would prefer.

Moore's law has us spoiled, expecting better, cheaper products. Now the better products might not be so cheap.

We are still spoiled regarding CPUs with prices falling well below MSRP now that Intel is back in the game.
 

moonbogg

Lifer
Jan 8, 2011
I don't think there is a GPU shortage. Nvidia has bragged about selling more cards than ever before, with record-breaking margins. Where did all the cards go, then? To miners, of course. For gamers there is a GPU shortage, because another consumer base has bought them all. Miners have slowed down on their purchases, but retailers are hesitant to lower prices below those sweet scalper levels. I think the 3000 series will finally end up in gamers' hands while the 4000 series remains totally irrelevant due to hilarious prices, scalpers, and yet more miners.
 

VirtualLarry

No Lifer
Aug 25, 2001
but retailers are hesitant to lower prices below those sweet scalper levels.
... for NVidia cards. AMD cards, not so much.

The RX 6500 XT is selling for the equivalent of $139 USD (before VAT) in Germany, and on Newegg the RX 6700 XT is $599 or lower and the RX 6600 is $379.

But why is the $250-MSRP RTX 3050 still selling for $400-500? That kind of boggles my mind.

I did manage to get the EVGA XC GAMING RTX 3050 for $329 + tax, in a bundle with an $80 non-discounted PSU.
 

moonbogg

Lifer
Jan 8, 2011
... for NVidia cards. AMD cards, not so much.

The RX 6500 XT is selling for the equivalent of $139 USD (before VAT) in Germany, and on Newegg the RX 6700 XT is $599 or lower and the RX 6600 is $379.

But why is the $250-MSRP RTX 3050 still selling for $400-500? That kind of boggles my mind.

I did manage to get the EVGA XC GAMING RTX 3050 for $329 + tax, in a bundle with an $80 non-discounted PSU.

I think people distrust AMD GPUs. I know I do. I don't care about ray tracing, but it's a trust and reliability issue. I have this feeling that they won't work as well with VR, that there will be all kinds of odd stuttering and compatibility issues. I read some VR GPU benchmarks and AMD cards had all kinds of bad frametimes, hitches, stutters, etc. If I'm spending half of my children's college tuition on a gaming card, it had better at least work.
When it comes to AMD drivers, I get this feeling that the entire driver team sits together in a mobile trailer extension thingy, maybe like 8 guys, and if a driver is janky but kind of works, they just call it good enough and go home.
 

ultimatebob

Lifer
Jul 1, 2001
I'm pretty sure miners and scalpers everywhere will see the hashrate and completely lose their minds and just panic buy all the cards. It's going to be a launch disaster. I don't think mining going away will even be enough. The demand for GPUs is so insane it's a genuine mystery to me. How are so many people tripping over each other to buy a gaming card for $1600?

I'd hope that the new 4000 series Nvidia cards are low-hash-rate like the newer 3000 series cards, so they suck at mining.

Of course, they'll probably have a "mining optimized" SKU as well, because they also like printing money.
 

CP5670

Diamond Member
Jun 24, 2004
I want to try out an AMD card sometime, but in a different PC than my main gaming box. I have too many old games at this point that I've set up around the Nvidia drivers and third-party tools, and I don't want to spend time getting it all working with AMD now.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
I think people distrust AMD GPUs. I know I do. I don't care about ray tracing, but it's a trust and reliability issue. I have this feeling that they won't work as well with VR, that there will be all kinds of odd stuttering and compatibility issues. I read some VR GPU benchmarks and AMD cards had all kinds of bad frametimes, hitches, stutters, etc. If I'm spending half of my children's college tuition on a gaming card, it had better at least work.
When it comes to AMD drivers, I get this feeling that the entire driver team sits together in a mobile trailer extension thingy, maybe like 8 guys, and if a driver is janky but kind of works, they just call it good enough and go home.

- Yep. AMD can't just have one good showing; they need at least 3 generations of competitive performance and software before people start getting on board. RDNA 2 is their first "drama-free" generation. Nvidia certainly has their issues, but they always hold the performance crown at all costs and tend to do a much better job with ancillary graphics tech (VR, for example) than the smaller AMD, which tends to put all its resources into just getting its basic driver stack working right.

RDNA 1 had the driver complaints; Polaris/Vega had reasonably stable drivers but weren't performance-competitive and had a litany of broken feature promises due to faulty hardware; the Fury/R9 3xx series was just crushed by Maxwell in every conceivable way (and the 3xx series was a rebadge of the 2xx series, which was a rebadge of the 7xxx series).

The last time AMD was solidly competitive with NV across the board was the 7xxx/2xx series, which was *checks watch* 8 or 9 years ago?

It's going to take some time to shake off the value-brand image, and even that will be somewhat dependent on whether AMD remains competitive in the CPU space and doesn't have another Bulldozer decade.
 

Frenetic Pony

Senior member
May 1, 2012
I'll take a stab at it, going by tiers:

Top end (2x a 3090 or so): $2k+. This is the "I'm so rich I (almost) don't care" category, and/or studio stuff where corporate will be footing the bill. So just a vague guess here; lots of RAM.
High-end mainstream: $1-1.2k. Quite a bit faster than a 3090 Ti, but maybe 25% slower than the true top end, with less RAM. For all the people who want almost the most gaming power but can't quite hack that exorbitant price.

"X9X+" tier: $700. Faster than the fastest cards out now, but barely, and at half even the ideal cost of today's top-end cards.
"X8X" tier: $449. Yeah, 3080-or-whatever level performance for the midrange. Woot woot.
"X7X" tier: $339. Great performance at a mainstream cost.
"X6X" tier: $259. Console-level performance at a truly mainstream price; the triumphant return of the 580/1060 days.

This is assuming, of course, that the crypto zombie doesn't rise from the grave again, that Intel can actually push out enough cards to create some competition in that mid zone, and that the chip shortage really is easing for stuff like GPUs. So a fairly optimistic scenario all around, but hey, let's have some optimism.
 

eek2121

Diamond Member
Aug 2, 2005
OK, I'll put us back on topic. I think that the "4080", when it comes out, will have an MSRP of $1,699. You're not going to be able to buy one at that price right away, though.

When I buy mine, I'll have to buy it with Dogecoin just to annoy Moonbogg :)

My official guess is that $799-$899 will be the retail price of an RTX 4080 equivalent.
 

biostud

Lifer
Feb 27, 2003
It will be interesting to see how much Intel entering the video card market will disrupt pricing. They are probably not going to compete with the top cards, but if they can help keep midrange prices down, the price gap to the top will hopefully be kept down as well.
 

Aapje

Golden Member
Mar 21, 2022
I expect the Intel drivers to be a complete shit-show, and many buyers will regret getting one. But those extra cards will satisfy some of the demand.

I hope that most of the Intel chips will go to mobile, freeing up 30x0 chips for desktop cards.
 

moonbogg

Lifer
Jan 8, 2011
It also matters who launches first. If Nvidia launches before AMD and Intel, then they can just charge whatever lunatic price they want and deal with being competitive later, right?
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
From some rumors, it seems like AMD may have the fastest cards this time around. I guess time will tell.
 

Aapje

Golden Member
Mar 21, 2022
Intel isn't going to compete for the top end, and it's very doubtful that they can produce particularly cheaply, so Intel is going to adapt their prices to the competition, not vice versa.

And Intel planned for Q1, so I doubt they will launch later than the others. AMD just released a refresh, and given that they made a fully new architecture, I see Q4 as likely. Nvidia is probably going to beat them a little by launching in Q3, unless their cards have huge QA issues (which is possible, given the estimated power draw, which makes cooling hard).
 

GodisanAtheist

Diamond Member
Nov 16, 2006
No one knows the future, but my crystal ball says that Intel is going to bounce off the GPU market hard, and I don't anticipate they'll ever make a meaningful dent in the discrete GPU space.

They're launching mid-tier parts late in the current cycle, parts that will very quickly become entry-level as the next cycle comes around. What they are launching is absurdly low volume, and it will have a very hard time picking up mindshare outside the enthusiast/hobbyist segment of the market, which will pick the GPUs up as curios/collectibles while using NV/AMD as their daily driver.

The current players are deeply entrenched, have soaked up basically any competent software dev talent in the field, and have a much larger % of their business dedicated to GPUs than Intel does (and thus will fight much harder to keep it than I suspect Intel is willing to commit to get it). Intel also has a terrible habit of dumping ancillary projects/offerings if they are not profitable within a handful of business cycles, and their GPU division does not appear to be ramping up.
 

moonbogg

Lifer
Jan 8, 2011
I think the Intel cards would be best for mass-market PCs, where they can have all-Intel stickers on them and claim high-performance graphics that go well beyond integrated graphics. I always expected Intel's venture into GPUs to really be a venture into more Intel stickers and nothing more. They will probably suck and have no impact on GPU pricing.
 

ultimatebob

Lifer
Jul 1, 2001
No one knows the future, but my crystal ball says that Intel is going to bounce off the GPU market hard, and I don't anticipate they'll ever make a meaningful dent in the discrete GPU space.

They're launching mid-tier parts late in the current cycle, parts that will very quickly become entry-level as the next cycle comes around. What they are launching is absurdly low volume, and it will have a very hard time picking up mindshare outside the enthusiast/hobbyist segment of the market, which will pick the GPUs up as curios/collectibles while using NV/AMD as their daily driver.

The current players are deeply entrenched, have soaked up basically any competent software dev talent in the field, and have a much larger % of their business dedicated to GPUs than Intel does (and thus will fight much harder to keep it than I suspect Intel is willing to commit to get it). Intel also has a terrible habit of dumping ancillary projects/offerings if they are not profitable within a handful of business cycles, and their GPU division does not appear to be ramping up.

Yeah, I fear that Intel's launch is going to end up looking like their last attempt at a discrete graphics card back in 1998. By the time they get it out in volume, almost everything else on the market will outclass it.

I actually had an Intel i740 in my work system back then. It wasn't terrible, but a 3dfx Voodoo 2 could wipe the floor with it.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
I think the Intel cards would be best for mass-market PCs, where they can have all-Intel stickers on them and claim high-performance graphics that go well beyond integrated graphics. I always expected Intel's venture into GPUs to really be a venture into more Intel stickers and nothing more. They will probably suck and have no impact on GPU pricing.

- Yeah, at *best* I figure Intel GPUs will live on as OEM add-ins for the "All Intel" pre-built boxes. Everyone keeps chasing Apple's walled garden, the darling of investors everywhere.

Intel really missed the boat to be a big player in the GPU space back in 1998/99, when they aborted their initial (terrible) forays into the 3D accelerator space and the market basically matured and left them behind.

If it was tough for Intel to break in way back then, the challenge is basically insurmountable at this point.
 

Aapje

Golden Member
Mar 21, 2022
@GodisanAtheist

Intel claims to be able to ship 4 million this year, which is not absurdly low volume if they actually achieve that. It seems like a decent base volume as a practice run, as the intent was always for the next generations to become competitive.

The current players are deeply entrenched, have soaked up basically any competent software dev talent in the field,

And Intel has been poaching that talent. Also, Intel has been quite good at making deals with system builders, selling huge quantities of their CPUs to them in long-term contracts, even when AMD had the better CPUs. They are well positioned to push their graphics cards into Dells and such, if they are halfway decent.

Intel also has a terrible habit of dumping ancillary projects/offerings if they are not profitable within a handful of business cycles, and their GPU division does not appear to be ramping up.

They explicitly say that they are only now trying to go for the prosumer market, rather than for iGPU solutions. Your claim seems to be that they've failed at something that they've not even been trying to do so far, which makes no sense. Of course Intel didn't manage to make inroads into the market for discrete GPUs when they were only selling iGPUs. That doesn't prove that they will fail if they actually try.

And I think that you fundamentally misunderstand Intel's strategy and why this is important to them. Their plan is to offer a portfolio of dies that can relatively quickly be combined into a multi-die chip and then to produce those chips. So companies like Google, Amazon, Tesla, etc. will be able to order custom chips for general processing, AI, graphics, etc., or a specific combination of tasks, by picking dies from the portfolio and adding them together for a multi-die chip that matches their needs extremely well.

GPU dies have proven to be quite good for non-graphics tasks as well, so this is an important part of that portfolio. I don't see them abandoning this unless they abandon their whole strategy, and they are spending huge money on implementing it. It's not just a whim.
 

moonbogg

Lifer
Jan 8, 2011
@GodisanAtheist

Intel claims to be able to ship 4 million this year, which is not absurdly low volume if they actually achieve that. It seems like a decent base volume as a practice run, as the intent was always for the next generations to become competitive.



And Intel has been poaching that talent. Also, Intel has been quite good at making deals with system builders, selling huge quantities of their CPUs to them in long-term contracts, even when AMD had the better CPUs. They are well positioned to push their graphics cards into Dells and such, if they are halfway decent.

They explicitly say that they are only now trying to go for the prosumer market, rather than for iGPU solutions. Your claim seems to be that they've failed at something that they've not even been trying to do so far, which makes no sense. Of course Intel didn't manage to make inroads into the market for discrete GPUs when they were only selling iGPUs. That doesn't prove that they will fail if they actually try.

And I think that you fundamentally misunderstand Intel's strategy and why this is important to them. Their plan is to offer a portfolio of dies that can relatively quickly be combined into a multi-die chip and then to produce those chips. So companies like Google, Amazon, Tesla, etc. will be able to order custom chips for general processing, AI, graphics, etc., or a specific combination of tasks, by picking dies from the portfolio and adding them together for a multi-die chip that matches their needs extremely well.

GPU dies have proven to be quite good for non-graphics tasks as well, so this is an important part of that portfolio. I don't see them abandoning this unless they abandon their whole strategy, and they are spending huge money on implementing it. It's not just a whim.

Humans will evolve back into fish before Intel graphics do anything but suck.
 
Mar 11, 2004
@GodisanAtheist

Intel claims to be able to ship 4 million this year, which is not absurdly low volume if they actually achieve that. It seems like a decent base volume as a practice run, as the intent was always for the next generations to become competitive.



And Intel has been poaching that talent. Also, Intel has been quite good at making deals with system builders, selling huge quantities of their CPUs to them in long-term contracts, even when AMD had the better CPUs. They are well positioned to push their graphics cards into Dells and such, if they are halfway decent.

They explicitly say that they are only now trying to go for the prosumer market, rather than for iGPU solutions. Your claim seems to be that they've failed at something that they've not even been trying to do so far, which makes no sense. Of course Intel didn't manage to make inroads into the market for discrete GPUs when they were only selling iGPUs. That doesn't prove that they will fail if they actually try.

And I think that you fundamentally misunderstand Intel's strategy and why this is important to them. Their plan is to offer a portfolio of dies that can relatively quickly be combined into a multi-die chip and then to produce those chips. So companies like Google, Amazon, Tesla, etc. will be able to order custom chips for general processing, AI, graphics, etc., or a specific combination of tasks, by picking dies from the portfolio and adding them together for a multi-die chip that matches their needs extremely well.

GPU dies have proven to be quite good for non-graphics tasks as well, so this is an important part of that portfolio. I don't see them abandoning this unless they abandon their whole strategy, and they are spending huge money on implementing it. It's not just a whim.

4 million of what? If it's 3.75 million of the small GPU they've already been shipping, getting stuffed into low-end laptops so they can claim to have a dGPU, then who cares; that's not going to change anything.

I'd actually guess that a lot of that talent was poached from Intel, unless you're speaking purely about graphics-rendering talent. But even then, Intel has had a ton of those people for years; management just didn't have them doing anything for Intel. Certainly Intel paying companies, or offering similarly impossible-to-resist deals to OEMs, can't be ignored, especially in light of the EU letting them off the hook for the relatively paltry fine they were supposed to pay for that stuff from 15+ years ago.

They've explicitly said they were going to do that multiple times in the past 30 years. It's failed every single time, because Intel thinks they're gonna reinvent the wheel. I'd say this time it's different because they're just making a pretty traditional GPU (frankly, it sure sounds like this is Vega 3...), and they clearly have the desire to throw whatever business ideals they like at it. But that won't magically make it work, and frankly, Raja doesn't have a good track record of making it work. Plus, the argument that they're doing it for pro/HPC/etc. markets is going to blow that up, since those markets are ditching the rendering parts of GPUs. How's Intel gonna spin Raja's vision when their consumer graphics still suck and it's also limiting their other markets? They're gonna ditch GPUs again, make whatever the HPC markets that are their bread and butter want, and go back to making small iGPU-esque stuff for the OEMs.

Google, Amazon, Tesla, etc. are designing their own hardware. I expect future entertainment/media stuff, like what Tesla puts in the Model S, is going to look much more like consoles. And Intel doesn't have a good track record of working with others. Their attempts at a third-party foundry service? How's that gone? Intel burns their partners, maybe not as much as Nvidia, but there's a reason AMD was in every console other than the one that went ARM. Intel relies on anti-competitive practices that gave them near-monopoly power and a treasure chest of money to withstand the times when they actually face competition, which is how they maintained the status quo for decades.

That's somewhat true, but that is changing as well. HPC markets are ditching much of the G part of the GPU because they have little to no use for it, so it's just limiting them. And many of the other parts, like video processing, are basically already handled by dedicated processing blocks (and if that changes, it's much more likely to go to an FPGA that can be reconfigured).

That's not to say that graphics alone isn't enough. It absolutely is, and it will become even more important as we transition to even more integration of GUIs and things like that for AR/VR. But Intel will have to actually provide good graphics performance to be successful there. Intel always spends huge money. They spent billions putting LEDs in clothes. How's that gone? Oh, they gave up on it? They spent huge money on 10nm, too.
 