Question: Are video card prices headed down yet?

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
Merely 125% on a 43" TV. As I wrote, there really isn't a rational explanation.

I do agree 200% would be a bit pointless.

I have a 28" 4K & 27" 1440p and I use 125% scaling on the 4K, my dad uses 150% on his 32" 4K panel. I found 32" to take up a little too much real estate for comfortable use.
 

jpiniero

Lifer
Oct 1, 2010
14,597
5,215
136
I read TSMC's earnings report, and they only talk about N7/N6 having any sort of demand weakness. I didn't get the impression that they are planning N7/N6 price cuts, but I suppose it could happen. It definitely wouldn't apply to N5/N4, though.

Also, the Arizona fabs cost a lot more to construct, so it sounded like wafers fabbed there will cost more.
 

ChiefBigFeather

Junior Member
Jul 15, 2018
24
9
81
I suspect this brings a long-term development to its conclusion. Margins have been way bigger in the professional space, and that sector has been growing for some time. At the same time, with mobile and consoles growing, the DIY market shrank. The mining craze showed the companies that they were really missing out in the midrange department. AMD was not really able to grow market share on better price/performance anyway. With the death of Moore's Law, you don't have to fear massive, punishing slides in price/performance anymore.

Bottom line: from the perspective of the companies, why shouldn't they milk the DIY space more? The shills will buy anyway, and the normies who buy every 4-8 years didn't bring in much money anyway.

I hate it too, but I think from the companies' perspective it makes sense. On the other hand, AAA games mostly suck anyway. If the generational gains of GPUs slow as much as CPUs' have, while there are fewer and fewer games that make higher-end GPUs desirable, I'm happy to stretch my upgrade cycle from every 6 years to every 10.
 
  • Like
Reactions: IEC and DAPUNISHER

biostud

Lifer
Feb 27, 2003
18,250
4,762
136
What I'm wondering is this: the current generation hasn't really improved the price/performance ratio vs. the last generation (except for the xx90 or x900 class cards), so what will happen next generation? There must be an upper limit to what they can sell video cards for, and if these prices are going to be the "new normal", then the next generation will have to bring a better price/performance ratio. Or do you think they will have to lower the prices of the current gen at some point?
 
  • Like
Reactions: Ranulf

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
Right now they are still benefiting from people who postponed their upgrade due to mining-boom prices and decided to wait for the new generation once that ended, but for whom Ampere price/performance was already acceptable or good. There are also people willing to pay a lot for more performance than Ampere could offer, who go for the 4090 and 4080.

Yet I have my doubts about the size of these groups. So I foresee fairly low sales if they keep this up, especially in the second year of this generation (where they traditionally do refreshes that improve price/performance a bit to boost demand), although I expect sales to become quite low already going into the second half of this year.

Keep in mind that if they keep very similar price/performance for generation after generation, sales won't just stagnate, but go down further, as fewer and fewer buyers will then still have a card from before the stagnation. Eventually nearly everyone will just keep their card until it breaks or switch to really long upgrade cycles.

Ultimately it is hard to imagine things staying this way. Nvidia and AMD basically have to accept really low sales, even though it's now much easier to get wafers. It also becomes really easy for a competitor like Intel to disrupt the market: with stagnant price/performance they don't have to innovate much, just create a design optimized for value and accept low margins, which they can afford if they keep selling the chip for many more years than just two. Since price/performance is stagnant, there is no reason to release a new product every 2 years.
 
Last edited:

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
Anyway, my observation, which I shared before, is that Nvidia seems to react to the sales of the previous gen. The 1000-series was a great improvement in price/perf and sold like gangbusters, so they provided a poor increase in price/perf for the 2000-series. Then they seemed to aim for a very good price/perf increase for the 3000-series, as evidenced by the 3080's price/performance. Yet that generation sold outrageously well due to the mining boom. So the 4000-series then got an extremely poor price/perf improvement for the most part.

So my expectation is that Nvidia will overcompensate again for the 5000-series, although higher wafer prices will restrict how good a deal they can give.
 
  • Like
Reactions: scineram

biostud

Lifer
Feb 27, 2003
18,250
4,762
136
Anyway, my observation, which I shared before, is that Nvidia seems to react to the sales of the previous gen. The 1000-series was a great improvement in price/perf and sold like gangbusters, so they provided a poor increase in price/perf for the 2000-series. Then they seemed to aim for a very good price/perf increase for the 3000-series, as evidenced by the 3080's price/performance. Yet that generation sold outrageously well due to the mining boom. So the 4000-series then got an extremely poor price/perf improvement for the most part.

So my expectation is that Nvidia will overcompensate again for the 5000-series, although higher wafer prices will restrict how good a deal they can give.
It does seem like every 2nd generation brings better price/performance. The 2xxx series got a new feature (RTX/DLSS), and the 4xxx series just got more performance without a better price/performance ratio.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
It does seem like every 2nd generation brings better price/performance. The 2xxx series got a new feature (RTX/DLSS), and the 4xxx series just got more performance without a better price/performance ratio.

The 4000-series gives us a new feature as well, in frame generation, so it actually has better price/perf than any generation before it!!!!111

The sad part is that Jensen might actually believe that.

The biggest reason I can see for the 5000-series not being at least a good value is if they have to make a huge price adjustment during this generation and overdo it early. However, I predict the opposite: they will gradually lower prices and stop once sales are merely mediocre.
 
  • Like
Reactions: biostud

biostud

Lifer
Feb 27, 2003
18,250
4,762
136
The 4000-series gives us a new feature as well, in frame generation, so it actually has better price/perf than any generation before it!!!!111

The sad part is that Jensen might actually believe that.

The biggest reason I can see for the 5000-series not being at least a good value is if they have to make a huge price adjustment during this generation and overdo it early. However, I predict the opposite: they will gradually lower prices and stop once sales are merely mediocre.
I just upgraded my video card, so I'll just follow from the sidelines for the next 4 years, until a new upgrade is imminent :p
 

ChiefBigFeather

Junior Member
Jul 15, 2018
24
9
81
What I'm wondering is this: the current generation hasn't really improved the price/performance ratio vs. the last generation (except for the xx90 or x900 class cards), so what will happen next generation? There must be an upper limit to what they can sell video cards for, and if these prices are going to be the "new normal", then the next generation will have to bring a better price/performance ratio. Or do you think they will have to lower the prices of the current gen at some point?
I think the margins will mostly stay this way. Next gen will bring some price/perf uplift, but it will probably be some boring small percentage like in the CPU space.
 

jpiniero

Lifer
Oct 1, 2010
14,597
5,215
136
Keep in mind that if they keep very similar price/performance for generation after generation, sales won't just stagnate, but go down further, as fewer and fewer buyers will then still have a card from before the stagnation. Eventually nearly everyone will just keep their card until it breaks or switch to really long upgrade cycles.

That's what happens with Moore's Law being dead.

I'm hopeful that GDDR7 will help (help Ada's performance at least) a bit... but it's just tough to say how much at this point & whether that would be enough to move people to buy. Assuming that the prices don't change.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
That's what happens with Moore's Law being dead.

I'm hopeful that GDDR7 will help (help Ada's performance at least) a bit

But Ada already got a huge boost per mm2 compared to Ampere. And the current price increases are simply unexplainable by stagnating price/perf of the nodes. The highest estimate of the difference per mm2 between Samsung and TSMC is 2x the cost, but the 4080 chip is much smaller than the 3080 chip, so you can't explain the price increase of 70% even if the entire BOM cost of the 4080 is determined by the cost of the chip, which it of course isn't.

Price gouging is the only thing that makes sense.
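To make that arithmetic concrete, here's a minimal sketch. The die sizes and launch MSRPs are public figures; the 2x cost-per-mm2 multiplier is the "highest estimate" mentioned above, so this is an upper bound on the silicon cost gap, not a measured cost.

```python
# Die sizes and launch MSRPs are public; 2x cost per mm2 is the highest
# estimate of the TSMC 4N vs Samsung 8N gap cited in this thread.
GA102_MM2, AD103_MM2 = 628, 379   # 3080 die (Samsung 8N) vs 4080 die (TSMC 4N)
MSRP_3080, MSRP_4080 = 699, 1199  # launch prices, USD

silicon_cost_ratio = (AD103_MM2 / GA102_MM2) * 2.0   # assume 2x cost per mm2
price_ratio = MSRP_4080 / MSRP_3080

print(f"Implied silicon cost ratio: {silicon_cost_ratio:.2f}x")  # ~1.21x
print(f"Actual card price ratio:    {price_ratio:.2f}x")         # ~1.72x
# Even if the GPU die were 100% of the card's BOM (it isn't), a ~21% silicon
# cost increase cannot produce a ~72% price increase.
```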
 

jpiniero

Lifer
Oct 1, 2010
14,597
5,215
136
But Ada already got a huge boost per mm2 compared to Ampere. And the current price increases are simply unexplainable by stagnating price/perf of the nodes. The highest estimate of the difference per mm2 between Samsung and TSMC is 2x the cost, but the 4080 chip is much smaller than the 3080 chip, so you can't explain the price increase of 70% even if the entire BOM cost of the 4080 is determined by the cost of the chip, which it of course isn't.

The 3080 was extremely far cut though. You really should compare the 4080 to GA103. Still, they could stand to cut the price a bit. But we're talking more like $999.
 

Aapje

Golden Member
Mar 21, 2022
1,382
1,864
106
GPU price follows cryptocurrency prices (mostly BTC and ETH).

No longer. Mining is mostly irrelevant now.

The 3080 was extremely far cut though. You really should compare the 4080 to GA103. Still, they could stand to cut the price a bit. But we're talking more like $999.

The cut doesn't change the wafer cost of the chip. Nvidia should have significantly higher yields with TSMC anyway, and if you think they are throwing away a significant number of AD103 chips, then I have a bridge to sell you.

I think that if Nvidia were to price the 4080 as aggressively as the 3080, it would end up between $799 and $899.

Note that you seem to agree that they are price gouging by $200, which is quite significant.
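For a rough sense of what an AD103 actually costs to make, here's a sketch using the standard gross-dies-per-wafer estimate and a simple Poisson yield model. Both wafer prices and the defect density are hypothetical, since neither is public.

```python
import math

def dies_per_wafer(die_mm2, wafer_d_mm=300.0):
    """Classic gross-die estimate: wafer area / die area, minus edge loss."""
    r = wafer_d_mm / 2
    return int(math.pi * r * r / die_mm2
               - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

def cost_per_good_die(die_mm2, wafer_price_usd, d0_per_cm2=0.1):
    """Poisson yield model: yield = exp(-die_area_cm2 * defect_density)."""
    die_yield = math.exp(-(die_mm2 / 100.0) * d0_per_cm2)
    return wafer_price_usd / (dies_per_wafer(die_mm2) * die_yield)

# Hypothetical wafer prices -- TSMC's actual 4N pricing is not public.
for price in (12_000, 17_000):
    print(f"AD103 (379 mm2) at ${price:,}/wafer: "
          f"~${cost_per_good_die(379, price):.0f} per good die")
# Note: "failed" dies mostly aren't discarded; partially defective AD103s
# are salvaged as cut-down SKUs, so the effective cost per usable die is
# lower than this already-modest number.
```

Even at the high end of these assumptions, the chip comes out well under $200, a small fraction of a $1,199 card.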
 

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
That's what happens with Moore's Law being dead.

I'm hopeful that GDDR7 will help (help Ada's performance at least) a bit... but it's just tough to say how much at this point & whether that would be enough to move people to buy. Assuming that the prices don't change.
Wow, you're making a believer out of me. I swear I see some channeling of Jensen Huang here.

So what do we have to do to forestall its death?

1) The "it's either alive or dead" framing is trivial nonsense. Old age and decline? Yes. But this is not a binary choice.
2) From slowest scaling to fastest: IO, cache, logic.
3) Disaggregate IO and cache, plus use density-optimized libraries to further enhance scaling.
4) Use an advanced node to continue logic scaling.

Cache is roughly half the area of a monolithic die, and it can effectively keep scaling if you do as the V-cache tech showed and move it off the main die, reducing your leading-edge footprint by ~50%. By using this, we do have an extension of good old Moore. Are fewer mask steps for a cache die an additional factor working to extend Moore's Law? Fabbing cost and all that. Do we ignore the fact that a wafer has to go through the number of steps required by its most complex sub-units, even if certain circuitry on a monolithic die doesn't require them?
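A toy model of points 2-4 above, with purely illustrative shrink factors (the near-nil SRAM scaling on recent nodes is the whole motivation; no real product's numbers are used here):

```python
# Toy model of disaggregation: all shrink factors are illustrative assumptions.
LOGIC_MM2, CACHE_MM2 = 150.0, 150.0  # monolithic die, ~1/2 cache as noted above
LOGIC_SHRINK = 0.6                   # logic still scales well on a new node
SRAM_SHRINK  = 0.95                  # SRAM barely scales on recent nodes

monolithic = LOGIC_MM2 * LOGIC_SHRINK + CACHE_MM2 * SRAM_SHRINK
logic_only = LOGIC_MM2 * LOGIC_SHRINK  # cache moved to a mature/stacked die

print(f"Monolithic on the new node:  {monolithic:.0f} mm2")  # ~233 mm2
print(f"Logic-only leading-edge die: {logic_only:.0f} mm2")  # ~90 mm2
# The cache lands on a cheap node with fewer mask steps, so the expensive
# leading-edge area drops by ~60% even though total silicon barely changes.
```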

I won't even pursue your elevated wafer cost claims, which somehow have no supporting data and are directly contradicted by a few posters here who claim insider knowledge.
 

coercitiv

Diamond Member
Jan 24, 2014
6,201
11,902
136
The 3080 was extremely far cut though. You really should compare the 4080 to GA103.
The high sales volume of the 3080 is diametrically opposed to your argument, since moving that many heavily cut dies would imply big manufacturing losses due to poor yields. But for the sake of argument, let's look further down the stack:

3060 12GB ~ $330 for a card using a 276mm2 die.
4070Ti 12GB ~ $800 for a card using a 295mm2 die.

Even if we double every manufacturing cost, we're still about $140 short of the price Nvidia is asking, AFTER they unlaunched the card to sell it cheaper. And remember, doubling everything means even the screws in Nvidia's cards would be nearly 100% more expensive.
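Spelling that comparison out (prices and die sizes are from this post; "double every cost" is the deliberately generous assumption being tested):

```python
# Prices and die sizes from the post above; "double every cost" is the
# deliberately generous assumption being stress-tested.
price_3060,   die_3060   = 330, 276   # 3060 12GB, Samsung 8N, USD / mm2
price_4070ti, die_4070ti = 800, 295   # 4070 Ti 12GB, TSMC 4N, USD / mm2

area_delta = (die_4070ti / die_3060 - 1) * 100
gap = price_4070ti - 2 * price_3060

print(f"Die area difference: ~{area_delta:.0f}%")          # ~7%
print(f"Gap left after doubling ALL 3060 costs: ${gap}")   # $140
```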

Cache ~= 1/2 area of monolithic, can be scaled if you do as the V-cache tech showed and reduce your footprint by 50%.
For anyone who has yet to stumble on this talk by Jim Keller, I strongly recommend watching it.


To summarize what Keller is trying to tell people there: a chipmaker that believes high cost increases are unavoidable will execute exactly on this path.
 

jpiniero

Lifer
Oct 1, 2010
14,597
5,215
136
3060 12GB ~ $330 for a card using a 276mm2 die.
4070Ti 12GB ~ $800 for a card using a 295mm2 die.

Even if we double every manufacturing cost, we're still about $140 short of the price Nvidia is asking, AFTER they unlaunched the card to sell it cheaper. And remember, doubling everything means even the screws in Nvidia's cards would be nearly 100% more expensive.

The wafer price difference between SS8 and 4N is likely more than double. After all, they are getting a 2.7x gain in transistor count between the two. You can't compare die sizes anymore as a proxy for relative cost.
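In numbers (both ratios are estimates from this discussion, not published figures):

```python
# Both ratios are estimates from this thread, not published figures.
wafer_price_ratio = 2.2   # assumed: a 4N wafer costs ~2.2x an SS8 wafer
density_gain      = 2.7   # transistors per mm2, 4N vs SS8, per the post

print(f"Cost per transistor, 4N vs SS8: {wafer_price_ratio / density_gain:.2f}x")
# ~0.81x: equal-area dies cost ~2.2x more but hold 2.7x the transistors,
# which is why raw die size no longer tracks relative cost.
```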

Edit: I do think nVidia is going back to Samsung for the next gaming chip. Problem is, I don't know if they can really do that much better than Ada there. It should at least be cheaper, though.
 
Last edited:

Heartbreaker

Diamond Member
Apr 3, 2006
4,227
5,228
136
The wafer price difference between SS8 and 4N is likely more than double. After all, they are getting 2.7x gain in transistor count between the two. You can't compare die sizes anymore as a relative cost.

Nvidia has been with TSMC nearly for life. That one diversion to Samsung was likely because they were given a wafer pricing offer they couldn't refuse, and it came with a big drop in prices.

Now they are back on TSMC and the pricing is meh...
 

biostud

Lifer
Feb 27, 2003
18,250
4,762
136
Bottom line: There is a severe lack of competition in the GPU/wafer production market.
 

amenx

Diamond Member
Dec 17, 2004
3,899
2,117
136
But Ada already got a huge boost per mm2 compared to Ampere. And the current price increases are simply unexplainable by stagnating price/perf of the nodes. The highest estimate of the difference per mm2 between Samsung and TSMC is 2x the cost, but the 4080 chip is much smaller than the 3080 chip, so you can't explain the price increase of 70% even if the entire BOM cost of the 4080 is determined by the cost of the chip, which it of course isn't.

Price gouging is the only thing that makes sense.
Add to that the elephant in the room that so many seem to overlook: 4090 pricing vs the 3090, a very modest 7% increase. Yet that damn 4080 with the smaller chip goes up 70% (vs the large-die 3080).
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,066
136
www.teamjuchems.com
RX 5600 XT (ASRock Challenger), "New" @ Newegg (shipped and sold by Newegg) for $199.99! (See my thread in Hot Deals)

What corner of their warehouse did these crawl out of?

And who would buy them at that price when the 6600 is right there?

If they were $150 or something I’d consider it more.

$99 and now we are talking a deal.