Sales of desktop graphics cards hit 10-year low in Q2 2015
"Shipments of discrete graphics adapters for desktops dropped to 9.4 million units in Q2 2015, which is minimum amount in more than ten years. According to JPR, sales of graphics cards dropped 16.81 per cent compared to the previous quarter, whereas sales of desktop PCs decreased 14.77 per cent. The attach rate of add-in graphics boards (AIBs) to desktop PCs has declined from a high of 63 per cent in Q1 2008 to 37 per cent this quarter. Average sales of graphics cards have been around 15 million units per quarter in the recent years, but declined sharply in 2014."
Remember, this article is discussing Volume/Unit sales not profits or revenues.
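As a quick sanity check on the quoted figures (treating the 16.81% quarter-over-quarter decline as exact), the implied Q1 2015 shipment volume works out like this:

```python
# Back-of-the-envelope check on the JPR figures quoted above.
q2_2015_units = 9.4e6   # discrete desktop AIB shipments, Q2 2015
qoq_decline = 0.1681    # 16.81% drop vs. the previous quarter

# Implied Q1 2015 shipments: Q2 = Q1 * (1 - decline)
q1_2015_units = q2_2015_units / (1 - qoq_decline)
print(f"Implied Q1 2015 shipments: {q1_2015_units / 1e6:.1f} million")  # ~11.3 million
```

So even one quarter earlier, shipments were still above 11 million units, which shows how steep the single-quarter drop was.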
----------
Nvidia's quarterly record profits and revenues in the graphics industry have continued to hide the ugly truth about the desktop discrete market segment as a whole: the desktop dGPU market is declining at a rapid pace, with no end in sight to this alarming trend.
It's no surprise that NV/AMD are being forced to raise prices. We've already seen the new upper mid-range GPUs priced firmly in the $300-475 range (i.e., 290X/GTX970/390/390X/980), while the bare-minimum new graphics card entry from NV, the GTX950, is priced at $159. The current state of desktop discrete GPU unit sales may act as a barometer for a grim future. When NV/AMD cannot sell GPUs in high enough volumes to generate sufficient profits, they are forced to raise prices per mm² of die size and per performance tier of a GPU (for example, what was once viable to sell at $150 now has to sell at $250). However, doing so means fewer and fewer gamers are enticed to buy GPUs as they become less affordable. As volumes fall, NV/AMD are pressured to raise prices even further to justify R&D and manufacturing costs, which in turn creates a vicious cycle of rising average selling prices (ASPs) and falling demand.
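That vicious cycle can be sketched as a toy model. Every number below is invented purely for illustration (fixed costs, elasticity, the initial demand shock); the point is only the shape of the feedback loop, not the magnitudes:

```python
# Illustrative feedback loop: fixed costs spread over fewer units push ASPs up,
# and higher ASPs suppress demand further. All figures are hypothetical.
fixed_costs = 1.5e9   # assumed per-generation R&D + overhead to recover per quarter-volume
unit_cost = 100.0     # assumed variable cost per card
elasticity = 1.5      # assumed: a 1% price rise cuts demand by 1.5%

units = 15e6                                  # healthy starting volume
price = unit_cost + fixed_costs / units       # $200 ASP at 15M units/quarter

units *= 0.85  # exogenous shock: 15% of buyers leave the market
for quarter in range(1, 5):
    new_price = unit_cost + fixed_costs / units   # spread fixed costs over fewer units
    pct_change = (new_price - price) / price
    price = new_price
    units *= 1 - elasticity * pct_change          # higher price -> lower demand
    print(f"Quarter {quarter}: ASP ${price:.0f}, volume {units / 1e6:.2f}M")
```

Under these made-up inputs, ASPs climb toward $275+ while quarterly volume sinks below 8 million within a year, which is the "rising prices, falling demand" spiral described above.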
Additionally, given the current state of world markets, as the USD strengthens against other currencies, especially those in emerging markets and developing countries, goods sold in USD (i.e., NV/AMD graphics cards) rise in price relative to the wages/earning power of gamers in non-USD-earning countries. This will also contribute to lower demand for desktop discrete graphics cards, because it further reduces the affordability of GPUs. For example, in Canada, cards like the Fury X or GTX980Ti are approaching $1,000 Canadian after taxes, even a bare-minimum card like the 950 sells for $220+, and an R9 390 is $430+ tax. While some hardcore PC gamers think this is fine, it seems the rest of the market doesn't agree.
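To put rough numbers on the exchange-rate effect: assuming a ~1.31 USD/CAD rate and a 13% sales tax (e.g., Ontario HST) — both assumptions for the sake of the example — a $649 US flagship lands close to the $1,000 CAD mark:

```python
# Rough CAD street-price estimate for a $649 US card (all rates assumed).
usd_msrp = 649.0      # US MSRP of a GTX 980 Ti class card
usd_to_cad = 1.31     # approximate mid-2015 exchange rate (assumption)
sales_tax = 0.13      # e.g., Ontario HST (assumption)

cad_after_tax = usd_msrp * usd_to_cad * (1 + sales_tax)
print(f"CAD price after tax: ${cad_after_tax:.0f}")  # ≈ $960 CAD
```

And that is before the retail markups Canadian buyers often see on top of a straight currency conversion.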
(Just my opinion of course).
:'(
P.S. One could try to argue that there are simply fewer gamers worldwide playing games overall. However, while the desktop discrete graphics card market declined from a stable 15-20 million units per quarter to sub-10 million, sales of the PS4+XB1 are trending more than 50% higher than the PS3+Xbox 360 at the same point in their life cycles:
The PlayStation 3 and Xbox 360 in their first 20 months sold a combined 24.23 million units, while the PlayStation 4 and Xbox One have sold a combined 37.34 million units.
Total Combined PlayStation 3 and Xbox 360 Sales: 24,229,386
Total Combined PlayStation 4 and Xbox One Sales: 37,342,738 (+54%)***
***Now, I am not trying to start a PC vs. console thread; I am just using this to show that there is growth in the gaming industry as far as unit sales are concerned, which means there are still gamers/consumers interested in gaming as a whole. So what are the possible explanations for why the desktop discrete GPU market is getting wiped out this badly?
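The +54% figure follows directly from the two totals quoted above:

```python
# Verify the quoted +54% console sales growth figure.
ps3_x360_total = 24_229_386   # combined first-20-month sales, last generation
ps4_xb1_total = 37_342_738    # combined first-20-month sales, current generation

growth = (ps4_xb1_total - ps3_x360_total) / ps3_x360_total
print(f"Growth: +{growth:.0%}")  # +54%
```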
I have some possible explanations off the top of my head; feel free to add any others you feel are more valid:
1) The average PC gamer is delaying his/her GPU upgrades, and thus the average GPU replacement period/upgrade cycle has lengthened. Perhaps the gamer who is now forced to spend $350-600 on a solid GPU wants a bigger increase in performance from his/her next upgrade, whereas in the past, when GPUs cost less on average and node shrinks were more regular, gamers were more likely to upgrade since the leaps in price/performance were much greater. Supporting this point, the GTX960 is the worst x60-series card, with the worst x60 price/performance, from NV in the last 5 generations.
I think when gamers see that they are only getting a 15-25% increase in performance in their price bracket after 1.5-2 years, they are more likely to hold off for the next generation. That seems to be what's happening, because in the past we were used to getting 50-100% increases in price/performance every 2-3 years, or even every generation.
2) PC gamers are perfectly OK with older GPUs and with turning some settings down in modern games. Why would that be? Perhaps a lot of PC gamers feel that the Uber/Ultra/Extreme/VHQ settings simply aren't worth the extra expense of upgrading a GPU more frequently. I can see how this is a legitimate reason, since many AAA games look nearly as good with 2-3 settings turned down, which can net a 10-30 fps increase in performance. Add to that anti-aliasing techniques like TXAA, MLAA, or FXAA, which carry a much lower performance hit than traditional MSAA, and good image quality is now achievable in ways that let an existing GPU owner stretch his/her ownership cycle by yet another 1-2 years.
3) The average PC gamer has a large backlog of PC games and thus doesn't necessarily need a cutting-edge 2014-2015 GPU to play older titles. In the past, I feel many of us upgraded every 2-3 years like clockwork, if not sooner. Today, it's not unusual to see PC gamers using GPUs that are 4, 5, or even 6 years old. If the average PC gamer is closer to 35 years old, this individual has a full-time job, a family, and other life commitments, all of which means less time for PC gaming than when he/she was younger. As a result, a large backlog of PC games builds up, and there is less need to buy the latest $60 AAA PC game when one has to catch up on so many missed titles. That, in turn, means less and less immediate need to own a cutting-edge modern GPU.
4) Maybe many PC gamers are delaying major upgrades, trying to time them with more affordable 4K monitor prices or a wider variety of GSync/FreeSync monitors, or have decided to simply skip this last 28nm generation and wait for 14/16nm HBM2 GPUs. Still, a lot of these factors are too cutting-edge/niche for the majority of PC gamers, so I think they have a minor impact.
5) The average PC gamer doesn't see much value in upgrading, as most modern AAA PC games are just PS4/XB1 console ports with slightly better graphics.
6) Maybe many PC gamers feel there just haven't been many amazing PC games worth upgrading for?
Now, most people on a PC forum like ours may be offended by some of these points (esp. #5), but we have to face reality, and the reality is that the desktop discrete GPU market, in terms of unit sales and PC attach rates, is in its worst state of the last 10 years!
I am interested in other people's thoughts on why the desktop discrete GPU market has declined so dramatically in recent years. Please share your opinions.
"Shipments of discrete graphics adapters for desktops dropped to 9.4 million units in Q2 2015, which is minimum amount in more than ten years. According to JPR, sales of graphics cards dropped 16.81 per cent compared to the previous quarter, whereas sales of desktop PCs decreased 14.77 per cent. The attach rate of add-in graphics boards (AIBs) to desktop PCs has declined from a high of 63 per cent in Q1 2008 to 37 per cent this quarter. Average sales of graphics cards have been around 15 million units per quarter in the recent years, but declined sharply in 2014."
![jpr_aib_q2_2015.png](/proxy.php?image=http%3A%2F%2Fwww.kitguru.net%2Fwp-content%2Fuploads%2F2015%2F08%2Fjpr_aib_q2_2015.png&hash=841cbbb8f8127d2298102332d1af4842)
Remember, this article is discussing Volume/Unit sales not profits or revenues.
----------
Nvidia's quarterly record profits and revenues in the graphics industry have continued to hide the ugly truth of desktop discrete market segment as a whole -- desktop dGPU market is declining at a rapid pace with no end in sight to this alarming trend.
It's not a surprise that NV/AMD are becoming forced to raise prices, and as a result we've already seen the new upper mid-range GPUs are now priced deeply in the $300-475 range (i.e., 290X/GTX970/390/390X/980), while the bare minimum new graphics card entry from NV, the GTX950 is priced at $159. The current state of the desktop discrete GPU unit sales may act as a barometer for a grim future. When NV/AMD cannot sell high enough volumes of GPUs to generate sufficient profits, they will be forced to raise prices on a per mm2/per die size and per specific grade level of a GPU (for example, what was once viable to sell at $150 now has to sell at $250). However, doing so means that less and less gamers are enticed to buy GPUs as they become less affordable. As volumes fall, NV/AMD are even more pressured to raise prices to justify the R&D and manufacturing costs, which in turn causes a vicious cycle of rising Average Selling Prices and falling demand.
Additionally, given the current state of world markets, as the USD strengthens against other world currencies, especially those in emerging markets and 3rd world countries, goods that sell in USD (i.e., NV/AMD graphics cards) will rise in prices relative to the wages/earnings power of gamers in non-USD earning countries. This will contribute to lower demand for desktop discrete graphics cards as well because that will also contribute to the lower affordability factor for GPUs. For example, in Canada cards like Fury X or GTX980Ti are approaching $1000 Canadian after taxes, and even bare minimum like 950 sells for $220+, and R9 390 is $430+ tax. Now while some hardcore PC gamers think this is fine, it seems the rest of the market doesn't agree.
(Just my opinion of course).
:'(
P.S. Now one could try to make the argument that there are less gamers worldwide playing games overall but while the desktop discrete graphics card market declined from a stable 15-20 million sales per quarter to sub-10 million, the sales of PS4+XB1 are trending > 50% greater than PS3+Xbox 360:
The PlayStation 3 and Xbox 360 in their first 20 months sold a combined 24.23 million units, while the PlayStation 4 and Xbox One have sold a combined 37.34 million units.
Total Combined PlayStation 3 and Xbox 360 Sales: 24,229,386
Total Combined PlayStation 4 and Xbox One Sales: 37,342,738 (+54%)***
***Now I am not trying to start a PC vs. console thread but just using this as a point that there is growth in the gaming industry as far as unit sales are concerned which means there are still gamers/consumers interested in gaming as a whole but what are the possible explanations as to why the desktop discrete GPU market is getting wiped out this badly?
I have some off the top of my head and feel free to add any others you feel are more valid:
1) The average PC gamer is delaying his/her GPU upgrades and thus the average GPU replacement period/upgrade cycle has increased. Perhaps that means the gamer who is now forced to spend $350-600 on a solid GPU is thinking that for his/her next GPU upgrade they want a bigger increase in performance, while in the past when GPUs cost less on average, and node shrinks were more regular, they were more likely to upgrade since the leaps in price/performance were much greater. As proof for this point, GTX960 is the worst x60 series card and the worst x60 price/performance from NV in the last 5 generations.
I think when gamers see that they are only getting 15-25% increase in performance in their price bracket after 1.5-2 years, they are more likely to hold off for the next generation. I think that's what's happening because in the past we are used to getting 50-100% increases in price/performance every 2-3 years or even every generation.
2) PC gamers are perfectly OK with older GPUs and turning some settings down in modern games. Why would that be the case? Well perhaps a lot of PC gamers feel that the Uber/Ultra/Extreme/VHQ settings are simply not worth the extra expense on a GPU/upgrading more frequently. I can see how this is a legitimate reason since many AAA games look nearly as good with 2-3 settings turned down that can net 10-20-30 fps increase in performance. Add to that filters like TXAA or MLAA or FXAA that allow a much lower performance hit than traditional MSAA methods and we can now achieve good image quality which allows an existing GPU owner to stretch his GPU ownership cycle by yet another 1-2 years.
3) The average PC gamer has a large backlog of PC games and thus doesn't necessarily need a modern cutting edge 2014-2015 GPU to play older games. In the past I feel many of us upgraded every 2-3 years like clock work, if not sooner. Today, it's not unusual to see PC gamers using GPUs that are 4, 5, or even 6 years old. Let's just say if the average PC gamer is closer to 35 years old, this individual has a full-time job and family and other life commitments. All of this means less time for PC gaming than when he/she were much younger. As a result a large backlog of PC gamers starts building up and there is less need to buy the latest AAA $60 PC game when one has to catch up to a backlog of so many missed PC games. But this also means less and less immediate need to own a cutting edge modern GPU.
4) Maybe many PC gamers are delaying their major upgrades and trying to time their upgrades with more affordable 4K monitor prices, more varieties of GSync/FreeSync monitors, or decided to simply skip this last 28nm generation for 14/16nm HBM2 GPUs? Still though, a lot of these factors are too cutting edge/specific for the majority of PC gamers so I think these have a minor impact.
5) The average PC gamer doesn't see much value in upgrading as most modern AAA PC gamers are just PS4/XB1 console ports with slightly better graphics.
6) Maybe many PC gamers feel there just haven't been many amazing PC gamers worth upgrading for?
Now most people on a PC forum like ours may be offended by some of these points (esp. #5) but we have to look at the reality and the reality shows that the desktop discrete GPU market in terms of unit sales/PC attach rates is in the worst state in the last 10 years!
I am interested to see in what other people's thoughts are on why the desktop discrete GPU market has declined so dramatically in recent years? Please share your opinions.
Last edited: