[KitGuru] Sales of desktop graphics cards hit 10-year low in Q2 2015

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Sales of desktop graphics cards hit 10-year low in Q2 2015

"Shipments of discrete graphics adapters for desktops dropped to 9.4 million units in Q2 2015, the lowest figure in more than ten years. According to JPR, sales of graphics cards dropped 16.81 per cent compared to the previous quarter, whereas sales of desktop PCs decreased 14.77 per cent. The attach rate of add-in graphics boards (AIBs) to desktop PCs has declined from a high of 63 per cent in Q1 2008 to 37 per cent this quarter. Average sales of graphics cards had been around 15 million units per quarter in recent years, but declined sharply in 2014."

jpr_aib_q2_2015.png



Remember, this article is discussing Volume/Unit sales not profits or revenues.
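The quoted JPR figures imply a couple of numbers worth sanity-checking. A rough back-of-the-envelope sketch (variable names are mine, not JPR's):

```python
# Back-of-the-envelope check of the JPR figures quoted above.
q2_shipments_m = 9.4   # Q2 2015 AIB shipments, millions (from the article)
qoq_drop = 0.1681      # 16.81% quarter-over-quarter decline
attach_rate = 0.37     # AIBs per desktop PC this quarter

# Implied Q1 2015 shipments, from Q2 = Q1 * (1 - drop)
q1_shipments_m = q2_shipments_m / (1 - qoq_drop)

# Implied desktop PC shipments, from attach rate = AIBs / desktop PCs
desktop_pcs_m = q2_shipments_m / attach_rate

print(f"Implied Q1 2015 AIB shipments: {q1_shipments_m:.1f}M")      # ~11.3M
print(f"Implied Q2 2015 desktop PC shipments: {desktop_pcs_m:.1f}M")  # ~25.4M
```

So even the "better" Q1 number (~11.3M) is well below the ~15M/quarter average the article mentions.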
----------

Nvidia's record quarterly profits and revenues in the graphics industry have continued to hide the ugly truth about the desktop discrete market segment as a whole -- the desktop dGPU market is declining at a rapid pace, with no end in sight to this alarming trend.

It's no surprise that NV/AMD are being forced to raise prices, and as a result we've already seen the new upper mid-range GPUs priced deep in the $300-475 range (i.e., 290X/GTX970/390/390X/980), while the bare-minimum new graphics card entry from NV, the GTX950, is priced at $159. The current state of desktop discrete GPU unit sales may act as a barometer for a grim future. When NV/AMD cannot sell high enough volumes of GPUs to generate sufficient profits, they will be forced to raise prices per mm2/die size and per specific grade of GPU (for example, what was once viable to sell at $150 now has to sell at $250). However, doing so means that fewer and fewer gamers are enticed to buy GPUs as they become less affordable. As volumes fall, NV/AMD are pressured even more to raise prices to justify the R&D and manufacturing costs, which in turn causes a vicious cycle of rising Average Selling Prices and falling demand.

Additionally, given the current state of world markets, as the USD strengthens against other world currencies, especially those of emerging markets and 3rd-world countries, goods that sell in USD (i.e., NV/AMD graphics cards) will rise in price relative to the wages/earning power of gamers in non-USD-earning countries. This will also contribute to lower demand for desktop discrete graphics cards, because it further reduces the affordability of GPUs. For example, in Canada, cards like the Fury X or GTX980Ti are approaching $1000 Canadian after taxes; even the bare-minimum 950 sells for $220+, and the R9 390 is $430+ tax. While some hardcore PC gamers think this is fine, it seems the rest of the market doesn't agree.

(Just my opinion of course).

:'(

P.S. Now, one could try to make the argument that there are fewer gamers worldwide playing games overall, but while the desktop discrete graphics card market declined from a stable 15-20 million sales per quarter to sub-10 million, sales of the PS4+XB1 are trending more than 50% higher than the PS3+Xbox 360 at the same point in their lives:

The PlayStation 3 and Xbox 360 in their first 20 months sold a combined 24.23 million units, while the PlayStation 4 and Xbox One have sold a combined 37.34 million units.
Total Combined PlayStation 3 and Xbox 360 Sales: 24,229,386
Total Combined PlayStation 4 and Xbox One Sales: 37,342,738 (+54%)***

***Now, I am not trying to start a PC vs. console thread; I am just using this as a point that there is growth in the gaming industry as far as unit sales are concerned, which means there are still gamers/consumers interested in gaming as a whole. So what are the possible explanations for why the desktop discrete GPU market is getting wiped out this badly?
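For what it's worth, the +54% figure checks out against the raw totals quoted above:

```python
# Combined first-20-month console sales, from the figures quoted above.
ps3_x360 = 24_229_386  # PlayStation 3 + Xbox 360
ps4_xb1 = 37_342_738   # PlayStation 4 + Xbox One

growth = ps4_xb1 / ps3_x360 - 1
print(f"Growth: {growth:.0%}")  # Growth: 54%
```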

I have a few off the top of my head; feel free to add any others you feel are more valid:

1) The average PC gamer is delaying his/her GPU upgrades, and thus the average GPU replacement period/upgrade cycle has increased. Perhaps the gamer who is now forced to spend $350-600 on a solid GPU wants a bigger increase in performance from his/her next upgrade, while in the past, when GPUs cost less on average and node shrinks were more regular, gamers were more likely to upgrade since the leaps in price/performance were much greater. As evidence for this point, the GTX960 is the worst x60-series card and has the worst x60 price/performance from NV in the last 5 generations.

I think when gamers see that they are only getting a 15-25% increase in performance in their price bracket after 1.5-2 years, they are more likely to hold off for the next generation. I think that's what's happening, because in the past we were used to getting 50-100% increases in price/performance every 2-3 years, or even every generation.

2) PC gamers are perfectly OK with older GPUs and turning some settings down in modern games. Why would that be the case? Perhaps a lot of PC gamers feel that the Uber/Ultra/Extreme/VHQ settings are simply not worth the extra expense of a GPU/upgrading more frequently. I can see how this is a legitimate reason, since many AAA games look nearly as good with 2-3 settings turned down, which can net a 10-30 fps increase in performance. Add to that anti-aliasing methods like TXAA, MLAA, or FXAA, which carry a much lower performance hit than traditional MSAA, and we can now achieve good image quality in a way that allows an existing GPU owner to stretch his ownership cycle by yet another 1-2 years.

3) The average PC gamer has a large backlog of PC games and thus doesn't necessarily need a cutting-edge 2014-2015 GPU to play older games. In the past, I feel many of us upgraded every 2-3 years like clockwork, if not sooner. Today, it's not unusual to see PC gamers using GPUs that are 4, 5, or even 6 years old. Let's just say that if the average PC gamer is closer to 35 years old, this individual has a full-time job, a family, and other life commitments. All of this means less time for PC gaming than when he/she was much younger. As a result, a large backlog of PC games starts building up, and there is less need to buy the latest AAA $60 PC game when one has to catch up on so many missed PC games. But this also means less immediate need to own a cutting-edge modern GPU.

4) Maybe many PC gamers are delaying their major upgrades, trying to time them with more affordable 4K monitor prices or a wider variety of GSync/FreeSync monitors, or have decided to simply skip this last 28nm generation for 14/16nm HBM2 GPUs? Still, a lot of these factors are too cutting-edge/specific for the majority of PC gamers, so I think they have a minor impact.

5) The average PC gamer doesn't see much value in upgrading, as most modern AAA PC games are just PS4/XB1 console ports with slightly better graphics.

6) Maybe many PC gamers feel there just haven't been many amazing PC games worth upgrading for?

Now, most people on a PC forum like ours may be offended by some of these points (esp. #5), but we have to look at reality, and the reality is that the desktop discrete GPU market, in terms of unit sales/PC attach rates, is in its worst state of the last 10 years!

I am interested to hear other people's thoughts on why the desktop discrete GPU market has declined so dramatically in recent years. Please share your opinions.
 
Last edited:

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
6) A lot of very casual PC gamers used to buy entry level GPUs to play any games at all. This was a larger portion of GPU sales than people give credit for. Now their Intel integrated graphics run everything they want anyway.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
6) A lot of very casual PC gamers used to buy entry level GPUs to play any games at all. This was a larger portion of GPU sales than people give credit for. Now their Intel integrated graphics run everything they want anyway.

This, plus the fact that the current generations suck: too expensive or too little performance from both AMD and NV.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
You just edited the post 1 minute ago to include a frowny face in the title.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
6) A lot of very casual PC gamers used to buy entry level GPUs to play any games at all. This was a larger portion of GPU sales than people give credit for. Now their Intel integrated graphics run everything they want anyway.

Ya, that's a good point. That also means someone with an HD5850->6970-level card or a GTX460->580-level card could easily play some of the popular MOBA titles. Looking at where cards like the HD6970/GTX580 sit, cards like the GTX750Ti/950 aren't that much better, to even warrant spending $100-160. The elimination of the sub-$99 discrete GPU market probably has a lot to do with it too, because you can hardly buy any good gaming GPU for $100 today. Also, some of the older generations had incredible bargains/sleepers like the GeForce 4 Ti 4200, 8800GT, HD5850, or GTX560Ti/HD6950.

Today, it's impossible to buy a $200-250 GPU and get 85-90% of the performance of the flagship part. In the past, this wasn't unusual. On average, the great desktop discrete GPUs are simply more expensive imo, which forces people to wait longer between upgrades (i.e., if I am spending $650+ USD on a new GPU, which for me is now almost $1000 Canadian (!), I want a 2-3X performance increase, while in the past I was happy with a 40-50% increase).

You just edited the post 1 minute ago to include a frowny face in the title.

Ya, cuz the news made me sad. You want me to remove it? :)
 
Last edited:

Rannar

Member
Aug 12, 2015
52
14
81
This, plus the fact that the current generations suck: too expensive or too little performance from both AMD and NV.

Same problem on software side. No "next gen" games that really push something new.
Same old stuff just a bit more shiny.
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
A lack of progress due to being stuck at the 28nm node hasn't helped either. If TSMC had had their 20nm FinFET (excuse me, I mean 16nm) ready in, say, 2014 instead of 2016/2017, we'd already have mainstream cards capable of 4K 60fps. The display industry's hype would then be able to get more traction.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This, plus the fact that the current generations suck: too expensive or too little performance from both AMD and NV.

Ya, I think all of these factors contribute to it.

GTX280 -> GTX580 was a > 80% performance increase (pretty much any 480 could hit 580 speeds too).

perfrel_2560.gif


GTX580 -> 780Ti was a > 2X performance increase.

perfrel_2560.gif


780Ti -> 980Ti is only about a 40% increase.

perfrel_2560.gif


But the increases in the lower ranks this generation are even worse.
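To make the comparison concrete, here is a small sketch of how those generational uplifts are computed from a relative-performance index. The index values below are illustrative placeholders standing in for the TechPowerUp charts (not exact measured data), chosen only to roughly match the claims above:

```python
# Hypothetical relative-performance index (GTX 280 = 100), illustrative only.
perf_index = {
    "GTX 280": 100,
    "GTX 580": 185,     # > 80% over GTX 280
    "GTX 780 Ti": 390,  # > 2x over GTX 580
    "GTX 980 Ti": 545,  # ~40% over GTX 780 Ti
}

def uplift(old: str, new: str) -> float:
    """Fractional performance gain going from card `old` to card `new`."""
    return perf_index[new] / perf_index[old] - 1

for old, new in [("GTX 280", "GTX 580"),
                 ("GTX 580", "GTX 780 Ti"),
                 ("GTX 780 Ti", "GTX 980 Ti")]:
    print(f"{old} -> {new}: {uplift(old, new):+.0%}")
```

The point being: each step divides the new index by the old one, and the last step is the smallest jump by far.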
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
6) A lot of very casual PC gamers used to buy entry level GPUs to play any games at all. This was a larger portion of GPU sales than people give credit for. Now their Intel integrated graphics run everything they want anyway.

I was looking at NCIX to see if there's a cheap video card to buy for an upgrade to my HD 3000 integrated graphics and also have fun with overclocking.

There used to be a time when spending $70-80 on a video card brought 4-5x the performance of integrated graphics from the same generation. Now, Sandy Bridge's iGPU is 4 years old, and the gain is still about 4-5x, which is good, but I was rather surprised at how small the gain was. Of course, I knew that iGPUs in general have improved a lot.

Of course, I was looking at the "AAA" games available. I had been a gamer since my high school days, but I basically stopped a few years ago, and there's nothing interesting out there. All they seem to do with all the graphics is make more realistic gore and killing.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,012
136
I'm part of the problem, still happy with my HD7770. Waiting for 14nm and HBM2 to reach the mid to low end before I even consider upgrading.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
I'm part of the problem, still happy with my HD7770. Waiting for 14nm and HBM2 to reach the mid to low end before I even consider upgrading.
Yeah, I'm pretty happy with my 7750 as well, because I only game at 1600x900.
I've been using the card since 2012. The GTX950 would have been a great upgrade if it were around $120 or so.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
This was already mentioned in the graphics share thread 3 days ago.

But it shouldn't be any surprise to anyone that the discrete GPU is dying, and dying fast.

Look at Intel's new 5x5 platform, for example; that's the future desktop gaming rig type.

And no, PC gaming isn't dying, it's very healthy.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
This was already mentioned in the graphics share thread 3 days ago.

But it shouldn't be any surprise to anyone that the discrete GPU is dying, and dying fast.

Look at Intel's new 5x5 platform, for example; that's the future desktop gaming rig type.

And no, PC gaming isn't dying, it's very healthy.

The dGPU can never die for either AMD or Nvidia, because they have many years of driver optimization, architecture work, and performance behind it.

The main reason for poor sales is 28nm.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
With mainstream monitors stuck at 1080p for over 10 years, onboard video has gotten good enough to cannibalize the low end which constitutes the highest volume of sales. The high end probably hasn't changed nearly as much as the overall market has.
 

bononos

Diamond Member
Aug 21, 2011
3,886
156
106
......
I am interested to see in what other people's thoughts are on why the desktop discrete GPU market has declined so dramatically in recent years? Please share your opinions.

I think all of your points are valid, but the most important reason would probably be the first reply (post 2). PCs in the past needed a $50-100 video card to work, which meant that the majority of computer buyers, who weren't gamers at all, were also contributing lots of revenue to the discrete card market. But now casual computer users can just use integrated graphics for their needs without having to shell out for a graphics card, or, more likely, they buy a laptop/tablet.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
You mean sub-$100 dGPU volumes are decreasing.

If you mean iGPUs, yes; for high-end gaming systems, no.

Only nVidia sells more higher-end GPUs to offset that in revenue.

No matter if we like it or not, the discrete GPU is dead. It's all about ROI in the end. And we will all play games on iGPUs in the future; the only question is when, not if.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
7) Most gamers don't care about 4K and are quite happy at 1920 or less. Equally, they don't care about running everything maxed out; med-high at decent fps is fine. This means they don't need the latest and greatest cards.
8) The CPU market has completely stagnated, with nothing much exciting happening since Sandy Bridge. Hence people are buying new PCs less often.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Only nVidia sells more higher end GPUs in revenue offset.

No matter if we like it or not, the discrete GPU is dead. Its all about ROI in the end. And we will all play games on IGPs in the future. the only question is when, not if.

Next year, 16nm FF high-end dGPUs will have 2x the transistor count of GM200 and Fury X; that is 2x 8B = 16B transistors. Performance will increase by almost 2x, especially in future DX-12 games.
Also, DX-12 games and 4K monitors will increase GPU performance needs substantially.

With all that, 65W-100W TDP APUs from both Intel and AMD will not be able to catch up to dGPUs, not in a million years.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Next year, 16nm FF high-end dGPUs will have 2x the transistor count of GM200 and Fury X; that is 2x 8B = 16B transistors. Performance will increase by almost 2x, especially in future DX-12 games.
Also, DX-12 games and 4K monitors will increase GPU performance needs substantially.

With all that, 65W-100W TDP APUs from both Intel and AMD will not be able to catch up to dGPUs, not in a million years.

Yes yes, next year, next thing will fix everything. It never does.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Yes yes, next year, next thing will fix everything. It never does.

This is not about fixing anything; this is about where the market is going.

DX-12 games will be here from 2016 onward; they will increase CPU/GPU performance needs.
4K monitors will become mainstream in the next 2-4 years. They will increase GPU performance needs.
Then there is VR, another technology to drive CPU/GPU performance needs even higher.

If you believe future PC games' performance needs will stay the same as today, you are clearly mistaken.
The more performance you get, the more new technology will be created to take advantage of it.
 

Ares202

Senior member
Jun 3, 2007
331
0
71
With all that, 65W-100W TDP APUs from both Intel and AMD will not be able to catch up dGPUs not in a million years.

But in time those GPUs will be sufficient for 90+% of users, and most games will follow suit in tailoring for those APUs; the actual graphical-quality increase will be minimal for those with discrete GPUs.

I can still see Nvidia surviving on the 5-10%, but only releasing 2-3 high-end cards on a 2+ year release cycle.

In terms of the original question, points 1 and 2 definitely apply to me. I will be upgrading when 14nm GPUs are released.
 

Redentor

Member
Apr 2, 2005
97
14
71
Why do I have to buy an expensive PC when all PC games are console ports? Just to have better graphics? LOL