[KitGuru] Sales of desktop graphics cards hit 10-year low in Q2 2015


SPBHM

Diamond Member
Sep 12, 2012
5,054
407
126
I think performance from ±$200 cards hasn't changed enough since Q1 2012,
so people are using the same cards for longer, and Intel IGPs are viable for a lot more games than they used to be...

A new gen of consoles and good gaming on smartphones/tablets also won't help
 

Ranulf

Platinum Member
Jul 18, 2001
2,294
1,080
136

What keeps PC gaming around if the costs of the hardware are going up? Unionization of buyer power via Steam? Fortune smiling when the consoles went x86? A locked-in "master race" market that refuses to game on consoles? I really wonder.

If MS and Sony were smart, they would let keyboards and mice be used with their consoles for gaming. I might actually get an Xbox One if they did. I broke down and bought a converter box that works with the 360 to play Halo with a KB/mouse.
 

ZipSpeed

Golden Member
Aug 13, 2007
1,302
169
106
I used to buy graphics cards on an annual basis. The 7950 in my rig is the longest I've gone without upgrading, and I don't see myself upgrading just yet either. It still runs decently at medium to high settings, depending on the game.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I think one problem is that AMD reduced its presence in the low end of the market, leaving much of it to Nvidia.

Not sure why this is. Some have speculated that a reduction in Oland production was meant to help prop up demand for the desktop APUs (i.e., a high-volume Oland would have contributed to erosion of Kaveri desktop sales).

If so, that is a shame, because IMO there are so many worthy used SFF desktops (Sandy Bridge hardware, etc.) that could use a nice low-profile card in the 25W to 40W range.

P.S. Right now, the best sub-40W card appears to be the GT 730 GDDR5 (GK208 with 64-bit GDDR5). The PNY version usually sells for $55 after rebate with free shipping, but at one time GK208 with 64-bit GDDR5 dropped as low as $40 AR (back when it was called the GT 640 GDDR5). I would like to see AMD offer something competitive in this area to give SFF gamers more options.

^^^^ Regarding the above,

Maybe some kind of R7 Nano or R7 Nano Lite based on Bonaire? 768 or 896 stream processors at 40 watts with 128-bit GDDR5 (perhaps lower-clocked/lower-voltage GDDR5)... that would definitely fit the bill for a lot of capable SFF pre-builts.

Another thing to consider for a 40W card is the upcoming Thunderbolt 3 for laptops and UCFF desktops. Unlike its predecessor, this one works with USB-C cables (and provides twice the bandwidth of USB-C over the same passive cable).

EDIT: Besides gaming, there is OpenCL to think of as well. Lots of good CPUs out there that could use a boost from a 40W dGPU (for select applications).
 
Last edited:

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
When NV/AMD cannot sell high enough volumes of GPUs to generate sufficient profits, they are forced to raise prices per mm² of die size and per specific grade of GPU (for example, what was once viable to sell at $150 now has to sell at $250). However, doing so means that fewer and fewer gamers are enticed to buy GPUs as they become less affordable. As volumes fall, NV/AMD are under even more pressure to raise prices to justify the R&D and manufacturing costs, which in turn causes a vicious cycle of rising average selling prices and falling demand.

*snip*

I am interested to hear other people's thoughts on why the desktop discrete GPU market has declined so dramatically in recent years. Please share your opinions.

tl;dr for anyone who doesn't want to read a wall of text: gamers are moving from PC to console, but the console market leeches off the success and R&D of the PC market, so long-term diminishing numbers in the PC market are unsustainable for the entire gaming ecosystem.

Long version - This is all because of the mass adoption of cheap consoles. They have a much slower generational cycle of 6-8 years, while in that same span we see 3-4 generations of video cards, CPUs, memory types, etc.

Consoles used to be their own discrete engineering projects with their own specialized hardware, but slowly over time this became less the case and they have trended towards becoming more like PCs; the last few generations have seen the consoles go directly to Nvidia and AMD for their GPUs and even adopt DirectX as a standard.

All of this stuff exists because of the dGPU market and the PC gaming market. The reason that 8 years ago we had a PS3 and now, 8 years later, we have the PS4 with a 16x increase in hardware speed is that between those releases the PC market has had an aggressive 18-24 month generational cycle, where gamers are constantly upgrading and paying for the aggressive R&D that gives us this constant growth. It's expensive, but hey, we want that power because we love graphics and gaming.

Also in that time we see improvements in DirectX and rendering standards; they mostly come from whitepapers on new features written and implemented on PCs, and then get integrated into PC gaming engines like Unreal Engine and CryEngine. Remember that even these engines were PC exclusives not all that long ago, but now they've been adapted to be multi-platform, so again consoles can lag behind the times and benefit from the improvements in tech more or less for free, because the R&D was paid for by the PC gaming community.

My long-winded point here is that the core of progress and development doesn't come from consoles, it comes from PCs; but the console platforms are the biggest and most mainstream, and increasingly they're pulling in gamers who might otherwise have been PC gamers and invested their money there instead. The reason is largely down to price: the initial buy-in cost of consoles is MUCH cheaper, or at least perceived that way, and of course anyone with any knowledge knows that their business model is cheap buy-in with a high cost of consumables (the games), via royalties, etc.

In the long run this is all one large relative balancing act. The console market EXPLICITLY relies on the success of the PC market, because between console generations we do 8 years of constant R&D that is prohibitively expensive; MS and Sony cannot shoulder that cost themselves. They're simply buying hardware at long intervals from a market whose R&D costs are offset not by console GPU sales but largely by PC sales, by people who pay premiums for top-of-the-range hardware. Those premiums don't just go into Nvidia's and AMD's pockets; they mostly go to offset the insane R&D they have to do.

Consider for a second what might happen if, say, Nvidia and AMD shut down their R&D for GPUs and exited the PC dGPU market, outside of dGPUs for business and a few specialist ones for CAD. The demand for chips from TSMC and other providers plummets, and the whole market segment grinds to a halt. Meanwhile, 8 years later, MS and Sony want to revamp their consoles because sales are low. So what do they do? They have to pay for R&D from scratch; they need to invest in the 8 missing years of hardware development, or they face releasing a next-gen console with barely any new horsepower. What does that do to hardware prices? It shoves them through the roof, and that cost is passed on to the consumers, the console gamers.

R&D is super expensive and someone has to pay for it; it's the people on the bleeding edge who pay for it. Right now consoles are favored for being cheap and ubiquitous, but they lose that edge the moment the PC market dies, because then they become the bleeding edge and finally have to pay for their own R&D.

To add to all of that, we've had a relative slump in progress with dGPUs for a while, stuck on the 28nm node for way too long. That's a blip I feel will shortly be over; the next gen of Nvidia tech using FinFETs sounds amazing and is potentially an insane leap in power. If anything can refresh the market, it's getting back to the more aggressive 2x performance between generations that we were closer to around the 8800 GTX era and prior.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Ya, that's a good point. That also means someone with an HD5850->6970 level card or GTX460->580 level card could easily play some of the popular MOBA titles.

I sold my 670s off and bought a used 980 - during that period (USPS failed and I was without the newer card for a few days) I shoved an ancient 580 in my PC. I didn't run off and play Crysis or Metro Redux, but I was able to do light gaming. The card ran pretty warm and all that... but it DID work. If I were willing to sit still at 1080p, I'd have kept my 670s... and most gamers are at 1080p, playing MOBAs and such. I can play MOBAs on the iGPU in my MacBook, for crying out loud. Same with stuff like WoW, right?
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
tl;dr for anyone who doesn't want to read a wall of text...

Heheh, that was a clever post; you tricked me!

Maybe another factor here is that buying used on eBay and Craigslist is more acceptable and easier now. Even just a few years ago, people might not have taken advantage of used cards, but it's so easy now, why not?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
It's amazing that you already know Skylake GT4e performance. Please link it.

Why is that so hard? Broadwell's Iris Pro 6200 is shown in one review to be slightly behind the GT 740. The GTX 750 is ~50% faster than the GT 740. Skylake GT4e is said to be "up to 50%" faster. At best, we might have it equal to a GTX 750. Notice that it's not even up to a Ti, which is another 20% faster, and the 950 is 40% on top of that. By the time SKL GT4e is out, it'll be x40-level. Intel is process-constrained like everyone else.
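That chain of relative figures can be sketched as a quick back-of-the-envelope calculation. All the multipliers below are the post's own rough numbers, not measured data, and the exact size of the Iris Pro's "slightly behind" deficit is an assumed placeholder:

```python
# Back-of-the-envelope sketch of the relative-performance chain above.
# Multipliers are the post's rough figures; the Iris Pro deficit vs.
# the GT 740 ("slightly behind") is assumed to be ~5%.
gt740 = 1.0                     # baseline
iris_pro_6200 = 0.95 * gt740    # slightly behind the GT 740 (assumed)
gtx750 = 1.5 * gt740            # "~50% faster than GT 740"
skl_gt4e = 1.5 * iris_pro_6200  # "up to 50%" faster than Iris Pro 6200
gtx750ti = 1.2 * gtx750         # the Ti is another 20% on top
gtx950 = 1.4 * gtx750ti         # the 950 is 40% on top of that

print(f"SKL GT4e: {skl_gt4e:.2f}x GT 740 (vs GTX 750 at {gtx750:.2f}x)")
print(f"GTX 950 : {gtx950:.2f}x GT 740")
```

With the quoted figures, GT4e lands at roughly 1.4x a GT 740, just under a GTX 750 and well short of a 750 Ti or 950, which is the post's conclusion.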

Next year's 16nm FF high-end dGPUs will have 2x the transistor count of GM200 and Fury X, that is 2x 8B = 16B transistors
Watch as the 16/14nm GPUs turn out to be only 30-50% faster than 28nm. You know the most disappointing process generations turned out to be 20nm and 14nm, Intel included. The only thing that got better with fancy-sounding technologies like "FinFET" and "Tri-Gate" is the PowerPoint presentations used to promote them.
 
Last edited:

DustinBrowder

Member
Jul 22, 2015
114
1
0
PC gaming is MUCH bigger than console gaming these days. In fact, if you just take into account WoW, Sims 2-4, LoL, Dota 2, Hearthstone, Diablo 3, and WoT, you have more people playing than there are consoles sold.

LoL has over 50 million annual users, Dota 2 has over 25 million, Hearthstone has over 40 million, WoW has 6-9 million each month, Diablo 3 has sold 10 million copies, Sims 2-4 have sold over 40 million copies, and WoT has over 15 million annual users!

So just between those games there are over 150 million users.
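As a sanity check on that arithmetic, the quoted figures can be tallied. Note the caveat that annual users, monthly users, and copies sold are different metrics, and one person can play several of these games, so this is a rough upper bound rather than a headcount:

```python
# Tally of the per-game figures quoted above (in millions).
# Annual users, monthly users, and copies sold are mixed, so treat
# the total as a rough upper bound, not a count of distinct people.
players_millions = {
    "LoL": 50,          # annual users
    "Dota 2": 25,       # annual users
    "Hearthstone": 40,  # users
    "WoW": 6,           # low end of the quoted 6-9M monthly range
    "Diablo 3": 10,     # copies sold
    "Sims 2-4": 40,     # copies sold
    "WoT": 15,          # annual users
}
total = sum(players_millions.values())
print(f"~{total} million")  # prints "~186 million"
```

Even taking WoW at the low end of its range, the sum clears the post's "over 150 million" figure.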
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
More chicken little sky is falling blah blah blah..

There's always going to be a market for discrete GPUs, whether because of hardware enthusiasts like us who love building and tweaking new PCs, or because PC gaming keeps getting more and more cutting edge, which requires more powerful hardware.

Take 4K for example. How long is it going to be before any integrated GPU, or even an APU can play games at 4K resolution, much less with all the bells and whistles?

4K is going to be the next big thing that drives hardware adoption for discrete GPUs. The 4K market has been slow to take off, because the hardware just isn't there yet. But next year it will be.

I for one have been delaying my transition to 4K, because to do so now would mean a greater investment in SLI than I've already made. Single GPUs on 28nm just don't have the resources to make a good showing at 4K without lowering the settings.

Next year's 14nm and 16nm parts from AMD and Nvidia will change that :thumbsup: I plan on dumping SLI for good and moving over to the fastest single GPU I can buy, a Titan XI or whatever it will be called.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I truly think there is a LARGE glut of people holding onto their cards waiting for a massive jump in performance. They haven't seen it, so they're holding on to their current cards rather than buy into a small increase like we enthusiasts would.

Arctic Islands and Pascal should change that and get people excited to upgrade to a brand new super fast chip. I really can't imagine those 2 chips won't be a massive improvement over what we currently have.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I truly think there is a LARGE glut of people holding onto their cards waiting for a massive jump in performance. They haven't seen it, so they're holding on to their current cards rather than buy into a small increase like we enthusiasts would.

And the flip side of that is all the folks with SFF pre-built desktops (and soon Thunderbolt 3 laptops and UCFF desktops) who would like to see something with more performance as well.

However, the bright side of this is that the dGPU makers still have plenty of room to make a really strong improvement in the lower-watt desktop dGPU segment. (All they have to do is take one of many GPU dies and run it at lower clocks for better performance per watt.)

In contrast, the highest end is pretty much tapped out (they are already at the largest die size and very high power).
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
This is expected to me, because the software (games) hasn't necessitated an upgrade in quite some time. Lots of sidegrades and rebadges don't help.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126

Maybe not, but I think consoles have seen a large group of PC gamers jump in anyway. Not moving away from PC, but adding to the library. I know I have bought many games on console instead of PC for various reasons, and vice versa. Between GTA V taking as long as it did to release on PC, games releasing in the state Arkham Knight did, and the various exclusives, I don't blame people if they want to also have a PS4 or Xbox One.
 
Last edited:

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Why is that so hard? Broadwell's Iris Pro 6200 is shown in one review to be slightly behind the GT 740. The GTX 750 is ~50% faster than the GT 740. Skylake GT4e is said to be "up to 50%" faster. At best, we might have it equal to a GTX 750. Notice that it's not even up to a Ti, which is another 20% faster, and the 950 is 40% on top of that. By the time SKL GT4e is out, it'll be x40-level. Intel is process-constrained like everyone else.

Raw horsepower is just one part of the equation. How are Intel drivers for games? From what I hear they're pretty atrocious.
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
Maybe not, but I think consoles have seen a large group of PC gamers jump in anyway. Not moving away from PC, but adding to the library. I know I have bought many games on console instead of PC for various reasons, and vice versa. Between GTA V taking as long as it did to release on PC, games releasing in the state Arkham Knight did, and the various exclusives, I don't blame people if they want to also have a PS4 or Xbox One.

Maybe, but I think people are seriously underestimating the free-to-play games on the PC and their popularity.

Playing exclusively AAA titles from big companies is a thing of the past, IMO.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Maybe, but I think people are seriously underestimating the free-to-play games on the PC and their popularity.

Playing exclusively AAA titles from big companies is a thing of the past, IMO.

Those free-to-play games aren't for everyone, though, and their popularity doesn't matter to me. I'm just saying that I don't think everyone is exclusively a PC player or exclusively a console player.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Those free-to-play games aren't for everyone, though, and their popularity doesn't matter to me. I'm just saying that I don't think everyone is exclusively a PC player or exclusively a console player.

With the amount that Steam pushes free-to-play titles at me (including a Tom Clancy's Ghost Recon F2P game????), there seem to be a lot of them... and they seem pretty popular. I have zero interest in them since most are pay-to-win, but they seem popular regardless.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
With the amount that Steam pushes free-to-play titles at me (including a Tom Clancy's Ghost Recon F2P game????), there seem to be a lot of them... and they seem pretty popular. I have zero interest in them since most are pay-to-win, but they seem popular regardless.

Right, but to me (and you, apparently) they're not worth looking at, and I'm sure we aren't the only ones. There are probably more people who play games and don't touch F2P titles than those who do.

My point was only that there are probably a good number of people who don't buy everything on the PC and don't limit themselves to only PC games. People who take this approach might not feel they need to be on the latest GPU. There are many on this forum in that category, I'm sure.
 

Magee_MC

Senior member
Jan 18, 2010
217
13
81
I fit into almost all 6 of RS's categories.

I have consoles from both generations (PS3, 360, PS4, XBOne) as well as 2 older gaming computers. The oldest is a Phenom II X6 1055T with a 5870; the newer HTPC/gaming rig I use on my HDTV has a 2500K and, until last week, had a 6870.

I just upgraded from the 6870 to a $220 R9 290, because for a reasonable price I got a decent increase in power. I could have upgraded sooner, but there really wasn't a need to.

I could dial down the settings and still get decent FPS, and I'm working on a backlog of older Steam games that don't really call for a top-tier GPU. Additionally, there hasn't been a recent AAA game compelling enough for me to buy it before the price drops, and since no single GPU is close to being good enough for 4K, I'm not in a rush to make the move.

When the next-gen GPUs arrive, I'll take a look at what's available, and if it makes sense I'll upgrade. Otherwise, I'll keep an eye on the sales, and if I see something that makes moving up from the Phenom II X6 and 5870 worthwhile, I'll pull the trigger. Failing that, I'll just keep puttering along on what I have and enjoying the ride in the slow lane. :)
 

borderdeal

Member
Aug 4, 2013
132
0
0
I usually get my video cards second-hand. I have been doing that for years and it saves $$$. CPUs I used to upgrade every year, and I always bought the motherboard and CPU new. I am older now and I do not feel the need to upgrade that often anymore.

My current system is a 2500K running at 4.3 GHz (it can go as high as 4.5) with 16 GB of RAM, and I do not see the need to upgrade for maybe a couple of years. I just upgraded to a 290 that I got here used for $150. It replaces a 7870 that I paid $90 for when I bought it almost 2 years ago. With the 290 I will be set until 4K FreeSync monitors are affordable, probably 3-4 years. Currently playing on a 23" 1080p monitor, so the 290 should last me 3+ years until I redo the whole system (most likely I will upgrade the CPU, mobo, and RAM in 2-3 years, and then the video card 1-2 years later).

The last time I bought a brand new card was the ATI 9500 that I modded into a 9700, and that is why I bought it new :) By the way, the last console I owned was an Atari 2600; I do not own any consoles now and do not plan to, when I can get cheap games for the PC from sooo many places.
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Raw horsepower is just one part of the equation. How are Intel drivers for games? From what I hear they're pretty atrocious.

I wouldn't go that far, but yeah, AMD and Nvidia drivers are a lot better. Really, sometimes I think their goal with iGPUs is the same as it was with HD Audio: an acceptable solution for the vast majority of people. The biggest clue is that their graphics driver UI is absolutely plain, while the AMD/Nvidia UIs are feature-rich.
 

biostud

Lifer
Feb 27, 2003
18,092
4,529
136
Since the iGPUs are fast enough for web/casual gaming, it is no wonder that low-end video cards are seeing a decline. Also, the performance gains from launches/rebrandings have been really low, so there has been no incentive to upgrade your GPU for some years. Maturation of FreeSync, DX12, and 16nm cards might get me interested again.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I have no issue spending $1K a year on a GPU, but now I can't be bothered. This year, how many AAA titles justify that shiny GPU? GTA V for me was a huge disappointment compared to IV, The Witcher III was OK (but upgrading just for that? HA!), and Dying Light was OK. That is it. I have no backlog. I play AAA games purely for the singleplayer bits (I hate multiplayer and never will play it), usually day 1. Now, meh. Singleplayer is dying, AAA releases are drying up, what there is feels generic and meh, and as for consoles, bleh. I hate playing on huge screens and hate controllers (now, anyway; I still have a PS2). HATE.

Actually, that PS2 reminds me of when games were FUN, and different. Still waiting for a next-gen Manhunt or Suffering. The Evil Within doesn't count. The 2000s seemed fresh; now, in 2015, games just seem stale. D: