[KitGuru] Sales of desktop graphics cards hit 10-year low in Q2 2015


Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I used to buy graphics card on an annual basis. The 7950 in my rig is the longest I've gone without upgrading. And I don't see myself upgrading just yet either. Still runs decent at medium to high settings, depending on game.

I have been playing on a 7970, but in some games I have trouble running smoothly at high settings, and yes, I love eye candy. I thought a single 390X would be enough, and it was, but someone wanted it and paid full price, so with that money plus some spare cash I had lying around I bought a GTX 980 Ti on Amazon yesterday.

With that said, I agree with those blaming 28nm. We need bigger performance jumps, period.
 

AtenRa

Lifer
Feb 2, 2009
13,984
3,347
136
Take 4K for example. How long is it going to be before any integrated GPU, or even an APU can play games at 4K resolution, much less with all the bells and whistles?

If they stick to 65-95W TDP for the APUs, they will need R9 290-level performance to play current games at 4K with low/medium settings.
Well, I don't expect to have such performance with 14nm FF APUs, even with HBM, at 65-95W TDP. That leaves us with 10nm APUs around 2020 or later.
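A rough perf/W sanity check on that claim (a sketch only; the 290's ~250W GPU power share and the assumed 70/30 GPU/CPU split of the APU's TDP are illustrative numbers, not specs):

```python
# How much perf/W improvement would an APU need to match an R9 290
# within a 65-95W envelope shared with the CPU cores?
R9_290_GPU_POWER_W = 250    # assumed GPU share of the 290's board power
GPU_SHARE_OF_TDP = 0.7      # assumed fraction of APU TDP left for the iGPU

for tdp_w in (65, 95):
    gpu_budget_w = tdp_w * GPU_SHARE_OF_TDP
    factor = R9_290_GPU_POWER_W / gpu_budget_w
    print(f"{tdp_w}W APU -> needs ~{factor:.1f}x the 28nm R9 290's perf/W")
```

A ~4-5.5x perf/W jump is a lot to expect from one full node plus HBM, which is the gist of the 10nm-or-later argument.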
 

TheProgrammer

Member
Feb 16, 2015
58
0
0
Interesting thread. Good read for anyone who came to the last page and sees this.

My thoughts:
VR won't save or boost dGPUs. The PS4 has VR coming that is cheaper, easier to set up, and more accessible. I doubt the number of enthusiasts here on the PC VR side will be enough to create a mass market.
The real hope for dGPUs is Steam Machines. If those take off in the living room, or people use Steam Link in big numbers, that will give PC gaming new life and thus help dGPUs by extension.
While not a revolutionary thing to say, I expect dGPUs to continue to decline after a short-lived bump from the VR and 16nm dGPU launches.

For me, my first 3D card, after a couple of decades of countless 2D machines, was a 3dfx Voodoo with 4MB, purchased in 1996. I no longer have interest in buying a new card, and my current one may be my last. The only thing that will get me to upgrade again is if I buy into VR. But PC VR will be a relatively small percentage of gamers overall.

We do have a Wii U and love it, we buy new games all the time and will be buying Nintendo's next console on launch. And there's a lot untold about VR so far but the way to go there may be just a PS4 + Morpheus.

I'm weighing my options, but if I buy into PC VR I'd like to build another big rig with an 8-core Zen and 16nm GCN. If I don't, it makes more sense for my next build to fit my needs and life: a laptop with a docking station hooked up to my existing LCDs, playing whatever games it will support. I've been a heavy LoL player since 2010, so I have no problem there.

I expect AMD's HBM APUs next year to be a game changer. Intel's APUs (yeah I said it, it's just easier) will continue to improve. The ROI just won't be there for most to develop standalone external GPUs. Maybe AMD if they recover (in any way server/desktop/mobile). And maybe NV could survive too if they can find a new core market, like if Tegra took off.

Another option I'm considering other than an 8core VR setup or a laptop, is the mythical 300watt APU that AMD said they'll eventually produce.

I'm more excited to see how HBM APUs perform than any other upcoming innovation. I remember when sound cards mostly died, and I celebrated that moment. One less driver, no more Soundblaster PCI bus flooding, no more nonsense. DX12/Vulkan HBM APUs sound great to me.
The sky really is falling if you liked the old order. I did for 30 years.
I, for one, welcome our new APU overlords.
 

beginner99

Diamond Member
Jun 2, 2009
5,205
1,579
136
Next year, 16nm FF high-end dGPUs will have 2x the transistor count of GM200 and Fury X, that is, 2x 8B = 16B transistors. Performance will increase by almost 2x, especially in future DX12 games.
Also, DX12 games and 4K monitors will increase GPU performance needs substantially.

With all that, 65W-100W TDP APUs from both Intel and AMD will not be able to catch up with dGPUs, not in a million years.

I would bet a pretty huge amount that we will not see the real GM200/Fury X successor in 2016. Not going to happen. There isn't any TSMC 16nm product out yet, and at the current 20nm they still have issues, and they mostly make SoCs of around 100mm^2, not >500mm^2 GPUs. The latter is a whole different thing. I doubt it will be possible at all, and even then the yields will be very crappy.

I bet the next "flagship" will again be the mainstream core, like the GTX 980 was. It will probably offer something like +20% at most compared to the 980 Ti, at much lower power usage and $600. All in all, not an amazing upgrade for any 980 Ti/Fury X owners. I also have my doubts this will be available in 2016. It will still be a die of roughly 300mm^2, so much, much bigger than all the phone SoCs. Given this, buying Fury now probably isn't that bad of a thing. It will IMHO be mid-2017 until it is really crushed. And I say Fury X because, with the recent history of NV crippling last-generation GPUs, Fury X will probably look a lot better in 1-2 years' time compared to the 980 Ti.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Good news for everybody, really: DX12 is coming, and hardware stagnation forces devs to actually optimize their games instead of passing the buck to consumers, who pay for ever-faster hardware to cover for dev incompetence.
 

AtenRa

Lifer
Feb 2, 2009
13,984
3,347
136
I would bet a pretty huge amount that we will not see the real GM200/Fury X successor in 2016. Not going to happen. There isn't any TSMC 16nm product out yet, and at the current 20nm they still have issues, and they mostly make SoCs of around 100mm^2, not >500mm^2 GPUs. The latter is a whole different thing. I doubt it will be possible at all, and even then the yields will be very crappy.

I bet the next "flagship" will again be the mainstream core, like the GTX 980 was. It will probably offer something like +20% at most compared to the 980 Ti, at much lower power usage and $600. All in all, not an amazing upgrade for any 980 Ti/Fury X owners. I also have my doubts this will be available in 2016. It will still be a die of roughly 300mm^2, so much, much bigger than all the phone SoCs. Given this, buying Fury now probably isn't that bad of a thing. It will IMHO be mid-2017 until it is really crushed. And I say Fury X because, with the recent history of NV crippling last-generation GPUs, Fury X will probably look a lot better in 1-2 years' time compared to the 980 Ti.

Yeah, OK, I'm not expecting 500mm^2+ dies in 2016, but 350-400mm^2 will be OK. Also, 16nm FF has ~2x (perhaps more) the density of 28nm planar, so a ~400mm^2 die like GM204 on the GTX 980 (5.2B transistors) could hold 10-12B transistors at 16nm FF. With HBM 2.0 we can have 2x the performance of the GTX 980 (more in future DX12 games) at the same die size.
I will say a 400mm^2 16nm FF die next year could be ~50% faster than the GTX 980 Ti at half the power.
Now, if they go for a 300mm^2 flagship die in 2016, then yes, we may only see 30% higher performance than the GTX 980 Ti, but at substantially lower power (less than half).
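Back-of-envelope, that transistor estimate works out as follows (a quick sketch; the ~2x density multiplier is an assumption, not a foundry spec, with GM204's published figures as the 28nm baseline):

```python
# Scale GM204's 28nm transistor density by an assumed ~2x for 16nm FF,
# then see what various die sizes could hold.
GM204_AREA_MM2 = 398        # GTX 980 die size, mm^2
GM204_TRANSISTORS_B = 5.2   # billions of transistors at 28nm planar
DENSITY_GAIN = 2.0          # assumed 28nm planar -> 16nm FF scaling

density_16nm = GM204_TRANSISTORS_B / GM204_AREA_MM2 * DENSITY_GAIN

for area_mm2 in (300, 350, 400):
    print(f"{area_mm2} mm^2 at 16nm FF -> ~{density_16nm * area_mm2:.1f}B transistors")
```

At exactly 2x density, a GM204-sized die lands around 10.5B transistors, the low end of the 10-12B range; the "perhaps more" density gain is what the top of that range relies on.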
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
1) This new console generation was sort of "DOA" when it came to challenging PC graphics. While they might not be as capable and feature-laden, old high-end graphics cards are still getting the job done when most of what we get are multiplatform games. Essentially, we need another Crysis that drives hardware sales. 4K and VR are sort of killer apps, but in VR's case it seems to be more of a software issue than one of rendering horsepower.

2) The obsession over prestige GPUs overlooks the meatier sales and market of mid-range cards, the kind that are affordable to most gamers and more attractive to consolites wanting to make the jump into PC gaming.

3) Yes, the low-end discrete market doesn't exist anymore, given the huge strides made by Intel and AMD iGPs. The "mid-range" market has to become the new "low end".

4) 28nm is so long in the tooth... Once Nvidia and AMD move down to 14 or 16nm, the performance/price ratio will get better.

5) Sort of related to "we need another Crysis": we need better developer and publisher support to drive PC game sales. It seems devs keep forgetting that the PC market means more profit per unit of software sold. Things are much better than during the last generation, but there still seems to be a heavy reluctance by the big pubs to really push the PC as the best platform.

6) The tiered release of a product line of video cards, plus rebranding, is killing much of the cohesiveness and momentum of video card generations, since consumers may or may not know that one card supports this or that feature despite being in the same series. Nvidia was the first offender here; now AMD has become the worst. The next Radeon series really needs to be a fresh new line, fully DX12 compliant (DX12.1?), with full 4K HDMI support, etc., and hopefully it will be if it means a new process node.

Even at 14nm, I don't see Intel making a real attempt to kill off discrete graphics, since the IGPs in their CPUs probably aren't making them any real profit in the desktop space: with no major gains in CPU performance, the IGP is just a guaranteed inclusion in applicable cheap products. Quite ironic. I wonder how the desktop space could benefit if they slashed the IGP size to increase the number of chips per wafer, or to include more cores. Unfortunately, we live in a world where mobile and desktop CPUs are mostly sourced from the same fab AFAIK and binned accordingly.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Good news for everybody, really: DX12 is coming, and hardware stagnation forces devs to actually optimize their games instead of passing the buck to consumers, who pay for ever-faster hardware to cover for dev incompetence.

Devs optimize their code? Not in the new publishers_make_the_rules era. If a publisher doesn't see a reason to fix a bug, you're gonna be searching for them community patches.

Unless that bug is a "feature." Know what I'm saying? *cough*Dark Souls_720@30_render*cough*
 
Jul 26, 2006
143
2
81
1) The average PC gamer is delaying his/her GPU upgrades, and thus the average GPU replacement period/upgrade cycle has increased. Perhaps that means the gamer who is now forced to spend $350-600 on a solid GPU wants a bigger increase in performance from his/her next GPU upgrade, while in the past, when GPUs cost less on average and node shrinks were more regular, gamers were more likely to upgrade, since the leaps in price/performance were much greater. As proof of this point, the GTX 960 is the worst x60-series card, with the worst x60 price/performance, from NV in the last 5 generations.

I think when gamers see that they are only getting a 15-25% increase in performance in their price bracket after 1.5-2 years, they are more likely to hold off for the next generation. I think that's what's happening, because in the past we were used to getting 50-100% increases in price/performance every 2-3 years, or even every generation.

This is me. I game at 1440 and have the 7950 from May 2013 (was $330 and came with 3 AAA games).

I want to spend between $300-$350 CAD and get about 40% increase.
I would also spend about $400-450 CAD to get about 60% increase.

If I look for something between $300-350 I might get something like the r9 380, which is very similar to my 7950. Probably not even a 15% improvement.

If I look at the $400-450 range, I find the 970 which tends to be only 30-40% improvement (short of my 60% requirement for +$400).

So overall, both nvidia and amd are not giving me any upgrade options if I follow my own rules.
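Those two rules can be written down as a simple filter (a sketch; the prices and uplift percentages are the rough figures from the post, not benchmarks):

```python
# A candidate card qualifies only if its price lands in one of the bands
# AND its uplift over the HD 7950 meets that band's requirement.
RULES = [               # (price band in CAD, required uplift)
    ((300, 350), 0.40),
    ((400, 450), 0.60),
]

def qualifies(price_cad, uplift):
    return any(lo <= price_cad <= hi and uplift >= need
               for (lo, hi), need in RULES)

candidates = {          # price in CAD, est. uplift vs. HD 7950 (post's figures)
    "R9 380":  (340, 0.15),
    "GTX 970": (430, 0.35),
}

for name, (price, uplift) in candidates.items():
    print(name, "qualifies" if qualifies(price, uplift) else "falls short")
```

Both cards land inside a price band but miss its uplift requirement, which is the "no upgrade options" conclusion.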
 

AtenRa

Lifer
Feb 2, 2009
13,984
3,347
136
This is me. I game at 1440 and have the 7950 from May 2013 (was $330 and came with 3 AAA games).

I want to spend between $300-$350 CAD and get about 40% increase.
I would also spend about $400-450 CAD to get about 60% increase.

If I look for something between $300-350 I might get something like the r9 380, which is very similar to my 7950. Probably not even a 15% improvement.

If I look at the $400-450 range, I find the 970 which tends to be only 30-40% improvement (short of my 60% requirement for +$400).

So overall, both nvidia and amd are not giving me any upgrade options if I follow my own rules.

For how much can you find the R9 390 in Canada? Overclocked, it will be close to 60% faster than your HD 7950.
 
Jul 26, 2006
143
2
81
Looks like some are between $410-450 when on sale, but I don't factor OC into my math (the non-OC version of that card seems to be very similar to the 970, maybe a tad better at 1440p).

At about $375 either the 970 or the 390 would be interesting. That being said, I kind of want nvidia right now for ULMB with g-sync.

It is also hard to spend +$450 right now when we might be as little as 6 months away from a node shrink.
 

james1701

Golden Member
Sep 14, 2007
1,873
59
91
Is it just me, or are the peak years the ones when Crysis and its sequels were released?
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
It seems like there is always one high profile game that pushes people to upgrade. That really hasn't happened in a while.

Star Citizen!



Maybe it'll be the game that pushes graphics in a few years when it releases (if it does...), but for now it just performs really poorly in general. SLI support might be there; last I knew, Crossfire was totally broken.

Part of the problem is that at 1080p, where most gamers sit, even cheaper cards can run the most demanding games. That isn't going to change... so you'll see change as people slowly move to 2K/4K.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Star Citizen!



Maybe it'll be the game that pushes graphics in a few years when it releases (if it does...), but for now it just performs really poorly in general. SLI support might be there; last I knew, Crossfire was totally broken.

Part of the problem is that at 1080p, where most gamers sit, even cheaper cards can run the most demanding games. That isn't going to change... so you'll see change as people slowly move to 2K/4K.

Very true. I run SLI because I want to run above 1080p and as close to 60fps as possible. Sometimes I'm stuck at 1080p because of how a particular game runs; with some poorly optimized code, the FPS drops get annoying. I'm one of those people who absolutely refuse to turn down graphics settings (except that I'm willing to turn down AA or use FXAA).
 
Aug 11, 2008
10,451
642
126
Interesting thread. Good read for anyone who came to the last page and sees this.

My thoughts:
VR won't save or boost dGPUs. The PS4 has VR coming that is cheaper, easier to set up, and more accessible. I doubt the number of enthusiasts here on the PC VR side will be enough to create a mass market.
The real hope for dGPUs is Steam Machines. If those take off in the living room, or people use Steam Link in big numbers, that will give PC gaming new life and thus help dGPUs by extension.
While not a revolutionary thing to say, I expect dGPUs to continue to decline after a short-lived bump from the VR and 16nm dGPU launches.

For me, my first 3D card, after a couple of decades of countless 2D machines, was a 3dfx Voodoo with 4MB, purchased in 1996. I no longer have interest in buying a new card, and my current one may be my last. The only thing that will get me to upgrade again is if I buy into VR. But PC VR will be a relatively small percentage of gamers overall.

We do have a Wii U and love it, we buy new games all the time and will be buying Nintendo's next console on launch. And there's a lot untold about VR so far but the way to go there may be just a PS4 + Morpheus.

I'm weighing my options, but if I buy into PC VR I'd like to build another big rig with an 8-core Zen and 16nm GCN. If I don't, it makes more sense for my next build to fit my needs and life: a laptop with a docking station hooked up to my existing LCDs, playing whatever games it will support. I've been a heavy LoL player since 2010, so I have no problem there.

I expect AMD's HBM APUs next year to be a game changer. Intel's APUs (yeah I said it, it's just easier) will continue to improve. The ROI just won't be there for most to develop standalone external GPUs. Maybe AMD if they recover (in any way server/desktop/mobile). And maybe NV could survive too if they can find a new core market, like if Tegra took off.

Another option I'm considering other than an 8core VR setup or a laptop, is the mythical 300watt APU that AMD said they'll eventually produce.

I'm more excited to see how HBM APUs perform than any other upcoming innovation. I remember when sound cards mostly died, and I celebrated that moment. One less driver, no more Soundblaster PCI bus flooding, no more nonsense. DX12/Vulkan HBM APUs sound great to me.
The sky really is falling if you liked the old order. I did for 30 years.
I, for one, welcome our new APU overlords.

AMD has already officially delayed the APUs till 2017, which in actual fact probably means at least two years from now. Plus, I don't think they have officially said when they will have HBM, except for that rumored behemoth. And discrete cards will get all those advancements too, so it is just a moving target that APUs will continually be playing catch-up to, as they are now. The only way APUs (Intel or AMD) will replace discrete cards is if it becomes economically unfeasible to develop discrete cards. Some on these forums think that will happen, but with the huge profits from workstation cards, I think we will see dGPUs at the high end for at least another 5, and more likely 10 or more, years.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Additionally, given the current state of world markets, as the USD strengthens against other world currencies, especially those in emerging markets and 3rd world countries, goods that sell in USD (i.e., NV/AMD graphics cards) will rise in prices relative to the wages/earnings power of gamers in non-USD earning countries.
They already have. We have never had so many people paying $450+ for GPUs as we have today. It's somewhat stealthy inflation, though, because it's all done under the guise of new generations of cards being released, with people willingly continuing to pay more and more for them. And when it's pointed out, the excusers are quick to cite how much more performance the new cards have, as if that's relevant to the point that prices continue to creep upward for cards sold at volume as new tiers are introduced at the top. This isn't like a decade ago, when ten people bought a $500 flagship card; this is thousands of people buying cards upwards of $650. It's entertaining to watch.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
So overall, both nvidia and amd are not giving me any upgrade options if I follow my own rules.
I fail to see how your country's dollar falling in value against the US dollar is somehow nVidia's or AMD's fault. It sounds to me as if you just need new rules for 2015, since your country's dollar is now worth 20-30% less than it was when you made those rules.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,326
249
106
Is it a coincidence that the peak in 2007 occurred right at the original Crysis release? Or was that due to the G92 8800GT?
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Neither DX12 nor VR has actually hit yet. They're both on the horizon. It's premature to say they haven't uplifted graphics requirements. Well, duh. They aren't in production release yet.
 

SPBHM

Diamond Member
Sep 12, 2012
5,054
407
126
Is it a coincidence that the peak in 2007 occurred right at the original Crysis release? Or was that due to the G92 8800GT?

Seems like both were pretty good reasons... The 8800GT was a huge success (basically almost 8800GTX performance for a good price), and so were Crysis's hardware requirements. Also, a lot of people still had older cards that were inadequate for the new generation of games (GeForce 7, Radeon X1K), and the Radeon HD 3850/HD 3870 also launched around that time, I think, also very successful because of their good power/performance/$.

Basically, AMD and Nvidia launched strong products (great performance at good prices) at that time, and people had good reasons to upgrade!
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Is it a coincidence that the peak in 2007 occurred right at the original Crysis release? Or was that due to the G92 8800GT?

Crysis was the impetus for well over a billion dollars in computer hardware sales in 2007 (probably more). Also, the 360 and PS3 were pushing the industry along, since they were, in totality, quite capable machines. Multiplatform titles were starting to require multicore CPUs and higher-end graphics to get the same experience as the consoles without compromises, so PC gamers had to play keep-up/catch-up. All the GeForce 8000-series cards (the 8800s specifically) appeared at a good time and with a phenomenal increase in performance over anything previous, making them insanely desirable and necessary to anyone wanting a good rig that was under no circumstances weaker than the consoles.

So, like SPBHM said: good hardware products, and reasons to upgrade. Anyone who had an i5 and a Radeon 7970 or comparable GeForce in 2012 already had something more powerful than the PS4, and certainly more capable when considering console optimizations. And that's a year and a half before the PS4 was even released.
 

AVP

Senior member
Jan 19, 2005
885
0
76
It seems pretty simple to me. The four most popular games in the world right now are LoL, Dota 2, CS:GO and Hearthstone. They all basically run on budget cards from two generations ago.

The games that are pushing graphical limits are either single player games or anti-competitive spam fests like COD/BF. A lot of people like myself don't really care for either.

Any serious multiplayer game is going to prioritize frame rate and stability over leaf textures and dynamic sunrays. Whether it's chicken or egg when it comes to gameplay and design, the games that are fun for thousands of hours instead of 10-12 hardly ever require good hardware.

The only way I would immediately pay $300+ for a new graphics card over my 7870 is if Nvidia or AMD somehow improved the video quality of their hardware-encoded streaming capabilities (impossible?).
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Stagnation in game development due to consoles. Mobile gaming becoming huge. 1080p being the standard for a decade. It all adds up: low-end hardware is probably doing a good enough job that people don't need a dedicated GPU. The games I play still work fine on a GTX 470, and these aren't horrible-looking games, either (War Thunder, World of Warships, Diablo 3, Cities XL).

I upgraded because of hardware failure (my AMD 5870 died; I sold the 470, bought a 660 for the kid and a 770 for myself). If that 5870 hadn't died 20 months ago, we would still be rocking those cards today.