[KitGuru] Sales of desktop graphics cards hit 10-year low in Q2 2015


boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Stagnation in game development due to consoles. Mobile gaming becoming huge. 1080p being the standard for a decade. It all adds up to low-end hardware probably doing a good enough job that people don't need a dedicated GPU. The games I play still work fine on a GTX 470, and these aren't horrible-looking games either (Warthunder, World of Warships, Diablo 3, Cities XL).

I upgraded because of hardware failure (my AMD 5870 died; I sold the 470, bought a 660 for the kid and a 770 for myself). If that 5870 hadn't died 20 months ago we would still be rocking those cards today.
HA, my situation was similar, but I 100% blame it on nvidia :p NV drivers were causing endless crashes and the occasional blue screen of death on my 460 1GB. I went for a 660 Ti, then upgraded to a 280X, and 3 months later I got a 290 :cool: I lost 20 to 40% resale value on the in-between cards. The 460 1GB is still in my sister's PC with a very, very old working driver (probably almost 3 years old by now). I almost upgraded again this gen, till I saw the crap benchmarks.

All that trouble because of crap NV drivers.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Desktop graphics card sales are slowing down for several reasons. Here are a few off the top of my head.

1. The mobile gaming market and the rise of casual gaming.
2. Successful next-gen console sales.
3. Integrated graphics are fast enough to play the most heavily played games of today.
4. Lack of graphically demanding games. 3+ year old graphics cards are fast enough to play practically any game at 1080p.
5. The most innovative games are in the indie gaming space, and they are generally not that graphically demanding.
6. Local currencies dropping hard against the dollar this year, driving prices up outside US markets.
7. Performance advances in graphics cards have slowed down dramatically.

Etc.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Actually, the weak consoles are a reason why sales drop, not the console sales themselves.

Weak console = undemanding port.

And the biggest issue may be that people are stuck on 1080p. That's setting aside the financial side of it worldwide: people in the US and Europe simply don't have the same money to spend.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,330
4,917
136
How many of you even game on integrated graphics? I travel for work, so I have been making do with my laptop (i5 5200U + Intel HD 5500), and it is MISERABLE on minimum settings in Heroes of the Storm.

With 14nm FinFET + HBM2 I can see it getting better, and that might actually cannibalize some sales on the low end. But Intel HD graphics? Please. It's still terrible.

I think the bigger issue overall is 28nm stagnation and the marked increase in prices. It used to be that a flagship card was <$400 and you could buy a pretty decent low-end card for $100 or less, with the sweet spot being $150-250. Back when I had to budget for gaming upgrades I mainly looked at the $150-250 bracket and usually upgraded when I could double my performance.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
How many of you even game on integrated graphics? I travel for work, so I have been making do with my laptop (i5 5200U + Intel HD 5500), and it is MISERABLE on minimum settings in Heroes of the Storm.

With 14nm FinFET + HBM2 I can see it getting better, and that might actually cannibalize some sales on the low end. But Intel HD graphics? Please. It's still terrible.

I think the bigger issue overall is 28nm stagnation and the marked increase in prices. It used to be that a flagship card was <$400 and you could buy a pretty decent low-end card for $100 or less, with the sweet spot being $150-250. Back when I had to budget for gaming upgrades I mainly looked at the $150-250 bracket and usually upgraded when I could double my performance.
I used to, a long time ago. I beat GTA San Andreas on my laptop in high school: all settings turned down, but it was amazingly fun to have a portable machine that small to game on. Intel IGPs are infinitely better now than they were then, so I can see why lower-end gamers would be happy with the IGP. I was back then.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
How many of you even game on integrated graphics? I travel for work, so I have been making do with my laptop (i5 5200U + Intel HD 5500), and it is MISERABLE on minimum settings in Heroes of the Storm.

With 14nm FinFET + HBM2 I can see it getting better, and that might actually cannibalize some sales on the low end. But Intel HD graphics? Please. It's still terrible.

I think the bigger issue overall is 28nm stagnation and the marked increase in prices. It used to be that a flagship card was <$400 and you could buy a pretty decent low-end card for $100 or less, with the sweet spot being $150-250. Back when I had to budget for gaming upgrades I mainly looked at the $150-250 bracket and usually upgraded when I could double my performance.


I think you are onto something with the pricing. Even the step down from a flagship GPU is expensive compared to years ago, and there aren't many titles that demand the latest GPU unless you are pushing high resolutions.
 

biostud

Lifer
Feb 27, 2003
18,246
4,756
136
Also, display makers need to step up their game. For me to upgrade my 1440p display and my video card together, I really want something substantially better: a 32" 1440p 144Hz or a 4K 30-75Hz IPS/MHVA FreeSync panel, and I will gladly upgrade my video card along with my screen.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Star Citizen!

That's enough reason on its own to keep my business in the dGPU market. SC in VR is unlike anything a console will be able to touch. And I'm not expecting much out of Morpheus; it has to fit within the PS4's hardware abilities, which will diminish the experience. Like Google Cardboard: yeah, it gives you a good taste, but it is nothing like sticking your head in a proper PC VR headset.
 

Mushkins

Golden Member
Feb 11, 2013
1,631
0
0
Today, it's impossible to buy a $200-250 GPU and get 85-90% of the performance of the flagship part. In the past, this wasn't unusual. On average, great desktop discrete GPUs are simply more expensive imo, which forces people to wait longer between upgrades (i.e., if I am spending $650+ USD on a new GPU, which for me is now almost $1000 Canadian (!), I want a 2-3X performance increase, while in the past I was happy with a 40-50% increase).

While this is true, I also think it has a whole lot to do with the fact that the "HD" movement has died down almost completely and the "next big thing" of 3D was a complete and utter flop (yet again). I upgraded to a GTX 970 almost a year ago, giving a friend my Radeon HD 7970. Before that, she was using my old 5850, a card from 2009 that was still playing most games at 1080p at 45-60 FPS on medium-high, maybe with a few shadows and fluff effects turned down. She was perfectly happy with that 5850 and still would have been today; I just didn't want a 7970 sitting in the closet literally collecting dust.

The vast majority of people are completely happy with that level of performance. If a six year old GPU more than meets their gaming needs, they're not in any rush to buy a new one. Especially considering the rise of the indie gaming scene where for many gamers the majority of games they're playing are *still* going to look and run as good as they possibly can on ancient hardware *years* from now. There's also the rise of mobile and tablet gaming to consider.

So yeah, I'm not surprised by these published numbers at all. They're more than likely going to continue to decline as discrete GPUs fall into an even smaller niche product just like they used to be.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I think you are onto something with the pricing. Even the step down from a flagship GPU is expensive compared to years ago, and there aren't many titles that demand the latest GPU unless you are pushing high resolutions.
In all reality, what's the fidelity difference between a PS4, a mid-range GPU, and a top-of-the-line GPU? Not much...
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Or if it's rendering under 1080p (900p, etc.) and upscaling to 1080p versus running native at 1080p. Or if there are PC-specific effects because it's not a garbage port (like the many upgraded options in GTA V).

No jaggies in a game with lots of lines is pretty obvious when you can get 4xAA or higher.

The most obvious difference is when you play on a GPU that's capped out at 60 versus a GPU that occasionally hits 60 but varies. The drops in frame rate cause very noticeable choppiness compared to the smoothness of a consistent 60 (on a non-adaptive-sync monitor, of course). This is why at times it is actually better to cap at 30 than to let it fluctuate in the low 30s.
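
To make the frame-pacing point concrete: a limiter just pads every frame out to a fixed time budget, so frames arrive on an even cadence instead of jittering with load. A minimal sketch in Python (the render_frame callable is a hypothetical stand-in for the game's draw call, not any real engine API):

Code:
import time

def run_capped(render_frame, fps_cap=30, frames=600):
    """Render a fixed number of frames, padding each to a constant budget.

    At a 30 fps cap every frame lands on a 33.3 ms cadence. An uncapped
    GPU wobbling between 31 and 38 fps delivers frames at irregular
    intervals, which is what reads as choppiness on a fixed-refresh display.
    """
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                      # the game's own draw call
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)    # pad fast frames up to the cap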
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
In all reality, what's the fidelity difference between a PS4, a mid-range GPU, and a top-of-the-line GPU? Not much...

This chart comes to mind. The console bar was simply set way too low.

[Attached chart: ps4-vs-720.jpg]
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
The hell is up with that Y axis? It makes it look as if the graph was made by a kid.

Yeah, I had to do a double take at that graph to understand it. These non-logarithmic scales are usually done by marketing teams. What that graph doesn't take into consideration is bare-metal or close-to-metal programming. Comparing FLOPS across apples and oranges is flop-worthy. Silly chart.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Yeah, I had to do a double take at that graph to understand it. These non-logarithmic scales are usually done by marketing teams. What that graph doesn't take into consideration is bare-metal or close-to-metal programming. Comparing FLOPS across apples and oranges is flop-worthy. Silly chart.

It's a log scale.

And bare-metal or close-to-metal programming is completely irrelevant in terms of GPU FLOPS. Maybe you confused that with CPU overhead.

While FLOPS as a measure of relative performance depends a lot on the actual uarch, it's still one of the best methods for comparison, despite its inherent flaws in relation to actual performance.

The point that the current generation of underpowered discount consoles hurts GPU sales still stands.
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
I still don't get your point. It's just a log scale on the Y axis. Otherwise the graphs would end up close to unreadable.

If a graph requires a change of scale, you use an additional graph. The graph pictured is badly done.
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
Show how it should be done and still deliver the point.

..Uh.

You see where it has those lines at "10", "100" and "1,000"? Take the data within each of those boundaries and chuck it into the graphing software, outputting four separate graphs.

This stuff shouldn't even need to be argued over; correctly making graphs is one of the fundamental lessons taught in the first year of high school.
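
For what it's worth, both presentations are only a few lines in any plotting tool; the argument is about which one misleads less. A sketch with made-up GFLOPS figures (matplotlib, purely illustrative):

Code:
import matplotlib.pyplot as plt

# Made-up GFLOPS figures, purely to illustrate the two presentation styles
years   = [2001, 2005, 2013]
console = [20, 240, 1843]    # stand-ins for console GPUs of each era
pc      = [10, 170, 3789]    # stand-ins for contemporary high-end PC GPUs

fig, (log_ax, lin_ax) = plt.subplots(1, 2, figsize=(10, 4))

# Single panel with a log Y axis (the chart as posted)
log_ax.plot(years, console, "o-", label="console")
log_ax.plot(years, pc, "s-", label="PC GPU")
log_ax.set_yscale("log")
log_ax.set_title("single panel, log scale")

# One linear panel per magnitude range (only the 1,000+ range shown here)
lin_ax.plot(years, console, "o-", label="console")
lin_ax.plot(years, pc, "s-", label="PC GPU")
lin_ax.set_ylim(1000, 4000)
lin_ax.set_title("linear panel, 1,000-4,000 range")

for ax in (log_ax, lin_ax):
    ax.set_xlabel("year")
    ax.set_ylabel("GFLOPS")
    ax.legend()

fig.tight_layout()
plt.show()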
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
It's a log scale.

And bare-metal or close-to-metal programming is completely irrelevant in terms of GPU FLOPS. Maybe you confused that with CPU overhead.

While FLOPS as a measure of relative performance depends a lot on the actual uarch, it's still one of the best methods for comparison, despite its inherent flaws in relation to actual performance.

The point that the current generation of underpowered discount consoles hurts GPU sales still stands.

Not true. Bare-metal programming lets consoles extract more of their rated FLOPS, something PC games can't do due to the overhead of DirectX/OpenGL. This may change somewhat with DX12, but there will always be more overhead driving games through an OS like Windows than through a gaming console's operating system.

I also disagree that the power of the current- or last-gen consoles has a direct correlation with the reduction in GPU sales. Consoles have always led as the primary development platforms, so your argument makes no sense.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
In all reality, what's the fidelity difference between a PS4, a mid-range GPU, and a top-of-the-line GPU? Not much...


Maybe to you, but in many titles I see a striking difference. The textures are sharper, draw distances are farther out, there's less blurring overall, etc. It all depends on the game in question.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Not true. Bare-metal programming lets consoles extract more of their rated FLOPS, something PC games can't do due to the overhead of DirectX/OpenGL.

Sure, but those "bare metal" optimizations should be the same for every console generation. The graph works because EVERY generation of consoles gets optimizations (not just the PS4 and Xbone), so their relative position to the PCs of their time IS comparable.

What that graph shows more than anything is that this generation of consoles is the weakest one in recent history compared to the PC hardware of its time. When the original Xbox came out, you couldn't buy a GPU that good. When the Xbox 360 came out, we couldn't buy GPUs with unified shaders like that (at that level of power) for almost a year after release; the 360 was a leap beyond ANYTHING PCs could do. Meanwhile the PS4 (the strongest current console) had a GPU you COULD pretty much buy at release (basically a 7850 or 7870), but more importantly it was less powerful than GPUs that were already on the market at release (namely the 7970). We have never seen that happen in the recent history of consoles.
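
The paper numbers bear that out. Peak single-precision throughput is usually figured as shaders x 2 ops per clock x clock speed; the shader counts and clocks below are the commonly cited launch specs, so treat the results as approximate:

Code:
# Peak single-precision GFLOPS: shaders x 2 FLOPs (FMA) x clock in GHz.
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

gpus = {
    "PS4 (Liverpool)": peak_gflops(1152, 0.800),  # ~1843 GFLOPS
    "Radeon HD 7850":  peak_gflops(1024, 0.860),  # ~1761 GFLOPS
    "Radeon HD 7870":  peak_gflops(1280, 1.000),  # ~2560 GFLOPS
    "Radeon HD 7970":  peak_gflops(2048, 0.925),  # ~3789 GFLOPS
}
for name, gflops in gpus.items():
    print(f"{name}: {gflops:.0f} GFLOPS")

By that crude measure the PS4's GPU landed between a 7850 and a 7870, at roughly half of a 7970.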

It doesn't matter that maybe, just maybe, thanks to optimizations, that PS4 GPU can match the 7970. What matters is that when you look at the history of GPUs and consoles, the STANDARD was for consoles to be better than anything on PCs at release, until this generation. That holds back the entire gaming industry, because instead of the consoles setting a new high-water mark for everyone, we got midrange console parts that you could blow away with a high-end PC built the exact day the PS4 was released. Optimizations in the past were a way for consoles to keep an advantage over PCs longer, while with this generation the optimizations are the ONLY HOPE for that advantage on day 1. It is a huge difference.

I don't care how well the consoles can be optimized; it doesn't turn a 7870 into a Titan. The Xbox 360 and PS3 generation lasted so long partially because they had such a head start on the PC market (more so for the 360), while this generation the consoles are an anchor around the neck of the PC market. I get why they are what they are: MS and Sony didn't want to take massive losses per unit this generation like in the past. The Gillette model for console economics is obviously breaking down, and we are all worse off because of it.

It is what it is, which is that we are facing the relatively weakest consoles in the entire history of gaming (since Atari, at least). What makes it even worse, in my opinion, is that the console market has moved away from the exclusives model of the PS1/PS2 era to a model where almost every game gets ported to at least one other platform. That means we can't even rely on the idea that PCs are only held back by the PS4; they are held back by the Xbox One from now until MS calls it quits. MS made a decision to prioritize Kinect over console power, which made the Xbox One the weakest relative console of all time.

We haven't even fully seen what the effect of this will be, because as of now many AAA games (like GTA V or MGS V) are ported all the way back to the last-gen consoles, which REALLY holds back every other platform. I have hope that maybe, just maybe, developers can find a way to put out top-shelf PC games without being held back by the consoles. When a game like Ryse can be exclusive to the weakest console, yet as a port look like one of the most visually appealing PC games ever, that should give us all hope. Console optimizations are not the answer; developers putting in the work to give us good PC ports is the answer.