Discrete GPU marketshare numbers from JPR


railven

Diamond Member
Mar 25, 2010
6,604
561
126
Pretty sure bulldozer uses less power than sandy at idle.

You should tell that to AT, guess their benches are wrong:

[Image: 41714.png — AnandTech power consumption bench]
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Exactly why Bulldozer is not welcomed in this household.

Right, but my point is people care about Bulldozer's inefficiency much more because the difference is staggering. In contrast, even Anandtech shows only 100W difference in gaming among high-end videocards, from HD6950 to GTX580.

And if you actually compare cards with similar performance, such as GTX570 vs. HD6970, there isn't much between them. I don't think most gamers really care about 20-40W difference in GPU power consumption between 2 videocards with similar performance. It comes down to warranty, features, bundled games, performance in the games they play, etc. Getting Bulldozer is like running 2 2500Ks and getting worse performance for most things....
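For scale, the 20-40W gap RussianSensation mentions amounts to only a few dollars a year in electricity. A quick sketch (the 4 hours/day of gaming and the $0.12/kWh rate are illustrative assumptions, not figures from the thread):

```python
def annual_cost_delta(watts_delta, hours_per_day=4, rate_per_kwh=0.12):
    """Yearly electricity cost (USD) of an extra power draw under load."""
    kwh_per_year = watts_delta * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

for delta in (20, 40):
    # e.g. a 20 W gap is roughly $3.50/year at these assumed rates
    print(f"{delta} W extra ~ ${annual_cost_delta(delta):.2f}/year")
```

At those assumptions, even the 100W gap between an HD6950 and a GTX580 is well under $20 a year, which is why the warranty, features, and game bundles tend to dominate the decision.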
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yeah, I wonder why I was thinking that? Lower than Phenom II, but why did I think sandy? Bahhh Bahhhh Bahhhhhhh!!!!
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Right, but my point is people care about Bulldozer's inefficiency much more because the difference is staggering. In contrast, even Anandtech shows only 100W difference in gaming among high-end videocards, from HD6950 to GTX580.

And if you actually compare cards with similar performance, such as GTX570 vs. HD6970, there isn't much between them. That's why most users don't really care about 20-40W difference in GPU power consumption. It's a nice bonus, but not a deal breaker.

I'm sure people like me are the minority, but at the same time your argument can be applied to many features of a GPU/CPU.

We aren't Joe6Pack, so trying to use their logic fails on us, because I completely agree with what Wut said - as a builder, I've had people request a card like the GT 430 2GB because it was a higher number and had more RAM than a GTS 250.

Anyways, I don't argue features for cards because those are subjective. Just like I buy motherboards with bells & whistles and down the road I never use them. People put value into different things and if we don't agree, they're clearly wrong. :) haha.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
What facts? All you did was spout the same nonsense I often hear from consumers.

If marketing is what makes NVIDIA sell more, then they're no different in that aspect from Apple. The difference being that while people believe both companies' marketing BS, most people can't buy Apple because their products are, for the most part, hugely overpriced.

And if you actually believe that a product selling more means it's automatically better, I introduce you to this logical fallacy: en.wikipedia.org/wiki/Argumentum_ad_populum

Imho,

I believe nVidia's success stems from their pro-active nature. To me, they simply do more for their consumers, and if they do enough they may receive a modest premium for this work.

It's not PR tricks but tangible differentiation.

Consumers may believe the name brand does more because it does. AMD simply has to outwork nVidia to build their name brand more.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I'm sure people like me are the minority, but at the same time your argument can be applied to many features of a GPU/CPU.

We aren't Joe6Pack, so trying to use their logic fails on us, because I completely agree with what Wut said - as a builder, I've had people request a card like the GT 430 2GB because it was a higher number and had more RAM than a GTS 250.

Anyways, I don't argue features for cards because those are subjective. Just like I buy motherboards with bells & whistles and down the road I never use them. People put value into different things and if we don't agree, they're clearly wrong. :) haha.

I almost forgot about this one. I've had to explain to quite a few people how something like a GT 210 is too slow to take advantage of more than 512MB of VRAM. Then I also have to explain that the card will not, in fact, run SW:TOR at high settings even though it has "The CUDA". As of now, everyone who has come to me looking for a card has wanted it for gaming, as a replacement for a dead card, or as a basic one for normal stuff. The only exceptions are one or two people making new builds to run things like Adobe Premiere, and then I recommend NVIDIA because of CUDA.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Of course most consumers are dumb. The vast majority of the people on this planet are dumb. If you're still not aware of this, I may have some bad news for you.
And I already laid out to you why NVIDIA sells more, and it's mainly marketing and consumer ignorance.

Um, what is that even supposed to mean? That the consumer will pay a few extra bucks for the CUDA they don't know what it is, or a few extra bucks for the PhysX they'll never use or don't have sufficient hardware to take advantage of?

Just to give you an idea of just how dumb consumers are in general, I had dozens of people wanting to buy GT 210s and GT 430s for playing games like Battlefield 3 and WOW at the best settings. Some were even so dumb as to say cards like the GT 520 were faster than a GTS 450 because, according to them, the number is higher.

What is it with you and "intelligence" :confused:
It's hilarious, especially because the consumers vote with their wallets.

You are like a cult leader, claiming the rest of the world is dumb...because they don't view the world like you do (cherry-picking). o_O

There is nothing "intelligent" businesswise about catering to 10% and ignoring the other 90%...and letting the competition take their money :thumbsdown:
(As the numbers show)

Only a less "intelligent" person (you choose to label people that don't care about perf/watt, like me, as other than "intelligent"...live with being slammed for it!) would defend that "strategy"...and not from facts or business strategy, but from an emotional/fan viewpoint.

That doesn't make money, that doesn't pay salaries or feed the R&D machine.
It's the recipe for obscurity and insignificance.

Ask VIA and Cyrix how a focus on perf/watt paid off...or are you once again going to blame "dumb" consumers?

So dumb that marketing cannot reach them? :rolleyes:
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
What is it with you and "intelligence" :confused:
It's hilarious, especially because the consumers vote with their wallets.

You are like a cult leader, claiming the rest of the world is dumb...because they don't view the world like you do (cherry-picking). o_O

There is nothing "intelligent" businesswise about catering to 10% and ignoring the other 90%...and letting the competition take their money :thumbsdown:
(As the numbers show)

Only a less "intelligent" person (you choose to label people that don't care about perf/watt, like me, as other than "intelligent"...live with being slammed for it!) would defend that "strategy"...and not from facts or business strategy, but from an emotional/fan viewpoint.

That doesn't make money, that doesn't pay salaries or feed the R&D machine.
It's the recipe for obscurity and insignificance.

Ask VIA and Cyrix how a focus on perf/watt paid off...or are you once again going to blame "dumb" consumers?

So dumb that marketing cannot reach them? :rolleyes:

LOL, nice "counterargument", if you can even call it that. I'm done with you.

If you actually think most people are intelligent you haven't been out very much.

Until you have something that's not name calling or petty arguments that have no technical or factual background, don't bother.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
To get back more on-topic, I think the market for discrete GPUs in the $75-150 range that are CAPABLE will be the main growth area. The 7750 is a great example. PC makers can offer this card as an upgrade on most OEM systems without needing expensive upgrades in the PSU and case heat mitigation (more fans, better airflow, etc).

The days of buying a $75 GPU that is a waste of $$$ are over. I expect that if I am paying $100 for a discrete card, it should offer me at least 2-3x the performance of my integrated GPU (Llano/SB/etc).

For folks who built their PCs back in the early/mid 90's, it has come full-circle. I remember seeing terrible discrete GPUs available on ISA with 1 or 2MB of RAM that were actually worse than most onboard VGA solutions at the time. The same was true for even some of the PCI solutions that were close to $200. It was very confusing for consumers because they had no idea what to buy, and if it was truly any better.

Skip ahead 5 years or so when AGP was released and it was so much easier to know what you were getting. You could get a solid discrete GPU for <$200 that was 50x the speed of your onboard. It was a no-brainer.

In my opinion, the market is shifting to the following:

-integrated GPUs (Fusion/Intel HD xxxx)
-Cheap discrete/no additional power needed (7750-class; $75-150)
-Mainstream discrete (7850/7870/GTX 560; $150-300)
-Mid-range to top-end (7970/GTX 580) $300+

Of the overall GPU pie, integrated will continue to gain marketshare until the market gets saturated and a majority of all new PCs come with an iGPU; that should happen pretty soon (12-18 months). Pretty much only low-end AMD CPUs (Phenom I/II) do not have a GPU. Of the discrete group, the cheap $75-150 segment will probably have the most growth, assuming it stays well ahead of integrated GPU performance. Folks can plunk down $125 and get a GPU that will play most every game on a single 1080p monitor, albeit not with the best eye candy, but equivalent or better than most consoles. Volume on the $150+ GPUs will continue to grow somewhat, but will probably start to skew more to the highs and lows. As the sub-$150 segment gets more capable, it may eat away at the $200-250 range GPUs, but will likely never touch the $400+ SKUs.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
LOL, nice "counterargument", if you can even call it that. I'm done with you.

If you actually think most people are intelligent you haven't been out very much.

Until you have something that's not name calling or petty arguments that have no technical or factual background, don't bother.

OMG :D
Pot, kettle, black.

All your excuses don't alter the numbers.

You can then put your fingers in your ears and close your eyes..it won't help either.

The numbers don't lie.
Consumers choose NVIDIA more than they choose AMD.

But I guess you would throw a hissyfit if I pointed out the problem isn't the consumers...but AMD.

It's like the old saying:
"Don't blame the student...blame the teacher"

But I can see this chart is upsetting.
It has totally eroded the red team's favourite arguments (perf/watt, sweet spot)...they are left with less performance, less features...and less sales.
Even if the posters have been very vocal and declared a win in advance...seems like they sold the skin before they shot the "bear".

Makes you think who is the "intelligent" ones...right?
 

Martimus

Diamond Member
Apr 24, 2007
4,488
152
106
This is a graph of discrete video cards, so of course nVidia is gaining ground there. AMD has obsoleted their own low-end video cards with APUs, so this will continue to happen until nVidia either develops their own APU, develops an integrated GPU that obsoletes their low-end video cards, or leaves the low-end discrete video card business.

This graph really shows the difference in the two companies' strategies more than anything. What I would be more interested in is the percentage of cards over $200, since that would show how each is doing in the discrete video card segment where both are actually competing.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
This is not new really; nVidia has usually had a significant desktop discrete share since before the advent of APUs -- what explained that?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
CUDA? Seriously? :D I guess worthless marketing acronyms work!

Sure, because CUDA does hold some value to GeForce, with CUDA features for games and applications; it allows the GPU physics component for PhysX and helps with the ambient occlusion feature as well, from my understanding.
 

sojuhasu

Member
Feb 17, 2010
27
0
0
OMG :D
Pot, kettle, black.

All your excuses don't alter the numbers.

You can then put your fingers in your ears and close your eyes..it won't help either.

The numbers don't lie.
Consumers choose NVIDIA more than they choose AMD.

But I guess you would throw a hissyfit if I pointed out the problem isn't the consumers...but AMD.

Wow, that's... argumentum ad populum in its pure form. I am not sure what it is you are trying to say here. Yes, more consumers are buying NVIDIA at the moment. No, that does not automatically make it the better product. Otherwise McDonald's would also be the tastiest and most nutritious food ever.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

Actually, when it comes to shipping, consumers choose Intel when it comes to GPUs, and then AMD -- with nVidia a distant third if one goes by overall share.

So what does nVidia do to combat the stranglehold and vice of x86? It's ironic in a way, with all the clamoring about proprietary and open standards, and yet Intel uses that proprietary x86 like a club against the competition.

It's going to be interesting to see what nVidia may do.