Discrete GPU market share numbers from JPR

Feb 19, 2009
10,457
10
76
The average consumer still thinks ATI/AMD makes terrible drivers, and that it's something to avoid.

The stigma is strong.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
The average consumer still thinks ATI/AMD makes terrible drivers, and that it's something to avoid.

The stigma is strong.
Yep. Granted, they're not as good (though far from awful), and they're generally only problematic with new game releases; then again, so are Nvidia's.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Ah..the "no true scotchman" fallacy.
Boring.

But to play the devils advocate and for shit&giggles...are you saying AMD has choosen to focus on 10% of the market and let NVIDIA cater to the other 90%?

Money has no I.Q.

Hillarious :D

The reason the average Joe buys NVIDIA, and I can tell you this from working in retail, is the same reason people buy Apple: the brand and its "premium" status. There are a few exceptions, but for the most part that's what it boils down to, even if that premium status is false. Most people who came into the store wanting NVIDIA and NVIDIA only did so because "my friend told me he had X problems with ATI five years ago and he told me they sucked," or "I've heard that ATI is lower quality," or "ATI sucks," or "NVIDIA is a more premium brand." Notice that they haven't even caught up to the fact that it's been AMD for around two years now. And yes, this happened more than ten times with customers.

If you're smart, you'll see that the only current NVIDIA cards worth buying for the price are the GTX 560 Ti and the GTX 570 (the latter only if you don't mind the cheap voltage regulators). But again, the fact that even so AMD doesn't sell more means the average consumer is clueless and buys, like I said, based on brands and what their misinformed "techy" friends tell them.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
I concur with LOL_Wut_Axel.

I brought a BFG NV 8400GS card, new in box, to give to a friend at a recent gathering, and someone else who was there thought it was some kind of high-end thing, just because it said "Nvidia" on it.

For the record, it was for someone that had been living with 6150 integrated graphics, and I wanted them to be able to watch HD streams on the internet.

It wasn't even intended for gaming, and the 8400GS is hardly a gaming card. (That doesn't stop people from putting two of them in SLI on some ancient computer and calling it a "gaming rig" for sale on Craigslist to stupid buyers. Meanwhile, my gaming rigs with HD 4850 cards garner not even an inquiry.)
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I can't trust a person who works retail; personal problem of mine. No matter what, they're always trying to sell you something.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The reason the average Joe buys NVIDIA, and I can tell you this from working in retail, is the same reason people buy Apple: the brand and its "premium" status. There are a few exceptions, but for the most part that's what it boils down to, even if that premium status is false. Most people who came into the store wanting NVIDIA and NVIDIA only did so because "my friend told me he had X problems with ATI five years ago and he told me they sucked," or "I've heard that ATI is lower quality," or "ATI sucks," or "NVIDIA is a more premium brand." Notice that they haven't even caught up to the fact that it's been AMD for around two years now. And yes, this happened more than ten times with customers.

If you're smart, you'll see that the only current NVIDIA cards worth buying for the price are the GTX 560 Ti and the GTX 570 (the latter only if you don't mind the cheap voltage regulators). But again, the fact that even so AMD doesn't sell more means the average consumer is clueless and buys, like I said, based on brands and what their misinformed "techy" friends tell them.

So that was a very long way to write "Yeah, AMD has chosen to ignore 90% of the market".

And your subjective views on "value" don't ring a bell with me...or with the majority of consumers.
That is not "intelligent"...that is niche catering...and very unwise business-wise.
 

mosox

Senior member
Oct 22, 2010
434
0
0
Apparently Perf/Watt matters only for some parts in your PC and not at all for others.

If the part is made by AMD, the Perf/Watt matters if it's bad and it doesn't matter if it's good. If the part is made by Intel/Nvidia, the Perf/Watt matters if it's good and it doesn't matter if it's bad.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
So that was a very long way to write "Yeah, AMD has chosen to ignore 90% of the market".

And your subjective views on "value" don't ring a bell with me...or with the majority of consumers.
That is not "intelligent"...that is niche catering...and very unwise business-wise.

Yeah, no. It's not AMD's fault that the vast majority of consumers are clueless and therefore buy based on brands.

And value isn't subjective, as much as you'd like it to be. You'll see some people spouting nonsense about how much better NVIDIA is because of things like much better drivers (they're not; most driver issues with both are PEBKAC), PhysX (which some customers like to mention, even though if you ask them what it is and what games use it they'll give you a blank stare, because 1) they're clueless and 2) only one or two game titles a year use it), and CUDA (many seem to think it automatically makes NVIDIA cards faster at gaming than AMD's, for some reason). You could make the argument for Eyefinity, but again, only a very small niche of people uses it, certainly not the average Joe. What most people want, to put it simply, is to play BF3, SW:TOR, WoW, SC2, Skyrim, and some other games. The other "fancy" features are used by very few.

When it comes to value, it boils down to this:

Radeon HD 7750 and HD 6770 > GeForce GTS 450. They're faster and consume a lot less power. (Speaking of power consumption, electricity here is $0.26/kWh, and even then most people buy NVIDIA; see the rough cost sketch after this list.)

Radeon HD 6790 > GeForce GTX 550 Ti. Again, faster and consumes less power.

Radeon HD 6850 > GeForce GTX 460 1GB 192-bit (the only GTX 460 model that's not EOL). Price is comparable, but the GTX 460 is much slower and consumes a lot more power.

Radeon HD 6870 > GeForce GTX 560. Same performance, but the 6870 consumes less power and costs $20 less.

Radeon HD 6950 = GeForce GTX 560 Ti. The Radeon is a tiny bit faster and $20-30 more expensive, but consumes less power. The 2GB version of the 6950 is good for Eyefinity.

Radeon HD 6970 < GeForce GTX 570. Same performance; the GTX 570 is $20-30 cheaper but consumes more power, and its cheap reference power circuitry should make heavy overclockers wary.

Radeon HD 7950 3GB > GeForce GTX 580. The HD 7950 is a tiny bit faster, the same price, and consumes a lot less power.

Radeon HD 7970 > ? No competition for that one for now.

The Radeon HD 6990 and GTX 590 I think are irrelevant, given how few people buy them and the fact that they're essentially EOL now. I guess you could say they're the same performance, but the HD 6990 is very loud. At the same time, the GTX 590 consumes more power and isn't as suited as the HD 6990 to multi-monitor gaming (less VRAM makes a difference at huge resolutions like 5760x1200).
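
Here's the rough cost sketch mentioned above. Only the $0.26/kWh rate comes from this post; the card prices, wattages, hours, and ownership period are illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope cost-of-ownership sketch for two same-performance cards.
# Only the $0.26/kWh rate is from the post; every other number here is an
# illustrative assumption, not a measured figure.

RATE_PER_KWH = 0.26   # $/kWh, as quoted above
HOURS_PER_DAY = 3.0   # assumed gaming time at full load
YEARS_OWNED = 2.0     # assumed ownership period

def total_cost(price_usd: float, load_watts: float) -> float:
    """Purchase price plus electricity burned at load over the ownership period."""
    hours = HOURS_PER_DAY * 365 * YEARS_OWNED
    kwh = load_watts * hours / 1000.0
    return price_usd + kwh * RATE_PER_KWH

# Hypothetical pair: card B is $20 cheaper but draws 40W more at load,
# roughly the kind of gap described in the list above.
card_a = total_cost(price_usd=160, load_watts=110)
card_b = total_cost(price_usd=140, load_watts=150)

print(f"card A: ${card_a:.2f}, card B: ${card_b:.2f}")
# card A: $222.63, card B: $225.41. The 40W gap is 87.6 kWh over two years,
# or about $23 at $0.26/kWh, which eats the whole $20 discount.
```

Under those assumptions, the nominally cheaper, hotter card ends up costing slightly more over two years, which is why the power numbers belong next to the prices above.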

So, as you can see, for your dollar you get more overall gaming performance with AMD than with NVIDIA, yet you see NVIDIA selling more. Why is that? See here again:

The reason the average Joe buys NVIDIA, and I can tell you this from working in retail, is the same reason people buy Apple: the brand and its "premium" status. There are a few exceptions, but for the most part that's what it boils down to, even if that premium status is false. Most people who came into the store wanting NVIDIA and NVIDIA only did so because "my friend told me he had X problems with ATI five years ago and he told me they sucked," or "I've heard that ATI is lower quality," or "ATI sucks," or "NVIDIA is a more premium brand." Notice that they haven't even caught up to the fact that it's been AMD for around two years now. And yes, this happened more than ten times with customers.

If you're smart, you'll see that the only current NVIDIA cards worth buying for the price are the GTX 560 Ti and the GTX 570 (the latter only if you don't mind the cheap voltage regulators). But again, the fact that even so AMD doesn't sell more means the average consumer is clueless and buys, like I said, based on brands and what their misinformed "techy" friends tell them.

Simple: most consumers are bad consumers. They buy based not on what will be the best for their money, but on things like marketing, brands, idiocy, and "what they heard".
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Yeah, no. It's not AMD's fault that the vast majority of consumers are clueless and therefore buy based on brands. *snip*

Actually, it is.
Even your wall of text doesn't alter that fact.
AMD sucks at marketing.
Nor does it alter the fact that NVIDIA outsells AMD.
Despite the hilarious "But, but, but...perf/watt!!!"

And I noticed the bias in your chart.
You talk about "Eyefinity" as a feature,
but say nothing about 3D, GPGPU, hardware physics...etc. A cherry-picked list if I ever saw one :thumbsdown:

But you keep convincing yourself that catering to you and the rest of the *cough*"intelligent"*cough* customer base at ~10% is the best plan for growth and longevity.

R&D is not free.
And the trend is there...no matter how much you wiggle.
AMD is losing ground in discrete GPUs...despite the whines to the contrary on this forum from a small, vocal crowd.

I'm done here...the facts are on the table, and no amount of smoke and mirrors is going to alter that.

Bye.
 

wirednuts

Diamond Member
Jan 26, 2007
7,121
4
0
I don't think they care; discrete GPUs are a dying market. Just look at the overall sales figures for the entire market: Dell PC sales were down 40% this year in the EU, and discrete sales sucked.

That is why both companies branched out; the writing is on the wall. Discrete sales will be irrelevant in a few years.

I totally agree. Once integrated video can play most games (and it already can), most people won't want a power-wasting video card. I know that when I build PCs these days, power draw is a huge factor.
 

Crap Daddy

Senior member
May 6, 2011
610
0
0
So, most of the customers are dumb, AMD's GPUs are better value in almost all aspects, and the company itself has no responsibility whatsoever for the fact that it consistently loses market share to the competitor. It's pure bad luck, fate or whatever.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Maybe the stupid consumer doesn't mind paying a few bucks for PhysX or CUDA.

Maybe he wants full-screen anti-aliasing in DX10/11.
Or how about just _working_ anti-aliasing?

Perhaps he doesn't want to be bothered with drivers, and prefers the simple idea of downsampling.

Maybe he doesn't appreciate aggressive driver optimization and the race for every single FPS; maybe he wants decent-looking AF and a flicker-free image.

Maybe he values a good forum, with decent support and the ability to leave feedback for the driver team.

Or how about just having all your games simply work with the least amount of hassle?
Including the very old ones, OpenGL titles, and the very new ones.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
CUDA? Seriously? :D I guess worthless marketing acronyms work!

I guess people want to pay for fewer driver troubles, clearly, as shown here:

https://www.google.com/#hl=en&outpu....,cf.osb&fp=9a76dcb319f2a4f&biw=1991&bih=1084

What do you say, ATI driver apologists? When is AMD going to fix their crap linked above? This has been going on since the Catalyst 275 WHQL driver, which was released over a year ago. Anyway, this argument is pointless: NVIDIA has always had a higher discrete share, and AMD has always had a higher overall share. Which is more valuable? Who knows, but NVIDIA is branching out with its Tegra devices because discrete sales will continue to go down; that's the unfortunate reality. PC sales were down for all vendors by 20-50% in Q4, and PC sales will not improve anytime soon (discrete sales follow this trend, obviously).
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Actually, it is.
Even your wall of text doesn't alter that fact.
AMD sucks at marketing.
Nor does it alter the fact that NVIDIA outsells AMD.
Despite the hilarious "But, but, but...perf/watt!!!"

And I noticed the bias in your chart.
You talk about "Eyefinity" as a feature,
but say nothing about 3D, GPGPU, hardware physics...etc. A cherry-picked list if I ever saw one :thumbsdown:

But you keep convincing yourself that catering to you and the rest of the *cough*"intelligent"*cough* customer base at ~10% is the best plan for growth and longevity.

R&D is not free.
And the trend is there...no matter how much you wiggle.
AMD is losing ground in discrete GPUs...despite the whines to the contrary on this forum from a small, vocal crowd.

I'm done here...the facts are on the table, and no amount of smoke and mirrors is going to alter that.

Bye.

What facts? All you did was spout the same nonsense I often hear from consumers.

If marketing is what makes NVIDIA sell more, then they're no different in that aspect from Apple. The difference being that while people believe both companies' marketing BS, most people can't buy Apple because their products are, for the most part, hugely overpriced.

And if you actually believe that a product selling more means it's automatically better, I introduce you to this logical fallacy: en.wikipedia.org/wiki/Argumentum_ad_populum
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Yeah, CUDA. Seriously.

I don't mind one bit that my university GROMACS code can be compiled and executed on my home machine.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
So, most of the customers are dumb, AMD's GPUs are better value in almost all aspects, and the company itself has no responsibility whatsoever for the fact that it consistently loses market share to the competitor. It's pure bad luck, fate or whatever.

Of course most consumers are dumb. The vast majority of people on this planet are dumb. If you're still not aware of this, I may have some bad news for you.
And I already laid out why NVIDIA sells more: it's mainly marketing and consumer ignorance.




Maybe the stupid consumer doesn't mind paying a few bucks for PhysX or CUDA.

Maybe he wants full-screen anti-aliasing in DX10/11.
Or how about just _working_ anti-aliasing?

Perhaps he doesn't want to be bothered with drivers, and prefers the simple idea of downsampling.

Maybe he doesn't appreciate aggressive driver optimization and the race for every single FPS; maybe he wants decent-looking AF and a flicker-free image.

Maybe he values a good forum, with decent support and the ability to leave feedback for the driver team.

Or how about just having all your games simply work with the least amount of hassle?
Including the very old ones, OpenGL titles, and the very new ones.

Um, what is that even supposed to mean? That the consumer will pay a few extra bucks for CUDA, which they don't even know what it is, or for PhysX, which they'll never use or don't have sufficient hardware to take advantage of?

Just to give you an idea of how dumb consumers are in general, I had dozens of people wanting to buy GT 210s and GT 430s to play games like Battlefield 3 and WoW at the best settings. Some were even dumb enough to say cards like the GT 520 were faster than a GTS 450 because, according to them, the number is higher.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Apparently Perf/Watt matters only for some parts in your PC and not at all for others.

If the part is made by AMD, the Perf/Watt matters if it's bad and it doesn't matter if it's good. If the part is made by Intel/Nvidia, the Perf/Watt matters if it's good and it doesn't matter if it's bad.

I'm guessing you're comparing CPUs and GPUs in your post, then?

HardOCP shows a 200W difference in load power consumption between an FX-8150 @ 4.6GHz and a 2500K @ 4.8GHz.


Bit-Tech shows about a 275W difference in load power consumption between an FX-8150 @ 4.8GHz and a 2500K @ 5.0GHz.
http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10

On top of that, the FX-8150 performs worse and costs more.

In contrast, there is about a 120W difference in load power consumption in games between the lowest-end acceptable gaming card (i.e., the HD 6850) and the highest (i.e., the HD 7970), with everything else in between.

Another review shows a 119W difference between the HD 6870 and the GTX 480 (the worst offender), with everything else in between.

However, if you actually compare cards with similar performance, such as the HD 6950 vs. the GTX 560 Ti or the HD 6970 vs. the GTX 570, there isn't more than a 40-50W difference between them.

Now go back to your statement and see why everyone cares so much more about AMD's CPU power consumption: Bulldozer uses 200-275W more at load and still can't beat a cheaper Intel CPU.

In addition, to tame BD @ 4.6-4.8GHz, you'd likely need a $50-80 cooler (Xbitlabs showed even the Havik 140 struggling to keep BD @ 4.6GHz under 85°C). So even more added cost, because a cheap $30 CM212+ won't do. D:

The difference in power consumption between a high-end AMD and Intel CPU is an order of magnitude larger than it is between any two NV vs. AMD mid-range video cards, or any two high-end video cards.
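
For a sense of scale, here is a sketch converting those wattage gaps into yearly dollars. The deltas are the figures quoted above; the 4 hours/day at full load is an assumed duty cycle, and the $0.26/kWh rate is borrowed from earlier in the thread:

```python
# Convert a load-power gap into dollars per year at an assumed duty cycle.
# The 50W, 120W, and 200-275W deltas are the figures quoted above; the
# 4 hours/day at full load and $0.26/kWh are assumptions for illustration.

RATE_PER_KWH = 0.26
LOAD_HOURS_PER_DAY = 4.0

def annual_cost(delta_watts: float) -> float:
    """Yearly electricity cost of drawing delta_watts more at load."""
    kwh_per_year = delta_watts * LOAD_HOURS_PER_DAY * 365 / 1000.0
    return kwh_per_year * RATE_PER_KWH

for delta in (50, 120, 200, 275):
    print(f"{delta:3d}W gap: ~${annual_cost(delta):.0f}/year")
# ~ 50W (similar-performance GPU pair):      ~$19/year
# ~120W (bottom-to-top of the GPU range):    ~$46/year
# ~200-275W (overclocked FX-8150 vs 2500K):  ~$76-104/year
```

Under those assumptions, the CPU gap costs several times per year what any similar-performance GPU pairing does.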
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
CUDA? Seriously? :D I guess worthless marketing acronyms work!

I guess people want to pay for fewer driver troubles, clearly, as shown here:

https://www.google.com/#hl=en&outpu....,cf.osb&fp=9a76dcb319f2a4f&biw=1991&bih=1084

What do you say, ATI driver apologists? When is AMD going to fix their crap linked above? This has been going on since the Catalyst 275 WHQL driver, which was released over a year ago. Anyway, this argument is pointless: NVIDIA has always had a higher discrete share, and AMD has always had a higher overall share. Which is more valuable? Who knows, but NVIDIA is branching out with its Tegra devices because discrete sales will continue to go down; that's the unfortunate reality. PC sales were down for all vendors by 20-50% in Q4, and PC sales will not improve anytime soon (discrete sales follow this trend, obviously).

I can easily nitpick a few driver issues for NVIDIA, too. For example, about three weeks ago my GTX 460 was having problems with the display driver not responding on the latest driver at the time. I downloaded a previous driver, did a clean install, and problem solved. Really, anyone that complains about driver issues with either of the two is nitpicking at most, since I can attest that most driver issues are PEBKAC.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
While I'm sure there are a number of factors at play, it's pretty logical to lay the "blame" on AMD's APUs (the very same thing that hurts their GPU profit numbers).
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Now go back to your statement and see why everyone cares so much more about AMD's CPU power consumption: Bulldozer uses 200-275W more at load and still can't beat a cheaper Intel CPU. In addition, to tame BD @ 4.6-4.8GHz, you'd likely need a $50-80 cooler; a $30 CM212+ won't do. So even more added cost. D:

The difference in power consumption between a high-end AMD and Intel CPU is an order of magnitude larger than it is between any two NV vs. AMD mid-range video cards, or any two high-end video cards.

Exactly why Bulldozer is not welcome in this household. Unfortunately, when everything in my house runs off electricity and I pay a bill with tiers that jump up by almost half a dollar (man, I hate my neck of the woods), this kind of crap starts to matter.

The difference from leaving my 2 PCs on (no hibernate, but low-power idle modes) from one month to the next can easily be $20-30 more. The water boiler is electric, so no more long showers for the GF, and we turn the heat down low in rooms we aren't using during the winter.

The savings have been amazing just from changing small things in our lifestyle.
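
A rough sketch of why tiered billing makes always-on PCs expensive. The tier sizes and rates below are invented for illustration (the actual utility's tiers aren't given here), and the ~40W idle draw per PC is an assumption:

```python
# Sketch of tiered residential billing: baseline household usage fills the
# cheap tiers, so every marginal kWh (like an always-on PC) lands in the top
# tier. Tier sizes/rates and the 40W idle figure are invented for illustration.

TIERS = [(300, 0.13), (300, 0.28), (float("inf"), 0.45)]  # (kWh in tier, $/kWh)

def bill(kwh: float) -> float:
    """Total monthly bill for kwh of usage under the tiered schedule."""
    total = 0.0
    for size, rate in TIERS:
        used = min(kwh, size)
        total += used * rate
        kwh -= used
        if kwh <= 0:
            break
    return total

baseline = 650.0                  # assumed household usage, kWh/month
pcs = 2 * 40 * 24 * 30 / 1000.0   # two PCs idling at ~40W each: ~58 kWh/month
print(f"marginal cost of the PCs: ${bill(baseline + pcs) - bill(baseline):.2f}/month")
# ~$26/month, because the household is already into the $0.45 top tier;
# in the same ballpark as the $20-30 swing described above.
```

At a flat bottom-tier rate the same ~58 kWh would cost well under $10, so it's the tier structure, not just the wattage, that drives the monthly swing.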
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Maybe the stupid consumer doesn't mind paying a few bucks for PhysX or CUDA.

Maybe he wants full-screen anti-aliasing in DX10/11.
Or how about just _working_ anti-aliasing?

Perhaps he doesn't want to be bothered with drivers, and prefers the simple idea of downsampling.

Maybe he doesn't appreciate aggressive driver optimization and the race for every single FPS; maybe he wants decent-looking AF and a flicker-free image.

Maybe he values a good forum, with decent support and the ability to leave feedback for the driver team.

Or how about just having all your games simply work with the least amount of hassle?
Including the very old ones, OpenGL titles, and the very new ones.

Sign me up.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Exactly why Bulldozer is not welcome in this household. Unfortunately, when everything in my house runs off electricity and I pay a bill with tiers that jump up by almost half a dollar (man, I hate my neck of the woods), this kind of crap starts to matter.

The difference from leaving my 2 PCs on (no hibernate, but low-power idle modes) from one month to the next can easily be $20-30 more. The water boiler is electric, so no more long showers for the GF, and we turn the heat down low in rooms we aren't using during the winter.

The savings have been amazing just from changing small things in our lifestyle.

Pretty sure Bulldozer uses less power than Sandy at idle.




I run my i5-2500K at 5 to 5.2GHz 24/7 with a 45 to 56 percent GPU overclock on both cards; power consumption is my #1 concern. :cool: