AMD's GPU Q3 2012 market share - 14% declines across the board to NVIDIA


Siberian

Senior member
Jul 10, 2012
258
0
0
No. It's good that he backs his opinions with facts. It makes people think before responding. Well, mostly it does. ;)

To the subject, pretty amazing isn't it that the market goes this way. It's the difference between having a CEO who has a personal interest in the company, like JHH, and hired guns who are just trying to pad their severance package and couldn't care less about the company. It's not the consumer's fault they believe nVidia makes a better card than AMD. It's AMD and nVidia who are to blame/credit.

Another thing not being talked about here is the OEM side of things. Just because nVidia is gouging the public doesn't mean they are gouging the OEM market. They could be selling, for example, 680's to them for a price cheaper than 7970's at wholesale, allowing the OEMs to make more profit. We know, all else being equal, with a smaller chip and less RAM, that the 680/670 cost less to manufacture than a Tahiti-based card. If you were Dell or even a boutique manufacturer and could buy 680's for less than 7970's (just to pick some easy-to-understand numbers) and then sell the finished product for more, what would you do? How much would that affect market share?
Going with a 600 series card may also mean saving money on a higher-wattage PSU, less cooling, and probably fewer driver support headaches. All a big bonus for an OEM.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
So explain to me why:
  • AMD lost share in Q3
  • you never buy ANY AMD products, even when they are clearly superior
You waited for Fermi for how long? And what Fermi did you buy, a GTX470, right? By several metrics, one of the worst cards Nvidia has ever made. Be honest, consumers like you are the reason Nvidia gained share.

Don't get angry at me because AMD lost market share. I offered my reasons!
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
To be fair to him, he likes 3D Vision AND PhysX... and that's his choice, because AMD doesn't emphasize these niche features.
To be honest, he likes the above because it is under the Nvidia umbrella. That is his choice and I respect it, but it becomes impossible to have a meaningful discussion with someone who has a blind, unconditional brand preference.

I certainly don't understand the non-sequiturs; he asks a question, then never answers it.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Shows how much launch reviews matter. As I said earlier, almost all GTX660/660Ti reviews compared NV's better after-market designs to reference HD7870/HD7950 "Boost" edition. That was a huge marketing failure on AMD's part as well.

Indeed it is a huge failure. AMD needs better reference external exhaust designs.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
Indeed it is a huge failure. AMD needs better reference external exhaust designs.
They made some odd choices, yes. But I would love to know why this site continued to use an AMD card that never actually existed outside of review samples.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
To be honest, he likes the above because it is under the Nvidia umbrella. That is his choice and I respect it, but it becomes impossible to have a meaningful discussion with someone who has a blind, unconditional brand preference.

I certainly don't understand the non-sequiturs; he asks a question, then never answers it.

Not blind but aware.
 

NIGELG

Senior member
Nov 4, 2009
851
31
91
I'm tired of the GPU wars and I never pretended not to prefer ATi cards. But AMD is staggering in the ring now and a couple more blows could knock them down for good.

I'm still going to get me a 7970 card.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
like developer relations

AMD had better developer relations in 2012 than NV. By far the majority of new games that came out in 2012 ran faster on GCN hardware, and most games that have come out have been AMD Gaming Evolved titles, not NV TWIMTBP ones. So that can't be a reason. Aside from BF3 and WOW, NV didn't really have strong wins this round, and now BF3 is not a win for them but a loss.

improved gaming experiences from 3d vision

How many people who buy sub-$300 GPUs use 3D Vision? AMD's 3D also works fine.

PhysX

How many games in the last 2 years used PhysX? PhysX even works in BL2 on AMD-based systems. It doesn't run as well as it does on a dedicated NV card, but it can be unlocked. Instead, NV blocks it on purpose. The BL2 hack clearly showed that NV uses PhysX as a marketing selling point, since the game can be played by offloading PhysX to the CPU, or even with a slave NV card alongside an AMD primary.


TXAA

Worse image quality than FXAA/MLAA/MSAA. That's not a feature for anyone who has used TXAA. It looks terrible. Maybe in the future it will be a selling feature.

adaptive V-sync

Radeon Pro is a free app and it has the same feature, called Double VSync. NV users have no problems downloading apps like NV Inspector, so if they are that advanced, they should have no problems downloading Radeon Pro.


CUDA

What does CUDA do for me as a consumer? AMD cards accelerate video transcoding just as poorly as NV cards do. For video work, you want QuickSync or the CPU. For most distributed computing work, NV's cards are slower. For bitcoin mining, NV's cards are unusable.

BTW, FXAA works on AMD cards and MLAA often looks better (check Dishonored). Not sure how anyone can claim FXAA to be some unique feature of NV cards. As far as driver support goes, even outdated HD4000 series still get driver releases. Frankly, cards like 8800GTX and HD4870 do not need monthly driver releases since the performance is fully maxed out and anyone who is a serious gamer has upgraded from those cards a long time ago.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Indeed it is a huge failure. AMD needs better reference external exhaust designs.

Agreed. Every reference NV card I used was so much better than AMD's. The 8800GTS was uber quiet. AMD needs to redesign its blower fan, as NV's fan just sounds quieter even at higher RPMs. :thumbsup:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The point is, when you describe gaming experience, it seems no amount of price drops or performance will make you abandon team NV. That's the point. If everyone thinks like you, AMD will fail automatically unless they create some marketing feature, start shoving it into games, and lock out NV users. Gaming Evolved is now using the same dirty tactics NV used for years. The industry has become purely about who locks out features and throws more $ at developers. If AMD was a company 10x bigger, every game would have DirectCompute shaders and only MSAA performance and 3GB VRAM textures. Brilliant. It's become a game of marketing and $, that's it. The only reason you think it's fair is because NV has more $ to engage in such tactics and it's your preferred brand. Imagine if AMD was a huge company like Intel and locked out NV users from features just because it could pay off developers (yup, PhysX works on AMD cards in BL2). How would you feel about that?

I don't think it's gaming features at all. It's the way these features are marketed and the way nVidia marketing jumps on any weakness in their competitor. Look at the arguments we are seeing now. We are seeing nVidia marketing constantly remind us of AMD's high prices 10 months ago. We are seeing nVidia marketing talk about perf/$ from 5 months ago. We are seeing them twist a driver improvement into a failure in AMD's drivers. We are seeing them talk about gaming performance and dev relations when almost every release this year performs better on AMD cards. We've seen one PhysX game in 2012 and only two in 2011. Somehow, though, people still perceive nVidia as offering a better gaming experience worth paying a premium for. nVidia marketing saying they offer better image quality is just a bald-faced lie.

Now, instead of blaming nVidia for doing this, AMD needs to counter it. This leads to exactly what you are talking about: lockouts, brand-centric features, etc. What we end up with is actually a worse gaming experience, because we end up with games that perform below their potential on the "wrong brand" card. Let's face it, if people are going to reward them for it, where's the incentive for them not to do it?
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,436
146
IMO, AMD cards are currently winning, as far as being the more desirable product at the high end.

To each his own, but I would not be interested in getting a 680, not when I'm running at 1600p and the 680 can't be overclocked nearly as well as a 7950 or 7970.

With that said, I sure hope that competition continues, as competition is always preferable to a monopoly.
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
The graphics division was one of the only things that was going ok for AMD but looks like they got hit with a big blow. Wonder how Charlie will spin this one.

Btw, could the graphics division (ATi) jump out of the sinking ship that is known as AMD?

Someone will snap up ATI when AMD goes belly up. The IP is just too good to let go. AMD investors are hoping the GPU division continues to compete so they can possibly get some kind of payout when (not if) AMD goes under.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Ironically, AMD did take overall discrete market share away from nVidia in 2010 -- how was this possible?

Huge DX11 *and* 40nm lead. It wasn't until GTX 460 that NVDA had something to fight back with in terms of market share, because GTX480/470 were high-end cards that few people buy and thus not as good for gaining back big chunks of market share with. But by then a lot of people had bought ATI cards, blunting some of the impact of GTX 460.

Going with a 600 series card may also mean saving money on a higher-wattage PSU, less cooling, and probably fewer driver support headaches. All a big bonus for an OEM.

True, and let's not forget smaller size. AMD cards are still as huge as ever. The stock GTX 6xx cards are tiny by comparison, important for smaller OEM cases and HTPCs.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The graphics division was one of the only things that was going ok for AMD but looks like they got hit with a big blow. Wonder how Charlie will spin this one.

Btw, could the graphics division (ATi) jump out of the sinking ship that is known as AMD?

Whether or not ATI is a stand-alone company might not matter much. With the HD4000-6000 series they offered price/performance and that didn't work. With the HD7000 series, they now have price/performance, game bundles, overclocking and the single-GPU performance crown, and they are still losing. If you buy an AMD card, you can do whatever you want -- BIOS flash, overclock, push it as hard as hell -- and that's still not selling to enthusiasts, who seem to care more about saving 20-30W of power than about saving $ and getting free games.

Even when AMD had the market to itself for 6 months with the HD5000 series, it hardly hurt NV. Clearly, the issue is deeper. It's probably explained by brand value, features and drivers, as well as the negative association with the AMD brand as a whole. NV has PhysX, and the "bullet-proof drivers" myth still persists. Remember when the A64/X2 beat Intel's CPUs easily? Yet Intel still had the majority of market share. Sometimes consumers will simply buy a product because they feel safe buying a popular brand.

Think about it: AMD is losing this generation badly despite having a faster product line that costs less most of the time, and if you buy at retail, you can even get free games. So if you were the CEO of AMD, how would you propose to take market share away from NV, especially knowing GK110 will probably beat HD8000 easily? If AMD couldn't even take away market share this round, they are hopeless for 2013.

True, and let's not forget smaller size. AMD cards are still as huge as ever. The stock GTX 6xx cards are tiny by comparison, important for smaller OEM cases and HTPCs.

In Q2, AMD gained market share in the discrete space, and HD7000 series cards were just as huge back then. So what explains such a drastic market share erosion in Q3? The GTX650/650Ti/660/660Ti all launched.

Huge DX11 *and* 40nm lead. It wasn't until GTX 460 that NVDA had something to fight back with in terms of market share, because GTX480/470 were high-end cards that few people buy and thus not as good for gaining back big chunks of market share with. But by then a lot of people had bought ATI cards, blunting some of the impact of GTX 460.

Reverse the situation. If ATI/AMD had been 6-8 months late with the HD5000 series vs. Fermi and with a sub-$300 28nm GCN roll-out, they would have been crushed. NV does it, gets away with it, and actually gains market share (they gained it with Fermi too). The HD5850 was $269, and yet the $199-229 GTX460 that launched 8-9 months later sold better. So people waited 8-9 months to upgrade to an NV card. We saw the exact same scenario this time as people waited 6-8 months for low-end sub-$300 Keplers. On our forum, the guys who buy AMD cards don't wait 6-8 months to buy AMD if NV has a superior product available. In other words, if the GTX660Ti had launched at $349 in February, people wouldn't have waited 6 months to get an HD7950 for $299. That's the difference.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You can now count on 1 hand the number of wins GTX680 has over 7970 Ghz (Dragon Age II, Lost Planet 2, umm what else, WOW?)

Check concurrent players online at any given time; I'd be willing to wager that WoW beats every other game you have mentioned combined. nV absolutely owns the MMO market. That is the genre I play most, and there is a rather shocking level of disparity in how the community feels about the respective brands (General Chat in games frequently has people asking about hardware upgrades -- always Intel+nV these days). I don't think this is because people like nV that much; AMD spent years ignoring the MMO market and was plagued with massive driver issues, and that hurt their reputation far, far more than a couple of lost benchmarks ever could dream of. At this point, it seems like if someone asked which to buy at the same price, a 7970GE or a 650Ti, most people would probably say to get the 650Ti. AMD did that to themselves over years of neglect, in particular the massive driver issues they had in WoW. During the SWTOR launch AMD failed to have properly functioning drivers again; millions of people were present for that, and General Chat was filled with 'my game won't run properly', which was almost always quickly followed by 'get rid of your Radeon and it will'. AMD actually fixed their issues with that game rather quickly, but not before encouraging another large segment of gamers to stay away from their products.

It's great that all of the review sites like to stick to the shooter of the day that may end up selling a million units over its life. Including turnover, WoW is in the 30-million-player range.

Right now there are a lot of people that would take a new Matrox card over a Radeon, not because they know anything at all about Matrox, just because it isn't a Radeon. AMD did that to themselves.

When AMD was in the 50% territory they were almost certainly dominating the shooter crowd to an extreme degree. Even now, more than likely, AMD is doing very well in that segment; that segment simply isn't all that large.

The deeper issue is that people waited 6+ months for sub-$300 Keplers and ignored 28nm GCN products. Why did the consumers wait 6 months to upgrade just to have an NV card?

Only the most lunatic of fanboys purchase video cards just because they launch. People buy new video cards when they are planning on upgrading or when they are building/buying a new machine. I would also point out that AMD's market share when they had 28nm parts shipping and nV didn't was quite a bit higher than it is now.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
AMD had better developer relations in 2012 than NV.

That's debatable! AMD did a wonderful job, and I defended them while you called it bribes.



How many people who buy sub-$300 GPUs use 3D vision? AMD's 3D also works fine.

I did -- I had an 8800 GT ($229) and a GTX 260 ($229). And you can play older content, if one has an extensive library of games, movies, videos, and internet experiences!

AMD doesn't really have its own software -- you need a third party, and iZ3D is really no more, so really only DDD is left. AMD's HD3D native support is proprietary and only in a few titles.



How many games in the last 2 years used PhysX? PhysX even works in BL2 on AMD cards.

More than AMD! I have been consistent on this since 2006 when both ATI and nVidia were working with HavokFX.


Worse image quality than FXAA/MLAA/MSAA. That's not a feature for anyone who has used TXAA. It looks terrible. Maybe in the future it will be a selling feature.

It's about being pro-active -- trying to tackle temporal aliasing efficiently -- and that's great to see!


Radeon Pro is a free app and it has the same feature, called Double VSync. NV users have no problems downloading apps like NV Inspector, so if they are that advanced, they should have no problems downloading Radeon Pro.

And it was just added -- great to see.

What does CUDA do for me as a consumer?

I still enjoy Just Cause 2, and the dynamic water is wonderful in it -- a great sandbox title.


BTW, FXAA works on AMD cards and MLAA often looks better (check Dishonored). Not sure how anyone can claim FXAA to be some unique feature of NV cards. As far as driver support goes, even outdated HD4000 series still get driver releases. Frankly, cards like 8800GTX and HD4870 do not need monthly driver releases since the performance is fully maxed out and anyone who is a serious gamer has upgraded from those cards a long time ago.

That may indeed be the case on performance, but owners of older generations may welcome control-panel FXAA and adaptive V-sync -- nice to see some love for gamers with older hardware. Personally, I would like to see TXAA for Fermi, too.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Going with a 600 series card may also mean saving money on a higher-wattage PSU, less cooling, and probably fewer driver support headaches. All a big bonus for an OEM.

A 600 series? Give me a break. The 680 uses about 40W less power than the 7970GE. The standard 7970 and 680 are very close to the same power usage. The 7950 is very efficient and so is Pitcairn. If you overvolt Southern Islands, then you see the numbers diverge. Both brands will work fine in the same cases without additional cooling. I can't believe you would try and spin marketing this way when it was nVidia that had the "Fermi-approved" cases. Driver support is not an OEM expense for desktop GPUs. nVidia won the mobile market because of Optimus, and that was a clean kill for nVidia and a major fail on AMD's part.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Huge DX11 *and* 40nm lead. It wasn't until GTX 460 that NVDA had something to fight back with in terms of market share, because GTX480/470 were high-end cards that few people buy and thus not as good for gaining back big chunks of market share with. But by then a lot of people had bought ATI cards, blunting some of the impact of GTX 460.

Exactly. AMD's engineering placed them in a great position, and no matter how effective nVidia's marketing may or may not be, they lost ground and share -- so much so that AMD took discrete leadership from nVidia.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
nV absolutely owns the MMO market,

And they make a commitment to MMO players by having the ability to use multi-GPU in windowed mode, while AMD does not. Even Stereo3D works in windowed mode for MMO players with nVidia. Just more marketing.
 

omeds

Senior member
Dec 14, 2011
646
13
81
I've known Pauly for many years and he has owned and recommended GPUs from both camps, along with touting features from both camps. He is also an avid graphics enthusiast, of which 3D is an extension, and I know Pauly is a user of it.
Nvidia is simply light years ahead of AMD in this area, so it's no surprise he values Nvidia GPUs more at this point in time.

AMD's 3D also works fine.

It's not even close to Nv's solution.

The fact is, you can buy an Nvidia GPU which offers superior 3d support and quality, PhysX, AO, arguably better driver support (particularly in MGPU), and the ability to use more IQ enhancing features via 3rd party tools... or an AMD GPU which does not.

For many people, these things add value to a video card, that is to say, the value of a GPU is determined by more than fps figures alone.

At the moment, I would recommend the 7950 to average gamers because of its great value, but to enthusiasts I would recommend the 670/680, due to features and support.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The 680 uses about 40W less power than the 7970GE. The standard 7970 and 680 are very close to the same power usage.

That's debatable too. As I said on many occasions, aftermarket HD7970 Ghz cards barely use more power than GTX680:

[chart: power consumption comparison]


When the difference in wattage is 20-30W on a 350W system, it's not material. At 1600P, the HD7970 GHz is about 10% faster than the GTX680. The GTX480 was about 15% faster than the HD5870 but used a whopping 130W more to get there.

HD7970Ghz costs less than 680, 10% faster at 1600P, higher performance with MSAA, higher performance with mods, 3 free games, more overclocking, uses 30W more power.

vs.

GTX480 cost more than 5870, 15% faster at 1600P, no free games, used 100-130W more power.

nVidia won the mobile market because of Optimus, and that was a clean kill for nVidia and a major fail on AMD's part.

Yup. AMD's mobile GPU strategy is a disaster, but NV won 300+ mobile design wins before AMD even got off the ground. It has as much to do with AMD's lack of a proper Optimus competitor as with NV aggressively targeting mobile design wins.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,436
146
Hey everyone, thanks for keeping everything civil so far, for the most part. Just a friendly reminder: please continue to do so.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
the ability to use more IQ enhancing features via 3rd party tools... or an AMD GPU which does not.

NV only fixed their SSAA image quality late this year, after trailing AMD's SSAA for years, but NV users never talked about this.

Good thing European websites are not afraid to investigate it.
http://www.computerbase.de/artikel/grafikkarten/2012/nvidia-geforce-310.33-beta/2/

Not sure what you mean by IQ-enhancing features. Not only is SSAA superior in image quality on AMD cards, but AMD cards are also faster with SSAA in modern games, especially with mods: 20% faster with SSAA in modded Skyrim, and you get better SSAA quality than the GTX680.

What superior IQ are you talking about? PhysX? OK sure, there have been maybe 6 good games that use PhysX since 2006. You make a good point about PhysX, but how much extra would you pay for that? $100, $200?

A $285 after-market HD7950 with 3 free games + a 5-minute OC = a $450 after-market GTX680. Are you willing to pay $165 more for PhysX?

Maybe it's just me, but when people buy an i5-2500K/i5-3570K and overclock it to the moon, they are doing it to gain more free performance and value. How is a $450 GTX680 worth it over a $285 HD7950 when they perform similarly overclocked and you also get free games to boot? Aren't price/performance and overclocking what made the GTX460 and 8800GT so legendary? It doesn't seem like the HD7950 is getting the same respect.

An HD7950 @ 1100MHz won't really use more power than a GTX680 either.

Let's say you want to revisit an old game with mods; AMD cards perform exceptionally well there too.
[benchmark chart]


How can you talk about IQ when the GTX680 continues to perform so much worse in modded games, whether we are talking about older games like HL or newer games like Skyrim with ENB?
 