Apparently not all that well.
http://blogs.barrons.com/techtrader...re-from-amd-in-gpu/?mod=yahoobarrons&ru=yahoo
McConnell writes that price cuts of a third by AMD are not helping that company’s sales:
In line with commentary from AMD on its Q3 earnings call that management planned to competitively reposition its graphics card lineup in Q4,
the company implemented 30% price cuts on its RADEON R9 290X and 290 graphics cards, with their comparable R9 290X card now priced $150 lower than NVIDIA's GeForce GTX 980. Despite the comparable price discount in addition to game bundles, our supply chain conversations indicate minimal share shift back to AMD. Channel partners indicate that NVIDIA's GTX 980 low-power offering (sub-200W board), which allows for high overclocking by gamers, as well as an ability to market the GTX 980's support of Microsoft's new DX12 API, are competitive advantages for NVIDIA despite premium pricing.
I worked in the financial consulting space for 5+ years, and I can tell you that whatever analyst/associate wrote this piece has made the most elementary college-level mistake in the analysis -- he/she has assumed the following:
1) The manufacturer/AIB price cuts transferred directly to OEMs and/or retailers, and
2) These price drops in retail and wholesale channels were universal across different markets worldwide relative to competing products.
Both of these claims are 100% false based on my assessment of the Canadian retail market alone.
XFX R9 290X $729 CDN
Sapphire Tri-X R9 290X $649 CDN
Gigabyte Windforce R9 290X $569 CDN
vs. Amazon pricing in the US
XFX R9 290X $349 USD
Sapphire Tri-X R9 290X $399 USD
Gigabyte Windforce R9 290X $380 USD
The USD:CAD exchange rate is 1:1.1381.
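To make the gap concrete, here's a quick back-of-the-envelope check (just a sketch -- the prices and the 1.1381 rate are the figures quoted above, nothing more):

```python
# Sanity check: what the US prices quoted above *should* translate to in CDN
# at the quoted 1.1381 USD:CAD rate, vs. the actual Canadian shelf prices.
FX = 1.1381  # USD:CAD rate cited in this post

# (card, US price in USD, actual Canadian retail in CDN)
cards = [
    ("XFX R9 290X",                349, 729),
    ("Sapphire Tri-X R9 290X",     399, 649),
    ("Gigabyte Windforce R9 290X", 380, 569),
]

for name, usd, cad_actual in cards:
    cad_expected = usd * FX          # straight FX conversion, no taxes/duties
    premium = cad_actual - cad_expected
    print(f"{name}: expected ~${cad_expected:.0f} CDN, "
          f"actual ${cad_actual} CDN (+${premium:.0f})")
```

A straight conversion puts these cards around $397-454 CDN, yet the actual shelf prices run $137-332 above that -- so the US-level price cuts clearly never reached this market.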
Therefore, unless the major price drops on R9 290/290X cards can be seen at most retailers/OEMs worldwide, the first part of Barron's statement -- "Despite the comparable price discount in addition to game bundles, our supply chain conversations indicate minimal share shift back to AMD" -- misrepresents the real-world retail market situation. My boss would fire me for writing an article with such poor research.
Now, with that said, it's AMD's fault for not working more closely with retailers to get these price drops to materialize in retail channels across the world. However, insinuating that AMD can't sell cards even at a $150+ discount because of the 970/980's power usage and DX12 support is simply BS when real-world market prices -- in Canada, for example -- do not reflect any price drops. If the conclusion rests on those premises for evidence, and the premises are untrue or flawed, then the conclusion is unsupported. That's logic 101. You cannot claim that AMD's price reduction hardly made a dent in market share when the price reduction specified in the premise has not actually occurred across most of the worldwide market.
Mercury Research estimated that NVIDIA captured 67% unit share in the $300+ desktop graphics card market in calendar Q2. While the high end of the desktop GPU market (also referred to as the gaming enthusiast segment) is approximately 10% of desktop discrete GPU market units, it garners 60% to 65% gross margin. We estimate the desktop discrete GPU segment comprises 30% of NVIDIA's overall sales.
^ This is not surprising either, given what I already said above about the lack of R9 290/290X price drops below $300 worldwide, barring the US. Most gamers shopping in the $300+ space are no longer focused on price/performance as the primary motivation for their purchase. Unlike the US, where an R9 290 can be purchased for $220-280 and an R9 290X for $300, worldwide prices are MUCH higher. If someone is weighing a $350-600 R9 290/290X, of course they will buy the $350-550 970/980 -- that's stating the obvious. But even looking at historical patterns, high-end GPU buyers in the $300+ space have tended to favour NV regardless of generation (this goes back to the 8800GTX days in Q4 2006).
Third, if one looks at purchase patterns, PC enthusiasts overwhelmingly purchase flagship NV products whenever AMD has no counter -- until AMD's alternative finally arrives.
1) Shockingly, during the HD4850/4870 era, AMD lost market share from a whopping 40.8% to 31% -- exactly the time when the HD4850/4870 were completely unbeatable on price/performance. AMD regained some share, back to 40.9%, only after the HD4890 was introduced.
2) Once again, during the Fermi era, contrary to what's repeated ad nauseam on our forum, AMD lost market share from 44.5% to 38.8% because of the GTX460/470/480. And contrary to the claim that the GTX570/580 recovered things for NV, it was the opposite: during the GTX570/580 era, it was actually AMD that regained share slightly, from 38.8% to 40%, with the HD6950/6970.
3) When Kepler's 690/680/670 trio became available, the 660/650Ti weren't even on the map yet. Yet it was during this time -- when the HD7970GHz was the fastest card, the 7970 sold for $380-400 and the 7950 for $325 -- that NV did the most damage with the 690/680/670, dropping AMD's market share from 40.3% to 33.3%. AMD only recovered after it launched the 290/290X.
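The pattern across those three points can be laid out in one place (a quick sketch -- the percentages are the figures quoted in the points above; the era labels are my own shorthand):

```python
# AMD discrete-GPU unit share swings, per the figures cited in this post.
# (era label, AMD share before, AMD share after)
eras = [
    ("HD4850/4870 launch window",   40.8, 31.0),
    ("HD4890 recovery",             31.0, 40.9),
    ("Fermi GTX460/470/480 window", 44.5, 38.8),
    ("HD6950/6970 vs GTX570/580",   38.8, 40.0),
    ("Kepler 690/680/670 window",   40.3, 33.3),
]

for era, before, after in eras:
    delta = after - before
    print(f"{era}: {before}% -> {after}% ({delta:+.1f} pts)")
```

The deltas make the thesis visible at a glance: AMD bleeds share (-9.8, -5.7, -7.0 points) at the start of each NV generation and claws it back (+9.9, +1.2) only once its counter launches.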
All of this tells us that high-end PC enthusiasts have been more likely to purchase $300+ NV GPUs going all the way back to 2006. NV does the most damage to AMD's market share at the start of a new generation, when it releases flagship cards like the 280/480/680/980 (the last two not being true flagships, of course). AMD then has to catch up over the course of that generation. Furthermore, gamers who spend $300+ on a GPU comprise just 10% of the entire dGPU segment. Of that 10%, how many want the fastest NV card? Probably a lot, in percentage terms. NV has had the fastest card for many months at a time with the 480/580/780/780Ti/Titan and now the 980. Therefore, most of that 10% is going to go to NV, and it isn't surprising to see them take nearly 70% of that space.
And it's what I've been saying for years now: even if AMD released a card with 90% of the 980's performance for half the price, people would still buy NV. That's why the 290X and 290 won't make a dent against a $550-600 980 -- a lot of PC gamers want the best, cutting-edge card for bragging rights and because they call themselves "true enthusiasts". For a lot of these gamers, buying a value AMD card is like an insult to their hobby and their self-esteem. I personally don't care about NV or AMD. I just try to spend the least amount of $ possible to get the most performance possible, upgrading only when PC games bring my setup to its knees and the upgrade becomes worthwhile. Right now my OC'd 7970s crush 99% of games at 1080P, so I couldn't care less about the 780Ti/980/390X until some awesome game like Witcher 3 owns my cards hard -- not drops-to-50-fps hard, but 20-25 fps on 7970s CF, totally unplayable at 1080P. :biggrin:
I can see how 10% of PC gamers would choose having the latest tech over value and price/performance, because they don't want to know they're buying "last gen's tech" or the 3rd- or 4th-best product, even though real-world playability will be more or less the same. That's why people hardly cared for discounted 7970s or after-market 480s or 680s despite those being "almost as good" as the 580/7970GE/770. Many PC gamers will spend $100-200 more for a 10-15% faster, newer card. I used to upgrade my GPUs very often and was part of that group that wanted the latest tech all the time. Now I just don't care, because modern PC games still look 95% as good on High vs. Ultra, while in the past, if you tried to play Unreal 2, Quake 3, Doom, Far Cry 1 or Crysis 1 at even Low/Medium settings on a $300 GPU, it was a horrible experience.
----
Even Dragon Age: Inquisition -- supposedly a next-gen RPG -- runs at 40 fps on a single $250 R9 290 with Ultra graphics at 2xMSAA. If you ever tried firing up Crysis 1 on a $300-400 8800GTS 320/640 in 2007 at 1080P with 2xMSAA maxed out, you'd probably get 8-10 fps. I was getting 20-22 fps on an 8800GTS with a combination of Medium/High settings and no AA at 1280x1024. Upgrading the GPU is now a lot more critical for 1440P/1600P/4K users. For 1080P, a pair of OC'd 670/680/7970 cards is barely sweating nearly 3 years later. Unfortunately, even moving up to a 1440P/4K monitor doesn't suddenly make games like AC Unity, The Evil Within or COD: AW look much better, since they still have last gen's shadows, lighting, low-resolution textures and low-polygon character models all over. A sad, sad state of affairs for modern PC gaming graphics upgrades.
We've hardly had a graphics revolution since Metro 2033, to be honest. Crysis 3 and Metro: Last Light were the last two games to raise the graphics bar. Even "next gen" games like Evolve look meh.
You know the developers have hardly pushed a game's graphics when a 3-year-old 680/280X is getting 55 fps at 1080P in the alpha with zero driver optimizations! Nowadays the focus is on 4K because graphics have completely stagnated -- but the irony is that at 4K, the pixel demands bring all GPUs to their knees.
We are basically stuck in an awkward period of PC gaming where you need multiple GPUs for 4K gaming, while graphics at 1080P hardly look better than games from 2010. It really seems like a transitional period between last gen's focus on the 360/PS3 and the first wave of games made specifically for PS4/XB1 now hitting PCs. I suppose 1440P/1600P is a good middle ground for now, and it's where the 970/980 are proving extremely popular with gamers. However, as far as next-gen and 4K performance go, the 970/980 offer no real breakthrough over the 780Ti/290X GPUs of a year ago.