[ PcPer ] AMD Radeon R9 290X Now Selling at $299


Abwx

Lifer
Apr 2, 2011
10,953
3,474
136
Sorry, but I don't think you get it. Nvidia claimed low power consumption as their main marketing strategy. After a couple of weeks, we find out it's not true.

For one, those cards are 185W officially, but if they want the real performance and smoothness they'll have to crank up the watts. It seems that no one paid attention to the fact that the higher the framerates and GPU utilisation, the lower the perf/watt improvement, down to the level of the 780/780 Ti when max throughput is reached.

Overall, when looking at the numbers across different resolutions, the 290 is still, one year after release, the card with the best price/perf ratio, and a handful of watts won't change this fact.
 
Last edited:

Sulaco

Diamond Member
Mar 28, 2003
3,860
44
91
SOURCE




PcPer found an XFX Double D at $299 and they say that AMD dropped the price.

It is probably XFX that decided to drop the prices of its cards. Anyway, nobody wants an XFX, especially the Double D edition.


As someone who's in the market for a card in this range, can someone fill me in on why "no one wants an XFX, especially the DD edition" ?!?

The $299 R9 290x looked pretty tempting, but am I missing something about XFX?
 

Gtokie

Member
Sep 15, 2014
58
0
16
As someone who doesn't OC and just plays games, I actually bought one of these cards and was very pleased with it for $303 AR.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
As someone who's in the market for a card in this range, can someone fill me in on why "no one wants an XFX, especially the DD edition" ?!?

The $299 R9 290x looked pretty tempting, but am I missing something about XFX?

It's actually a decent card. I'm not sure why he said that, but if I were to guess, it's based on the 7970 DD, which actually did skimp on critical parts.

The XFX DD 290/290X models are high up on my list of good/decent cards.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
As someone who's in the market for a card in this range, can someone fill me in on why "no one wants an XFX, especially the DD edition" ?!?

The $299 R9 290x looked pretty tempting, but am I missing something about XFX?


The XFX DD cooler used on the HD 7970 cards was not good in temps or noise levels, but XFX redesigned the DD cooler used on the R9 290 series and 280 series cards. The cooling performance is much better, and the reviews reflect the improved temps and noise levels.

http://techreport.com/review/26092/custom-cooled-radeon-r9-290x-cards-from-asus-and-xfx-reviewed/3

"Welp. Both of the cards are nearly whisper-quiet under load, registering fewer decibels on our meter than the reference GeForce GTX 780 Ti—and pretty much embarrassing the reference-cooled HIS 290X. All the while, the custom-cooled 290X cards are able to keep GPU temperatures relatively low, as well. This is a major improvement. "

http://www.overclockersclub.com/reviews/xfx_r9_290x_dd/16.htm

"Overclocking shows that when the fan speed is maxed out, the reference cooler is actually the better performer based purely on the thermals. That's a good thing, but when you add the noise generated to the equation, while the XFX R9 290X is 10 °C warmer, it's in another league altogether. By comparison, the Double Dissipation equipped card is quiet. Even when the fans are maxed out, the XFX R9 290X is, by definition, a quiet card."

http://www.hardocp.com/article/2014...tion_edition_crossfire_review/10#.VE4pB3utHhA

"The XFX Double Dissipation fans are the quietest fans we’ve ever encountered at high RPM speeds. These fans are definitely quieter than the ASUS DirectCU II fans when the RPMs and fan speed are increased. We experienced the XFX fans running at 93% fan speed, but, we did not hear them at that level. Whereas, 93% fan speed on the ASUS DirectCU II video cards is extremely loud. On the ASUS cards we wouldn’t go over 75% fan speed because the noise is too loud. However, on the XFX Double Dissipation Edition video cards the fans were literally quiet, even at near maximum fan speed. While playing games, we never noticed fan noise at all. There was no ramping up of fan speed heard while starting to game and leaving the system on all day. "

XFX, though, might have set the fan speed a bit too conservatively, and a few reviewers found that the fan speed needed to be increased to avoid thermal throttling due to VRM temps.

http://www.legitreviews.com/xfx-radeon-r9-290-double-dissipation-video-card-review_138612/12

"It is clear from our testing that XFX too conservatively set the fans on these cards and while the Double Dissipation cooler is super quiet, it lacks the airflow needed to keep VRM1 cool in low airflow chassis and for those looking to run these on an open air system"
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
No, I'm pretty sure what I said was right. Some non-reference models run at 250W, and also, the testing methodologies may not be that good at some sites, so it's possible the big difference doesn't show up sometimes. This is a problem for many Tonga and Maxwell models, where one of the selling points is efficiency.

If you care about electricity so much, have you replaced your lightbulbs with LEDs? Have you updated your furnace or refrigerator in the last 5 years?

Reducing power consumption on a graphics card that remains idle or off for most of its functional lifetime is completely 100% ludicrous if you actually care about real power use reduction. It does make for a great strawman argument though...
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
It is amazing how important measuring power has become on these forums. I'm going to head over to the Ferrari forums and see how many people are bitching about fuel consumption.

Is this the HTPC section?


^ This 100x over. My god. Who cares??? These are gaming PCs. I probably save more power by diligently turning off lights when I leave a room than these people do by buying the slightly more efficient graphics card they use only occasionally. If I cared more, I'd love to evaluate how many of the same posters complaining about power usage had a different story when Fermi came out...
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
I know I've had it almost a year, and it's been great, but it still stings seeing this after paying $430 for a reference 290.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I know I've had it almost a year, and it's been great, but it still stings seeing this after paying $430 for a reference 290.

That's just a common phenomenon with high-end equipment. At least you didn't buy a 290X then, which went from $550 to $300 in a year, or the Titan, which was $1000 and was matched/beaten by the $400 290 in nearly the same timeframe.

You can't count on your card staying a good value, and it's always better value to go a step or two down from the top tier.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
That's just a common phenomenon with high-end equipment. At least you didn't buy a 290X then, which went from $550 to $300 in a year, or the Titan, which was $1000 and was matched/beaten by the $400 290 in nearly the same timeframe.

You can't count on your card staying a good value, and it's always better value to go a step or two down from the top tier.
It is pretty crazy that 10% more performance is worth a 60-70% price premium ($350 970 vs. $550 980). In the case of the Titan, that was pure craziness. Do people really spend $1000 on a GPU just for the right to brag? :) For a few months? :)
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
It is pretty crazy that 10% more performance is worth a 60-70% price premium ($350 970 vs. $550 980). In the case of the Titan, that was pure craziness. Do people really spend $1000 on a GPU just for the right to brag? :) For a few months? :)

Yes they do...several people on this forum alone bought Titans.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It is pretty crazy that 10% more performance is worth a 60-70% price premium ($350 970 vs. $550 980). In the case of the Titan, that was pure craziness. Do people really spend $1000 on a GPU just for the right to brag? :) For a few months? :)

Not only do they spend it for those reasons, they try to justify their purchase too, when now you can buy triple 970s or even quad 290s for nearly the same price. Not many people want to admit they wasted money on a short-term emotional feel-good factor rather than making purchases rationally. A friend of mine bought an iPhone 3GS and didn't upgrade until the 5S. In that time his phone became too slow, too small, with crappy battery life. Now he is going to keep his 5S for another 5 years, I bet. Generally speaking, GPUs become outdated too fast to overspend 50-70% for 15-18% more performance. It is not worth it for actual gameplay. If you resell and upgrade more quickly, like you did, it saves a lot more money in the long run and helps to keep your system pretty fast at each upgrade stage.

People who buy a 980 at launch will just lose more money on resale. If someone is OK reselling a 980 at $275 in 2 years, then the $550 price is easily justified. I mean, people spend $400-500 on a contract for the new iPhone 6S+. In that context the cost of ownership for a 980 is less than what it costs to own a smartphone over 2 years. Having said that, we use our phones for work and communication 12-14 hours a day, so it's probably unfair to compare the marginal utility of that device to a GPU.
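To make the arithmetic behind that resale example explicit, here is a minimal sketch using only the figures quoted above (a $550 launch price and an assumed $275 resale after 2 years); the per-month number is simply that difference spread over 24 months:

```python
# Illustrative sketch only, using the numbers from the post above.
purchase_price = 550   # GTX 980 launch price, USD
resale_value = 275     # assumed resale value after 2 years, USD
months_owned = 24

net_cost = purchase_price - resale_value    # total depreciation: $275
cost_per_month = net_cost / months_owned    # roughly $11.46 per month

print(f"Net cost of ownership: ${net_cost} over {months_owned} months "
      f"(~${cost_per_month:.2f}/month)")
```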

It's when you have to buy several things at once that you start thinking more critically about what you spend on a GPU. For example, I can get an $85 3TB drive for media/games, $200 MX100 512 for my laptop and a new $350 970 for around the price of an after-market 980. This way I solve 3 problems instead of just graphics with a 980.
 

amenx

Diamond Member
Dec 17, 2004
3,910
2,133
136
^ This 100x over. My god. Who cares??? These are gaming PCs. I probably save more power by diligently turning off lights when I leave a room than these people do by buying the slightly more efficient graphics card they use only occasionally. If I cared more, I'd love to evaluate how many of the same posters complaining about power usage had a different story when Fermi came out...
This is a red herring argument used by many. It's not the economic aspect of higher or lower power draw, but the heat and noise that come with the territory, not to mention performance headroom and throttling. I had a Fermi (470) and was not happy with it because of this. Of course the issue is moot when you have 3rd-party coolers... which makes it boggle the mind how AMD did not allow their board partners to offer alternatives to the reference design for a few months. Fermi ref = Hawaii ref. After my Fermi experience I vowed never to buy a reference card again, even if temps/noise were not a problem, simply because 3rd-party coolers improve on that further for not much more money.
 

ChuckFx

Member
Nov 12, 2013
162
0
76
And meanwhile in Canada prices are still pathetically high; R9 290s are selling for the same as or more than R9 290Xs, in the $450 CAD range. I wish we could see the same prices, or close to them. Same thing on the NV side: the GTX 980 is $620...
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
It's actually a decent card. I'm not sure why he said that, but if I were to guess, it's based on the 7970 DD, which actually did skimp on critical parts.

The XFX DD 290/290X models are high up on my list of good/decent cards.

You are right. I owned two XFX HD 7970 reference cards and they were awesome; on the other hand, the HD 7970 Double D edition was total crap and left a really bad taste in my mouth. I can still taste it.

But I never said the new Double D was bad, just that nobody wants it. (Nobody = me ) ^_^

And meanwhile in Canada prices are still pathetically high; R9 290s are selling for the same as or more than R9 290Xs, in the $450 CAD range. I wish we could see the same prices, or close to them. Same thing on the NV side: the GTX 980 is $620...

Yeah, that sucks. I also checked the price of the cards yesterday on www.shopbot.ca. Man, that is pathetic pricing. :|
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This is a red herring argument used by many. It's not the economic aspect of higher or lower power draw, but the heat and noise that come with the territory, not to mention performance headroom and throttling. I had a Fermi (470) and was not happy with it because of this. Of course the issue is moot when you have 3rd-party coolers...

Then why did certain gamers constantly keep saying for years that the 7970 GHz and R9 290/290X ran hot and loud, while ignoring the 3rd-party cooler models?

[Charts: fan noise under load and GPU temperature, TechPowerUp]

http://www.techpowerup.com/reviews/Powercolor/R9_290X_PCS_Plus/27.html

78°C max, overclocked, on the quiet setting.

There is a big difference between various R9 290 after-market coolers.

[Chart: temperature comparison of R9 290 after-market coolers]
 

Makaveli

Diamond Member
Feb 8, 2002
4,722
1,058
136
And meanwhile in Canada prices are still pathetically high; R9 290s are selling for the same as or more than R9 290Xs, in the $450 CAD range. I wish we could see the same prices, or close to them. Same thing on the NV side: the GTX 980 is $620...

Totally agreed; retail prices for the R9 290 cards in Canada are still high.

Canada Computers is still selling the 290X for $599 and 290s for $450, brand new of course.

Meanwhile I can go to Kijiji and find people getting rid of these cards in the $300-$400 range.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
apparently not all that well

http://blogs.barrons.com/techtrader...re-from-amd-in-gpu/?mod=yahoobarrons&ru=yahoo


McConnell writes that price cuts of a third by AMD are not helping that company’s sales:

In line with commentary from AMD on its Q3 earnings call that management planned to competitively reposition its graphics card lineup in Q4, the company implemented 30% price cuts on its RADEON R9 290X and 290 graphics cards, with their comparable R9 290X card now priced $150 lower than NVIDIA’s GeForce GTX 980. Despite the comparable price discount in addition to game bundles, our supply chain conversations indicate minimal share shift back to AMD. Channel partners indicate that NVIDIA’s GTX 980 low-power offering (sub-200W board), which allows for high overclocking by gamers, as well as an ability to market the GTX 980’s support of Microsoft new DX12 API, are competitive advantages for NVIDIA despite premium pricing.

McConnell’s conversations with those in the supply chain world suggest the company is already taking share from AMD, he writes:

Despite sluggish overall demand for standalone graphics cards, supply-chain conversations indicate strong demand for NVIDIA’s two recently launched high-end desktop GPU cards (GeForce GTX 980/970) since their launch in September, with channel partners indicating material share gains for NVIDIA at the expense of AMD’s legacy RADEON R9 290/280 GPUs, which were launched in fall 2013. In terms of quantifying high-end share gains and magnitude, our supply-chain conversations indicate that NVIDIA’s GeForce GTX 980/970 has comprised over 80% of high-end card shipments to channel partners since mid-September. Mercury Research estimated that NVIDIA captured 67% unit share in the $300+ desktop graphics card market in calendar Q2. While the high end of the desktop GPU market (also referred to as the gaming enthusiast segment) is approximately ~10% of desktop discrete GPU market units, it garners 60% to 65% gross margin. We estimate the desktop discrete GPU segment comprises 30% of NVIDIA overall sales.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
apparently not all that well

http://blogs.barrons.com/techtrader...re-from-amd-in-gpu/?mod=yahoobarrons&ru=yahoo


McConnell writes that price cuts of a third by AMD are not helping that company’s sales:

In line with commentary from AMD on its Q3 earnings call that management planned to competitively reposition its graphics card lineup in Q4, the company implemented 30% price cuts on its RADEON R9 290X and 290 graphics cards, with their comparable R9 290X card now priced $150 lower than NVIDIA’s GeForce GTX 980. Despite the comparable price discount in addition to game bundles, our supply chain conversations indicate minimal share shift back to AMD. Channel partners indicate that NVIDIA’s GTX 980 low-power offering (sub-200W board), which allows for high overclocking by gamers, as well as an ability to market the GTX 980’s support of Microsoft new DX12 API, are competitive advantages for NVIDIA despite premium pricing.

I worked in the financial consulting space for 5+ years and I can tell you that whatever analyst/associate wrote this piece has made the most elementary college-level mistake in his analysis -- he/she has assumed the following:

1) The manufacturer/AIB price cuts transferred directly to OEMs and/or retailers

2) These price drops in retail and wholesale channels were universal across different markets worldwide relative to competing products.

Both of these claims are 100% false based on just my assessment of the retail market in Canada.

XFX R9 290X $729 CDN
Sapphire Tri-X R9 290X $649
Gigabyte Windforce R9 290X $569

vs. Amazon pricing in the US

XFX R9 290X $349 USD
Sapphire Tri-X R9 290X $399 USD
Gigabyte Windforce R9 290X $380 USD

The US:CDN FX is 1:1.1381
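Purely as an illustration (not part of the original post), here is a minimal sketch that converts the US street prices listed above to CAD at the quoted 1.1381 rate and compares them with the Canadian retail prices, using no numbers other than the ones given:

```python
# Sketch using only the prices and FX rate quoted in the post above.
USD_TO_CAD = 1.1381

cards = {
    # card: (US price in USD, Canadian retail price in CAD)
    "XFX R9 290X":                (349, 729),
    "Sapphire Tri-X R9 290X":     (399, 649),
    "Gigabyte Windforce R9 290X": (380, 569),
}

for card, (usd, cad_retail) in cards.items():
    cad_equiv = usd * USD_TO_CAD
    premium = (cad_retail - cad_equiv) / cad_equiv * 100
    print(f"{card}: ${cad_equiv:.0f} CAD at FX vs ${cad_retail} CAD retail "
          f"({premium:+.0f}% above the converted US price)")
```

Even at that exchange rate, the Canadian retail prices quoted come out roughly 30-85% above the converted US prices, which is exactly the gap being objected to below.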

Therefore, unless the major price drops on R9 290/290X cards can be seen at most retailers/OEMs worldwide, the first part of Barron's statement: "Despite the comparable price discount in addition to game bundles, our supply chain conversations indicate minimal share shift back to AMD" is a misguided attempt to misrepresent the real-world retail market situation. My boss would fire me for writing an article with such poor research and misinformation.

Now, with that said, it's AMD's fault for not working more closely with retailers to get these price drops to materialize in retail channels across the world. However, insinuating that AMD can't even sell cards at a $150+ discount because of the 970/980's power usage and DX12 is simply BS when real-world market prices do not reflect any price drops in Canada, for example. Therefore, if the conclusion rests on the preceding premises for evidence and the premises are untrue/flawed, the conclusion is automatically unsupported. That's logic 101. You cannot claim that AMD's price reduction hardly had an effect on market share when the price reduction specified in the premise has not occurred across most of the market worldwide.

Mercury Research estimated that NVIDIA captured 67% unit share in the $300+ desktop graphics card market in calendar Q2. While the high end of the desktop GPU market (also referred to as the gaming enthusiast segment) is approximately ~10% of desktop discrete GPU market units, it garners 60% to 65% gross margin. We estimate the desktop discrete GPU segment comprises 30% of NVIDIA overall sales.

^ This is not surprising either, given what I already said above about the lack of R9 290/290X price drops worldwide below $300, barring the US. Most gamers who are shopping in the $300+ space are no longer focused only on price/performance as the primary motivation for their purchase. Unlike the US, where an R9 290 can be purchased for $220-280 and an R9 290X for $300, worldwide prices are MUCH higher. If someone is choosing between a $350-600 R9 290/290X and a $350-550 970/980, of course they will buy the 970/980. That's stating the obvious. But even if we look at the historical patterns of high-end GPU buyers in the $300+ space, they tend to favour NV regardless of generation (this goes back to the 8800 GTX days in Q4 2006).

Thirdly, if one looks at purchase patterns, PC enthusiasts overwhelmingly purchase flagship NV products whenever AMD doesn't have a counter.

1) Shockingly, during the HD4850/4870 era, AMD's market share fell from a whopping 40.8% to 31% -- exactly the time when the HD4850/4870 were completely unbeatable on price/performance. AMD regained market share, back up to 40.9%, only after the HD4890 was introduced.

2) Once again, during the Fermi era, contrary to what's repeated ad nauseam on our forum, AMD lost market share from 44.5% to 38.8% because of the GTX 460/470/480. And contrary to the claim that the GTX 570/580 recovered things for NV, it was the opposite: during the GTX 570/580 era it was actually AMD that regained market share slightly, from 38.8% to 40%, with the HD6950/6970.

3) When Kepler's 690/680/670 became available, the 660/650 Ti weren't even on the map yet. Yet it was during this time, when the HD7970 GHz was the fastest card, the 7970 sold for $380-400, and the 7950 for $325, that NV did the most damage with the 690/680/670, dropping AMD's market share from 40.3% to 33.3%. AMD only recovered after they launched the 290/290X.

All of this tells us that high-end PC enthusiasts have been more likely to purchase $300+ NV GPUs going all the way back to 2006. NV does the most damage to AMD's market share at the start of a new generation, when they release flagship cards like the 280/480/680/980 (the last two not being true flagships, of course). AMD then has to catch up over the course of that particular generation. Furthermore, gamers who spend $300+ on a GPU comprise just 10% of the entire dGPU segment. Of these 10%, how many want to buy the fastest NV card? Probably a lot in percentage terms. NV has had the fastest card for many months with the 480/580/780/780 Ti/Titan and now the 980. Therefore, most of that 10% is going to go to NV, and it isn't surprising to see them taking nearly 70% of that space.

And this is what I've been saying for years now: even if AMD released a card with 90% of the performance of the 980 for half the price, people would still buy NV. That's why the 290X and 290 won't make a dent against a $550-600 980: a lot of PC gamers want the best, cutting-edge card for bragging rights and because they call themselves "true enthusiasts". For a lot of these gamers, buying a value AMD card is like an insult to their hobby and their self-esteem. I personally don't care about NV or AMD. I just try to spend the least amount of money possible to get the most performance possible, at the times when PC games bring my setup to its knees and the upgrade becomes worthwhile. Right now my OC 7970s crush 99% of games at 1080P, so I couldn't care less about the 780 Ti/980/390X until some awesome game like The Witcher 3 owns my cards hard - and not like drops to 50 fps, but 20-25 fps on 7970s CF, totally unplayable at 1080P. :biggrin:

I can see how 10% of PC gamers would choose having the latest tech over value and price/performance, because they don't want to feel they are buying "last gen's tech" or the 3rd or 4th best product, even though playability in real-world games will be more or less the same. That's why people hardly cared for discounted 7970s or after-market 480s or 680s despite those being "almost as good" as the 580/7970GE/770. Many PC gamers will spend $100-200 more to have a 10-15% faster, newer card. I used to upgrade my GPUs very often and was part of that group that wanted to have the latest tech all the time. Now I just don't care, because modern PC games still look 95% as good on High vs. Ultra, while in the past, if you tried to play Unreal 2, Quake 3, Doom, Far Cry 1, or Crysis 1 at even Low/Medium settings on a $300 GPU, it was a horrible experience.

----

Even Dragon Age: Inquisition - supposedly a next-gen RPG - runs at 40 fps on a single $250 R9 290 with Ultra graphics at 2xMSAA. If you had tried firing up Crysis 1 on a $300-400 8800GTS 320/640 in 2007 at 1080P with 2xMSAA maxed out, you'd probably have gotten 8-10 fps. I was getting 20-22 fps on an 8800GTS with a combination of Medium/High and no AA at 1280x1024. Upgrading the GPU is now a lot more critical for 1440P/1600P/4K users. For 1080P, a pair of OC 670/680/7970 cards are barely sweating nearly 3 years later. Unfortunately, even moving up to a 1440P/4K monitor doesn't suddenly make games like AC Unity, The Evil Within, or COD: AW look way better, since they will still have last gen's shadows and lighting, low-resolution textures, and low-polygon character models all over. A sad, just sad state of affairs for modern PC gaming graphics upgrades.

We've hardly had a graphics revolution since Metro 2033, to be honest. Crysis 3 and Metro LL were the last 2 games to raise the graphics bar. Even "next gen" games like Evolve look meh. You know the developers have hardly pushed the game's graphics when 3-year-old 680/280X cards are getting 55 fps at 1080P in the Alpha with 0 driver optimizations! Nowadays the focus is on 4K because graphics have completely stagnated. But the irony here is that at 4K the pixel demands bring all GPUs to their knees.

[Chart: Evolve Alpha GPU benchmark at 3840x2160, gamegpu.ru]


We are basically stuck in an awkward period of PC gaming where you need multiple GPUs for 4K gaming while graphics at 1080P hardly look better than games from 2010. It really seems like a transitional period between last gen's focus on 360/PS3 and the first wave of games made specifically for PS4/XB1 hitting PCs. I suppose 1440P/1600P is a good middle ground for now, and it's where the 970/980 are proving to be extremely popular with these gamers. However, as far as next-gen performance and 4K performance go, the 970/980 have nothing in them for a real breakthrough over the 780 Ti/290X GPUs of one year ago.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
RS said:
2) Once again, during the Fermi era, contrary to what's repeated ad nauseam on our forum, AMD lost market share from 44.5% to 38.8% because of the GTX 460/470/480. And contrary to the claim that the GTX 570/580 recovered things for NV, it was the opposite: during the GTX 570/580 era it was actually AMD that regained market share slightly, from 38.8% to 40%, with the HD6950/6970.

AMD was the overall discrete leader in Q2 and Q3 of 2010. Architectures are engineered for both desktop and mobile discrete.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
AMD was the overall discrete leader in Q2 and Q3 of 2010. Architectures are engineered for both desktop and mobile discrete.

That's because NV lost by default in the mobile dGPU sector. Saying NV lost is a foregone conclusion when NV didn't show up with Fermi mobile parts in those 2 quarters. Yet open the link I provided above for desktop: NV owned AMD hard in both of those quarters with the 460/470/480. Contrary to what people say on our forum, AMD's discrete market share fell from ~50% starting with the 2900XT all the way to 30-40%, and has fluctuated there since the middle of 2006. The idea that AMD ever had more desktop discrete GPU market share in the last 8 years is a total fabrication. It is actually normal for AMD to have 30-40% desktop discrete market share. Where AMD was very competitive before was mobile discrete. Since Kepler got 300+ mobile design wins, AMD lost mobile dGPU market share and never seriously recovered it. We know that from May 2012 to now AMD has basically lost all major laptop design wins.

That means over time the overall market share should also settle at 30-35%, since AMD no longer has 50-65% mobile dGPU market share. This is why I keep saying over and over that AMD needs to stop focusing on desktop dGPUs above $350 and just spend those efforts on mobile. This way they will capture the majority of value customers and get a lot more mobile design wins. Winning the 6.7% back from NV in the 10% of the TOTAL desktop market related to the $300+ bracket is a serious waste of resources right now. They should focus on that segment in 5-10 years when they have enough cash flow. Right now mobile dGPU is growing much faster and AMD keeps fumbling.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
imho,

They launched in those quarters with Optimus. I have no idea where you're getting this idea that AMD led in desktop, but they did lead in overall discrete, which was wonderful to see. I personally desired to see around a 50/50 share, and my idealism did take place for a few quarters.

You may downplay efficiency on the desktop, but it is important for architectures spanning desktop (gaming/professional) and mobile, and may be the most important metric for gauging architectures overall.

Fermi may have been a robust architecture with raw potential, but its strategy and lack of efficiency cost nVidia share leadership. The bigger dies were so complex and getting harder to bring to market, while AMD was executing with their small dies and strong performance with efficiency. Lessons were learned.
 
Last edited: