AMD focusing too much on the desktop?

Page 2 - AnandTech Forums
Aug 11, 2008
10,451
642
126
I am not sure if the OP is talking about GPUs or CPUs or both. However, I will address dGPUs. I think AMD is, inconceivably, making the same mistake in GPUs that they made in CPUs: ignoring power consumption. Right now AMD desktop GPUs are a great buy, but they only seem to want to focus on bringing out some 300 watt monster that will take the performance lead from nVidia. I know many on these forums applaud that, and deride power efficiency, but with the shift to mobile and even toward small form factors on the desktop, efficiency is ultimately where the market is moving. So I would not say it is exactly focusing too much on the desktop, but failing to focus on efficiency. And it doesn't help that they lost a good portion of the laptop dGPU market due to the poor implementation of Enduro (or whatever their switching method was called) compared to Optimus. And unless they make a major advance, the efficiency of Maxwell is pretty much going to lock them out of the laptop GPU market moving forward.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
According to online stats, there are more laptop than desktop sales every year and the ratio seems to be increasing in favor of laptops. So what does AMD have to offer for the laptops?

- High-end gaming laptops: dominated by Intel (CPU) and Nvidia
- High- to mid-end "multimedia" laptops, which are pretty much a poor man's gaming laptop: again dominated by Intel (CPU) and Nvidia
- Ultrabooks: all Intel
- Low-end, everyday-use, good-enough laptops: either Intel or an AMD APU

Every single student I know will go for a decent laptop if they are a gamer or an ultrabook if they want to carry it around. So no sales for AMD here.

Now we are getting the R9 300 series soon, but according to all the rumors it is going to be a huge beast with liquid cooling, and based on that, the 300W TDP is probably correct too. So it will be great for the desktop but, again, completely unusable in a laptop.

It seems that AMD is missing a huge chunk of the PC market. I really wonder what the ratio is between mobility and desktop sales for Nvidia, because that would really put things into perspective.

Thoughts?
http://wccftech.com/amd-fiji-xt-r9-390x-cooler-master-liquid/


Maybe back to the topic here -

I think AMD's relevance in GPUs hinges on the 3xx series. They need a part that matches the 750 Ti at <= 75W with no 6-pin, suitable for the masses of 300W OEM systems.

They also need sub-120W cards that can match up to the 960. This is for the masses of sub-500W systems.

Lastly, they need good mobile versions that can at least match the Nvidia 8xxM and 9xxM parts on both perf/watt and price/perf.

Having a massive-power-draw, massive-chip, unique water-cooled solution for the top-end model is nice for the <1% that use it, and a nice marketing bullet, but not really relevant. Nvidia can and will do the same type of thing, in a product that is equally irrelevant for 99.9% of people buying dGPUs.

If AMD continues down this path of selling bigger / less efficient / hotter / noisier parts to mid- and high-range gamers by just lowering their margins to appeal on perf/$, they are sunk. They did that for one cycle and lost the mobile market and a good chunk of the desktop market, while simultaneously losing their product margins. Now they need to either beat Maxwell at the same price or match it at a lower price with the R7/R9 3xx series; that's what it is to be the underdog.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Nvidia releases the overpriced Titans and people applaud. There is more to a halo product than just the 1% that buy it. Besides, AMD has very low-cost, high-performing, competitive chips. What they don't have is the marketing might of Intel or Nvidia.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I am not sure if the OP is talking about GPUs or CPUs or both. However, I will address dgpus. I think AMD is inconceviably making the same mistake in gpus that they made in cpus: Ignoring power consumption. Right now AMD desktop gpus are a great buy, but they only want to seem to focus on bringing out some 300 watt monster that will take the performance lead from nVidia. I know many on these forums applaud that, and deride power efficiency, but with the shift to mobile and even toward small form factors on the desktop, efficiency is ultimately where the market is moving. So I would not say it is exactly focusing too much on desktop, but failing to focus on efficiency. And it doesnt help that they lost a good portion of the laptop dgpu market due to poor implementation of enduro or whatever their switching method was called compared to optimus. And unless they make a major advance, the efficiency of Maxwell is pretty much going to lock them out of the laptop gpu market moving forward.

Agreed. Maybe AMD can re-focus its vision with the help of contact lenses: http://wtexas.com/content/15022044-contact-lens-developed-using-new-technology-could-help-amd
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That's right. AMD is hindered mostly by money. Nvidia has a much bigger R&D team, and if I remember correctly, Nvidia wrote as early as 2012 that they had received 20nm samples and were busy working on the next architecture when Kepler was announced. There was some spectacle about Nvidia being angry because TSMC couldn't get a 20nm HP process up and running for their GPUs, but since they had already nailed down the architecture part of Maxwell, they could probably make some small adjustments to get it out as fast as possible on 28nm HP. That's why Nvidia has had Maxwell out for a while now.
Since AMD has no new architecture ready yet due to their development cycle, they can't possibly match Nvidia in mobile, since efficiency is everything in a strongly thermally constrained environment like a notebook. That's why they are virtually nonexistent in mobile right now.

Then it's about allocating your available resources. AMD is deeply involved with APUs, CPUs and GPUs; Nvidia mostly just GPUs. So I think that also works against AMD and lengthens their development cycle for specific products. It's about finding balance, getting rid of dead weight (which is why they have started to back out of the high-end CPU race with Intel) and focusing on the products where they are successful.

If ATI existed today, I think we would see much fiercer competition between Nvidia and ATI, and we would have more powerful GPUs than we do today.

Everything you've said here, and stated as absolute fact, is really nothing but nVidia spin.

Virge was talking about AMD's conscious decision to go iGPU for mobile, not the size of their R&D budget.

From what has been said in the press, it's not that AMD doesn't have something ready; it's that the AIBs still have old chips to sell, and that is what's delaying the release of new products.

Really.

But I think Dirk Meyer and Rory Read did fairly good jobs there.

I'm of the opinion that Read did nothing right.

Tahiti was released clocked too low and allowed GK104 to outperform it. Then it took months for AMD to react. By then the market had decided that GK104 was the better chip, and Tahiti had to be sold for less than GK104. Tahiti was released with full DP (1/3 rate), something that nVidia made bank on with Titan to the tune of $1000 a card, but AMD couldn't leverage Tahiti's DP capability, or even its overall compute performance, to sell it for even as much as GK104.

For a long time Pitcairn was easily the most efficient chip out. Did we hear about that the way we've heard about Maxwell's efficiency?

Hawaii was a disaster because of the reference cooler.

The mining craze meant that AMD could sell every chip they could get their hands on for full price but somehow they still lost market share and didn't even make a profit.

By the time they reacted to the mining market it was too late, and now they've got a massive inventory they need to burn off.

What was it Rory did well, because I missed it? Everything that happened positive while he was there would have been put into place before he even got there. The consoles, Mantle, GCN, HBM, etc...
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I am not sure if the OP is talking about GPUs or CPUs or both. However, I will address dGPUs. I think AMD is, inconceivably, making the same mistake in GPUs that they made in CPUs: ignoring power consumption. Right now AMD desktop GPUs are a great buy, but they only seem to want to focus on bringing out some 300 watt monster that will take the performance lead from nVidia. I know many on these forums applaud that, and deride power efficiency, but with the shift to mobile and even toward small form factors on the desktop, efficiency is ultimately where the market is moving. So I would not say it is exactly focusing too much on the desktop, but failing to focus on efficiency. And it doesn't help that they lost a good portion of the laptop dGPU market due to the poor implementation of Enduro (or whatever their switching method was called) compared to Optimus. And unless they make a major advance, the efficiency of Maxwell is pretty much going to lock them out of the laptop GPU market moving forward.

AMD is planning on iGPU for mobile. It's obviously the future in mobile. They just need to execute now.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
There is no such thing as a gaming laptop. Jacking the settings up, even at 1080p, will murder FPS far more than any CPU limitation will; plus you are at the mercy of the manufacturer for driver updates; plus, spend $3K on one now and in 6 months something will be substantially faster for the same price. Waste of money and effort.
 

cen1

Member
Apr 25, 2013
157
4
81
I am not sure if the OP is talking about GPUs or CPUs or both. However, I will address dGPUs. I think AMD is, inconceivably, making the same mistake in GPUs that they made in CPUs: ignoring power consumption. Right now AMD desktop GPUs are a great buy, but they only seem to want to focus on bringing out some 300 watt monster that will take the performance lead from nVidia. I know many on these forums applaud that, and deride power efficiency, but with the shift to mobile and even toward small form factors on the desktop, efficiency is ultimately where the market is moving. So I would not say it is exactly focusing too much on the desktop, but failing to focus on efficiency. And it doesn't help that they lost a good portion of the laptop dGPU market due to the poor implementation of Enduro (or whatever their switching method was called) compared to Optimus. And unless they make a major advance, the efficiency of Maxwell is pretty much going to lock them out of the laptop GPU market moving forward.
It was mostly about GPUs but it's pretty much the same story on CPUs as you pointed out.

Agreed on all points.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
Everything you've said here, and stated as absolute fact, is really nothing but nVidia spin.

Virge was talking about AMD's conscious decision to go iGPU for mobile, not the size of their R&D budget.

From what has been said in the press, it's not that AMD doesn't have something ready; it's that the AIBs still have old chips to sell, and that is what's delaying the release of new products.



I'm of the opinion that Read did nothing right.

Tahiti was released clocked too low and allowed GK104 to outperform it. Then it took months for AMD to react. By then the market had decided that GK104 was the better chip, and Tahiti had to be sold for less than GK104. Tahiti was released with full DP (1/3 rate), something that nVidia made bank on with Titan to the tune of $1000 a card, but AMD couldn't leverage Tahiti's DP capability, or even its overall compute performance, to sell it for even as much as GK104.

For a long time Pitcairn was easily the most efficient chip out. Did we hear about that the way we've heard about Maxwell's efficiency?

Hawaii was a disaster because of the reference cooler.

The mining craze meant that AMD could sell every chip they could get their hands on for full price but somehow they still lost market share and didn't even make a profit.

By the time they reacted to the mining market it was too late, and now they've got a massive inventory they need to burn off.

What was it Rory did well, because I missed it? Everything that happened positive while he was there would have been put into place before he even got there. The consoles, Mantle, GCN, HBM, etc...
Good post. Quoting it for effect.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Mountains out of molehills based on the efficiency of AMD's old high end GPU vs. NV's latest mid range GPU.

Let's revisit this in the summer.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
I am not sure if the OP is talking about GPUs or CPUs or both. However, I will address dGPUs. I think AMD is, inconceivably, making the same mistake in GPUs that they made in CPUs: ignoring power consumption. Right now AMD desktop GPUs are a great buy, but they only seem to want to focus on bringing out some 300 watt monster that will take the performance lead from nVidia. I know many on these forums applaud that, and deride power efficiency, but with the shift to mobile and even toward small form factors on the desktop, efficiency is ultimately where the market is moving. So I would not say it is exactly focusing too much on the desktop, but failing to focus on efficiency. And it doesn't help that they lost a good portion of the laptop dGPU market due to the poor implementation of Enduro (or whatever their switching method was called) compared to Optimus. And unless they make a major advance, the efficiency of Maxwell is pretty much going to lock them out of the laptop GPU market moving forward.
Wrong. AMD is not releasing only one card, at 300W. There will certainly be others, just as there are already several cards on offer now, of which one draws about 300W, at least in the 2xx series. Nvidia also had a card or two last gen that were pretty damn close to the power draw of the 290X.

What you see right now is that Nvidia has launched their new mainstream Maxwells, which draw less power, but GM200 will not draw less than 200W like the 980 does; it will be worse. Hell, I'd bet that, depending on clocks, it will draw well over 250W, and closer to 300W, as it was with the higher-end Kepler cards. Why do I think so? Well, the bus is wider, there's more memory, fatter caches, 50% more shaders; none of that will be free, and it will of course increase power draw a fair bit.

What happens when GM200 comes in the guise of a Ti or some such, using more than 250W? Will there be posts here like the one you have made now? Would you make the same comments about Nvidia? So yes, to be polite, comparing the power consumption of current-gen Nvidia cards with the AMD cards from before is downright dishonest. Ahem, to be more brutal, it is akin to charging people for crossing a bridge.

By the way, you understand little of efficiency. The performance of the AMD card is reported to be 40% or more above the 290X, while consuming merely 10W more. How could performance jump like that on the same node, at about the same power, if not for an increase in efficiency, an improved architecture, etc.?
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Most companies that sell electronics see a revenue drop of 10-20% from Q4 to Q1. AMD won't be releasing any new products into the channel until Q2, so I expect a pretty bad Q1, followed by a slightly improved Q2, and a much better Q3/Q4 than last year.

All I can say is the numbers you showed mean very little to the point you were trying to make. AMD has a much bigger total addressable market than nVidia, therefore theoretically it would be much easier for AMD to double revenue than for nVidia to do so.

Eh, AMD addresses many of the same markets as Nvidia. The one they don't share (CPU) is a low-growth market dominated by Intel. The other (mobile) is a high-growth market that AMD doesn't compete in. So where is AMD going to magically double their revenue in a market dominated by Intel? Nvidia has more upside due to Tegra; a few more design wins and that is growth. AMD would have to fight tooth and nail to win a percentage point from Intel or Nvidia in the CPU or GPU markets, and destroy what little profitability they have to do it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Eh, AMD addresses many of the same markets as Nvidia. The one they don't share (CPU) is a low-growth market dominated by Intel. The other (mobile) is a high-growth market that AMD doesn't compete in. So where is AMD going to magically double their revenue in a market dominated by Intel? Nvidia has more upside due to Tegra; a few more design wins and that is growth. AMD would have to fight tooth and nail to win a percentage point from Intel or Nvidia in the CPU or GPU markets, and destroy what little profitability they have to do it.

I think you are forgetting/ignoring their semi-custom SoCs.
 
Aug 11, 2008
10,451
642
126
Wrong. AMD is not releasing only one card, at 300W. There will certainly be others, just as there are already several cards on offer now, of which one draws about 300W, at least in the 2xx series. Nvidia also had a card or two last gen that were pretty damn close to the power draw of the 290X.

What you see right now is that Nvidia has launched their new mainstream Maxwells, which draw less power, but GM200 will not draw less than 200W like the 980 does; it will be worse. Hell, I'd bet that, depending on clocks, it will draw well over 250W, and closer to 300W, as it was with the higher-end Kepler cards. Why do I think so? Well, the bus is wider, there's more memory, fatter caches, 50% more shaders; none of that will be free, and it will of course increase power draw a fair bit.

What happens when GM200 comes in the guise of a Ti or some such, using more than 250W? Will there be posts here like the one you have made now? Would you make the same comments about Nvidia? So yes, to be polite, comparing the power consumption of current-gen Nvidia cards with the AMD cards from before is downright dishonest. Ahem, to be more brutal, it is akin to charging people for crossing a bridge.

By the way, you understand little of efficiency. The performance of the AMD card is reported to be 40% or more above the 290X, while consuming merely 10W more. How could performance jump like that on the same node, at about the same power, if not for an increase in efficiency, an improved architecture, etc.?

Perhaps you are the one who needs to examine the definition of efficiency. It is performance per watt. If nVidia brings out a something-or-other Ti using more than 250 watts, that is OK; as long as the performance increase is commensurate with the increased power, efficiency is unchanged. OTOH, if they can bring out a Ti model with the same performance as AMD's supposedly water-cooled 300 watt monster, but with air cooling and a 250 watt or lower TDP, it will still be more efficient.

Obviously, if AMD comes out with a card 40 percent more powerful at close to the same power, that would be a step in the right direction. I personally have not heard that rumor, only the forum posters here who tend to bash efficiency and clamor for the 300 watt monster. Perhaps it will come, but Maxwell has been out for quite a while now, and all we have from AMD is still rumors.
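For what it's worth, the perf-per-watt math both sides are arguing can be sketched in a few lines. Every figure below is a rumored or hypothetical number tossed around in this thread (~290W for a 290X, a ~300W card said to be ~40% faster, a hypothetical 250W Ti matching it), not a measurement:

```python
# Back-of-the-envelope perf/watt comparison. All numbers are rumored
# or hypothetical figures from the thread, not measurements.

def perf_per_watt(relative_perf, watts):
    """Efficiency = performance divided by power draw."""
    return relative_perf / watts

# Rumored AMD flagship: ~40% faster than a 290X (~290W) at ~300W.
r290x   = perf_per_watt(1.00, 290)
rumored = perf_per_watt(1.40, 300)
print(f"rumored card vs 290X: {rumored / r290x:.2f}x perf/W")  # ~1.35x

# Hypothetical 250W Ti with the same performance as the 300W card.
hypothetical_ti = perf_per_watt(1.40, 250)
print(f"250W part vs 300W part: {hypothetical_ti / rumored:.2f}x perf/W")  # 1.20x
```

So, on the arithmetic alone, both posters can be right: a 40% performance gain at roughly the same power is a real perf/watt improvement, and a part matching that performance at 250W would still be about 20% more efficient.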
 
Aug 11, 2008
10,451
642
126
I think you are forgetting/ignoring their semi-custom SoCs.

True, but by the time the next console cycle rolls around, I think they could well go ARM. Granted, AMD will have an ARM chip, but I think they would be hard-pressed to compete with the likes of Samsung et al. if those companies were interested in doing the consoles.

If AMD were to lose the next round of console contracts, it could spell their death knell. Granted, they supposedly have some semi-custom wins in the works, but I doubt anything that will bring in the kind of revenue the consoles do.

Now, before anyone gets upset: I certainly hope the consoles don't go ARM, because I think it would be terrible for PC gaming, but it certainly seems possible.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
True, but by the time the next console cycle rolls around, I think they could well go ARM. Granted, AMD will have an ARM chip, but I think they would be hard-pressed to compete with the likes of Samsung et al. if those companies were interested in doing the consoles.

If AMD were to lose the next round of console contracts, it could spell their death knell. Granted, they supposedly have some semi-custom wins in the works, but I doubt anything that will bring in the kind of revenue the consoles do.

Now, before anyone gets upset: I certainly hope the consoles don't go ARM, because I think it would be terrible for PC gaming, but it certainly seems possible.

Keep in mind that AMD won the contract to supply the chips for Nintendo's next device (supposedly their next console), so they've at least got that. Hopefully they can secure deals for the next Sony/MS ones as well, but that's a long way off.
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Mountains out of molehills based on the efficiency of AMD's old high end GPU vs. NV's latest mid range GPU.

Let's revisit this in the summer.

This summer, Nvidia will have had a 9-month-old mid-range chip out beating the pants off AMD, a chip AMD is only now aiming to beat. Doesn't sound too good when you put it like that. When was the last time AMD had a performance lead with their mid-range chip over Nvidia's current best? And for 9 months? I don't think ever. Ouch.
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I think you are forgetting/ignoring their semi-custom SoCs.

Those are razor-thin-margin items if you are discussing consoles. Otherwise, how large a business does AMD generate in this sector? And how explosive is its potential growth?
 

caswow

Senior member
Sep 18, 2013
525
136
116
This summer, Nvidia will have had a 9-month-old mid-range chip out beating the pants off AMD, a chip AMD is only now aiming to beat. Doesn't sound too good when you put it like that. When was the last time AMD had a performance lead with their mid-range chip over Nvidia's current best? And for 9 months? I don't think ever. Ouch.

GK104 as well as Tahiti were both mid-range. Since the GHz Edition was faster, I think AMD had the lead until Titan hit the shelves. After all these generations, AMD is the later one for once... how about just waiting, like people did when Nvidia was late? Seems like it's a different game for AMD...
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Those are razor-thin-margin items if you are discussing consoles. Otherwise, how large a business does AMD generate in this sector? And how explosive is its potential growth?

TBH I can't remember what % of their business AMD plans for the semi-custom SoCs to be, but it was sizable.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
Perhaps you are the one who needs to examine the definition of efficiency. It is performance per watt. If nVidia brings out a something-or-other Ti using more than 250 watts, that is OK; as long as the performance increase is commensurate with the increased power, efficiency is unchanged. OTOH, if they can bring out a Ti model with the same performance as AMD's supposedly water-cooled 300 watt monster, but with air cooling and a 250 watt or lower TDP, it will still be more efficient.

Obviously, if AMD comes out with a card 40 percent more powerful at close to the same power, that would be a step in the right direction. I personally have not heard that rumor, only the forum posters here who tend to bash efficiency and clamor for the 300 watt monster. Perhaps it will come, but Maxwell has been out for quite a while now, and all we have from AMD is still rumors.
Must be real bad under the bridge where you are... What do you reckon 40% more performance at practically the same TDP is? It is improved performance per watt. Maybe I should have spelled it out for you. It is not the forum users here who came up with the figure of 40%, but rather some at Chiphell.

People are happy about a better cooling solution on a card that can make use of it, so what's wrong with that? Oh, it is AMD that's providing the better reference cooling, not Nvidia.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
AMD has made bad decisions, and Nvidia got SoC wins from companies who made bad decisions and could have ended up better with an AMD solution. For example, the Windows tablets (not the high end, and surely Windows RT) and the awful Nexus 9. Why did Microsoft and Google go for Nvidia? I have no clue, but those were horrible decisions/products.

AMD needs to stop trying to go after Intel and needs to show that they have products with a very good and specific purpose. And they have them, and have had them for a while, and yet they aren't in the market.

What would have been better and cheaper from AMD?

Everything except super-low-watt systems or super-high-end CPU systems. Pretty much the majority of the market.

And really, when will Intel decide to kill off NV GPU usage? No one thinks of that yet? What is stopping Intel from releasing an Iris Pro dGPU? Nothing.
The only reason Intel hasn't gone straight for the mid-to-high-end mobile GPUs is that they want to improve the most in perf/W. GT4 with 96 EUs will get really close to the GTX 965M.
NV is in a corner and is doing well, but to think that it will keep going like that is ignorance.

AMD is focusing on changing how things are made: Mantle, or the mass marketing of a low-level API; TrueAudio, or the mass marketing of 3D audio; the push for compute GPUs, or the push for more realistic (not just prettier) games; and HSA, or the push toward your whole hardware being used by the software/game. And it has taken a long time to show. But think about it: these are all major changes, being pushed to the masses. And there's FreeSync, but that was motivated by Nvidia.

AMD gets little to no credit at all. The "hardcore" should know better.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
AMD has made bad decisions, and Nvidia got SoC wins from companies who made bad decisions and could have ended up better with an AMD solution. For example, the Windows tablets (not the high end, and surely Windows RT) and the awful Nexus 9. Why did Microsoft and Google go for Nvidia? I have no clue, but those were horrible decisions/products.

AMD needs to stop trying to go after Intel and needs to show that they have products with a very good and specific purpose. And they have them, and have had them for a while, and yet they aren't in the market.

What would have been better and cheaper from AMD?

Everything except super-low-watt systems or super-high-end CPU systems. Pretty much the majority of the market.

And really, when will Intel decide to kill off NV GPU usage? No one thinks of that yet? What is stopping Intel from releasing an Iris Pro dGPU? Nothing.
The only reason Intel hasn't gone straight for the mid-to-high-end mobile GPUs is that they want to improve the most in perf/W. GT4 with 96 EUs will get really close to the GTX 965M.
NV is in a corner and is doing well, but to think that it will keep going like that is ignorance.

AMD is focusing on changing how things are made: Mantle, or the mass marketing of a low-level API; TrueAudio, or the mass marketing of 3D audio; the push for compute GPUs, or the push for more realistic (not just prettier) games; and HSA, or the push toward your whole hardware being used by the software/game. And it has taken a long time to show. But think about it: these are all major changes, being pushed to the masses. And there's FreeSync, but that was motivated by Nvidia.

AMD gets little to no credit at all. The "hardcore" should know better.

At the rate Intel is improving, they don't need to release a discrete card. They have Skylake coming out with >1 TFLOP of graphics performance, and the eDRAM will eventually give way to much-higher-bandwidth HBM. Unless someone can catch up to Intel in the foundry world, they'll just slowly eat more and more of the discrete graphics market.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
At the rate Intel is improving, they don't need to release a discrete card. They have Skylake coming out with >1 TFLOP of graphics performance, and the eDRAM will eventually give way to much-higher-bandwidth HBM. Unless someone can catch up to Intel in the foundry world, they'll just slowly eat more and more of the discrete graphics market.

Exactly. And Intel is only taking longer because ARM became a threat. We could have had 96 EUs with Broadwell, 20% more with Skylake, and 2x with Cannonlake. That would have put NV in a very bad position in 2016, but who knows, it could still happen.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
GK104 as well as Tahiti were both mid-range. Since the GHz Edition was faster, I think AMD had the lead until Titan hit the shelves. After all these generations, AMD is the later one for once... how about just waiting, like people did when Nvidia was late? Seems like it's a different game for AMD...

First of all, you are the very first person here arguing in defense of AMD to say Tahiti was "midrange." Indeed, if it was "midrange," then AMD was years and years late in getting their big die out, yet no one ever once said AMD was late with their big die. Tahiti was never considered midrange when it released. You're obviously moving the goalposts, nice. Anyway, the 7970 GHz and the 680 were essentially tied; any performance differences between the two were strictly bragging rights and only observable on graphs.

I say it again: there has never been a time in AMD's history when their mid-range next-gen die beat Nvidia's last-gen fastest die. NEVER. And by the time Fiji comes out, Nvidia will have had GM204 out for 9 months, selling in notebooks and desktops and beating everything AMD has up until (hopefully) Fiji. Fiji may be great, but the damage being done right now is very real and is going to be very hard to overcome. AMD's chips are more expensive to make and require more complex components, yet are selling for less and in lower volume. AMD doesn't have much money in the tank as it is.

Fiji needs to be amazing.
 
Last edited: