ATI overtakes nVidia in discrete graphics marketshare


Scali

Banned
Dec 3, 2004
2,495
0
0
And you can call this a 'glitch' all you want, but I'd say it's far more than that. Nvidia was so very dominant in market share; how many times did we hear you shout about how Nvidia sells two parts for every single AMD part (before you disappeared while Nvidia had absolutely nothing to show; now they have one good part, so we're lucky enough to have you back)? From this two-to-one advantage they let their competitor outsell them. They had delay after delay, and when the parts launched they failed to impress. Where are their lower-end DX11 parts? This is one hell of a glitch.

If you go back further in history, you'll see that such 'glitches' occur, and markets can shift quickly.
The most obvious example is 3dfx. They completely dominated the 3D accelerator market. Then they had one 'glitch', and before you knew it, they were gone. From that point onwards, nVidia was the dominant force in the market.
But a few years later, nVidia had a 'glitch' with delivering DX9 hardware. Not only did ATi beat them to market by a few months, but ATi's chips were also considerably more powerful.
nVidia lost a lot of marketshare to ATi during the GeForce FX era. They slowly recovered with the 6000-series and then the 7000-series. Both were good products, arguably on par with ATi's offerings... but that's not enough.

nVidia didn't really return to dominance until the 8800 series. That's where ATi had their 'GeForce FX' moment, with a 'glitch' in their DX10 hardware. The HD2900 was too little, too late.
Like nVidia before them, ATi slowly recovered with the 3000-series and the 4000-series, but as with nVidia a few years earlier, good products were not enough to regain dominance.

At this point in time it doesn't look like Fermi is as big a glitch as the GeForce FX or the Radeon HD2900 were. GF104 seems to be a nice recovery.

The GPU market simply moves much faster than the CPU market, for example.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Why would you need to produce a new arch on a new process? Intel, the undisputed king of R&D/process/manufacturing, didn't do that for SB, for example: it made Westmere on 32nm before Sandy Bridge on 32nm. And didn't Anand already get around to explaining that? AMD climbed the learning curve with the 4770 and applied what they learned there to the 5xxx series, which helped a lot compared to nVidia, which only trialed small, uncomplicated chips on 40nm.

Anyway, forget about that. In fact, forget about my entire first paragraph. Consumers and OEMs just don't care about invisible things like "architecture". They only care about tangibles like performance, power draw, TDP, noise, etc. Architecture is like a black box: we don't care what goes on inside; what matters is what we get as a result of it.

I was not saying "architecture is not important". Rather, I was saying "new architecture" is not a feature by itself. More performance than last gen is a feature. Lower power draw is a feature. Even quieter fans can be a feature. But just saying "new arch", in the absence of any of the tangible features, is not a feature in itself. It's not something you can use in isolation to judge two products. It's only better if the tangible features it offers are actually better.

That is why I quoted you and responded in the first place, because you agreed with an obvious trolling post. Architecture, by itself, means nothing if it does not result in the tangible features that are important. So saying "yeah, X company is behind" solely due to having older architecture (despite the obvious advantages in tangible features) is completely misguided. If it was under a specific context such as a GPGPU/compute card, then yeah, arch vs arch AMD is behind - by a lot. But the statement was made with no such context, and in that absence it has no good point at all.

As a developer myself, and a Linux user, I have a soft spot for nVidia. Things were painless with the 8600GTS I bought 3 years ago. With my newer 4770, there were some things I had to sacrifice (obviously, some nVidia-specific features), and even some things that were Linux issues rather than missing nVidia features. For example, I can't even play Battle for Wesnoth (and it's not even a 3D game), as it ends up corrupting the graphics driver and the screen stays color-inverted until I restart X. That never happened on my 8600GTS, and I went through several versions and distributions of Linux on it, all painless. I even used to run a 3D compositing desktop on it, yet another thing I had to sacrifice on my 4770 on Fedora 13 (thank you, AMD, for only supporting the latest Ubuntu releases, making my Fedora box unsupported because it uses certain package versions that haven't made it into Ubuntu yet).

But as a gaming card, I can't fault my 4770. And that's the issue at the heart of this. Yes, GPGPU and everything will take over the world someday, maybe. And yes, I love nVidia because it makes the Linux computing experience easier. But the thread isn't about who has the better GPGPU (obviously, that's nVidia), or who has better Linux support (nVidia, no contest), or who loves devs more. The thread is about marketshare, and that is the domain of OEMs and consumers, who don't give a crap about anything that isn't one of the tangible features, don't care about Linux, and don't care what we enthusiasts think. And that's where AMD is ahead right now and has been for the better part of a year.

You should have just known better than to bite that troll post.

Jvroig, I agree with what you said earlier: when someone fires up a game, they don't care about uArch.

However, I am still concerned about ATI's jump to a new uArch (NI) on a new process (28nm) with a new fab (GF) next generation.

With that being said, how do you think ATI will negotiate this transition? Will they go for small GPUs and use technology like "sideport" and SFR drivers to make two smaller GPUs feel like a large single GPU?

Or will they launch an Evergreen card on GF's 28nm, in a fashion similar to the HD4770?
 
Last edited:

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Does this mean that Nvidia is in trouble? That they might go under or something?

I only buy Nvidia cards, have for years, so this would greatly affect me. I am a little worried now. AMD cards simply don't have the features I need, and that means I need to start hoarding some Nvidia cards if they are going under.

So um, is this a blip or might it take Nvidia under?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So um, is this a blip or might it take Nvidia under?

Remember that the mobile market is larger than the desktop market. So we would also need to look at sales of GPUs in the mobile space.

Furthermore, when it comes to desktop graphics, NV shipped 91 percent of their product mix priced below $200. Only 3.8 percent of their sales came from the $200-300 price bracket and 5.2 percent came from the $300+ bracket. In addition, NV sold 5.7 million graphics cards under $100 while ATI sold 3.6 million. This means that NV's profit margins are decreasing, but they are outselling ATI in both the $0-100 and $200-300 price categories. I wouldn't start calling for the demise of NV. I think their sales will improve with the release of 450 and 460, both of which should help in regaining the lead in the $100-200 price bracket.

http://www.xbitlabs.com/news/video/...h_End_Mainstream_Graphics_Cards_Research.html

Both firms trade spots for #1 every 2-3 years or so.

Of course ATI is bound to answer with SI in late 2010.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Remember that the mobile market is larger than the desktop market. So we would also need to look at sales of GPUs in the mobile space.

Furthermore, when it comes to desktop graphics, NV shipped 91 percent of their product mix priced below $200. Only 3.8 percent of their sales came from the $200-300 price bracket and 5.2 percent came from the $300+ bracket. In addition, NV sold 5.7 million graphics cards under $100 while ATI sold 3.6 million. This means that NV's profit margins are decreasing, but they are outselling ATI in both the $0-100 and $200-300 price categories. I wouldn't start calling for the demise of NV. I think their sales will improve with the release of 450 and 460, both of which should help in regaining the lead in the $100-200 price bracket.

http://www.xbitlabs.com/news/video/...h_End_Mainstream_Graphics_Cards_Research.html

Both firms trade spots for #1 every 2-3 years or so.

Of course ATI is bound to answer with SI in late 2010.

Good point.

AMD Fusion vs Intel CULV + Nvidia Optimus?

How will they trade blows in the future? Or how about an ATI version of Optimus for use with Intel?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Intel's lockout is what first terrified me about Nvidia's future. Because of that move by Intel, for the time being my HP Mini 311 is the best 11.6-inch Hackintosh I can find; I can't upgrade to the awesome CULV models!

Well Nvidia still has ARM and with that possibly a much larger market than x86.
 
Last edited:

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Remember that the mobile market is larger than the desktop market. So we would also need to look at sales of GPUs in the mobile space.
The mobile space is actually where AMD has better figures than NV, not the desktop, although the desktop graphics was the biggest gain for AMD. From earlier in this thread:
Discrete desktop graphics: AMD only has 44.5% - but that's an 11-point gain, which came directly from nVidia's 11-point loss.

Discrete mobile graphics: AMD now has 56.3%, a result of a 2.4-point gain.

But this issue has been sensationalized to the point of ridiculousness. As of now, both competitors have a very healthy share of the market. Nobody is going under any time soon. This is nothing but a clear win for consumers. When somebody drops well below 30%, then maybe doom and gloom is warranted. When it continues to dip even further below 20%, then perhaps I might even join the doomsayers. But when the marketshare difference is just 2 points, I don't see what the sensationalism is all about.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Does this mean that Nvidia is in trouble? That they might go under or something?

I only buy Nvidia cards, have for years, so this would greatly affect me. I am a little worried now. AMD cards simply don't have the features I need, and that means I need to start hoarding some Nvidia cards if they are going under.

So um, is this a blip or might it take Nvidia under?

Either company could go under. It's a shrinking market in uncertain times.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I don't see Fusion becoming a big success in the next 5-10 years.
There is a fundamental memory bottleneck problem when integrating a CPU and GPU in the same package and sharing the memory controller + memory modules.

Aside from that, there are fundamental problems with combining a high-performance CPU with a high-performance GPU in a single package.

Therefore, I don't see how Fusion could become a threat to high-end discrete GPGPU solutions. It's interesting as a budget solution, but nothing more.

Scali,

1. Wouldn't tessellation (low quality models tessellated to high quality models) help reduce the need for dedicated GDDR memory bandwidth?

2. Regarding GPGPU, I know you have mentioned high-end cards. I also realize Adobe makes nice niche products in the CS suite, but what about Nvidia's GPGPU presence in the mass market (i.e., budget solutions)?
 
Last edited:

Scali

Banned
Dec 3, 2004
2,495
0
0
1. Wouldn't tessellation (low quality models tessellated to high quality models) help reduce the need for dedicated GDDR memory bandwidth?

Yup, you save both memory space and bandwidth, at the cost of more arithmetic power. In a way it's a form of data compression.
There are also other benefits, e.g. you get better antialiasing on tessellated geometry than on texture-based solutions.
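
To put a rough number on the 'compression' angle, here's a back-of-envelope sketch; the vertex size, mesh size and tessellation factor below are assumptions picked purely for illustration, not figures from any real game:

Code:
# Back-of-envelope: a coarse base mesh expanded on-chip by the tessellator
# versus storing the fully tessellated mesh in VRAM.
# All numbers are assumptions chosen for illustration only.

BYTES_PER_VERTEX = 32       # e.g. position + normal + UV
BASE_VERTICES = 10_000      # coarse control mesh kept in VRAM
TESS_FACTOR = 64            # extra detail generated per base primitive

# Storing the dense mesh: every vertex lives in, and is read from, memory.
dense_bytes = BASE_VERTICES * TESS_FACTOR * BYTES_PER_VERTEX

# Tessellating on-chip: only the base mesh touches memory; the generated
# vertices exist only inside the pipeline and cost ALU work instead.
base_bytes = BASE_VERTICES * BYTES_PER_VERTEX

print(f"Pre-tessellated mesh in VRAM     : {dense_bytes / 2**20:.1f} MiB")  # ~19.5 MiB
print(f"Base mesh + on-chip tessellation : {base_bytes / 2**10:.1f} KiB")   # ~312.5 KiB
print(f"Effective 'compression' ratio    : {dense_bytes / base_bytes:.0f}:1")  # 64:1

The displacement map you would typically sample while tessellating adds some texture traffic back, but it's still far less than storing and streaming every generated vertex.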

2. Regarding GPGPU, I know you have mentioned high-end cards. I also realize Adobe makes nice niche products in the CS suite, but what about Nvidia's GPGPU presence in the mass market (i.e., budget solutions)?

Well, high-end GPUs already ARE a budget solution :)
If you compare the cost of a Tesla card to a supercomputer with similar processing power, the Tesla is a steal.
I don't really see Fusion operating in that market because you'd have the same problem as regular CPUs competing with Tesla today: You need a lot more of them to get the same processing power, which leads to much larger systems, much more expensive motherboards, more power consumption etc.

As I said, Fusion is interesting as a budget solution... It would offer you a budget CPU (Athlon quadcore-ish performance) and a budget GPU (5470-5570-ish performance) in a smaller form factor at a lower price. It doesn't offer you anything new. It's just the same, only cheaper.
Much like Intel and its CPUs with integrated GPUs... they're not really better than other IGP solutions, but they're cheaper.
The problem here is that if the GPU part is too slow, it will not be interesting to try and use it as a GPGPU. It may not be faster than the CPU itself.
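
To put a rough number on that last point, here's a minimal sketch using ballpark theoretical peak figures (the GPU numbers are the commonly quoted peaks for 5470- and 5870-class parts, the CPU number is a rough SSE estimate, and none of it accounts for data-transfer overhead or real-world efficiency):

Code:
# Ballpark peak single-precision throughput, purely for illustration.

def cpu_gflops(cores, ghz, flops_per_cycle=8):
    # ~8 flops/cycle assumes 4-wide SSE issuing a multiply and an add per clock
    return cores * ghz * flops_per_cycle

def gpu_gflops(shaders, ghz, flops_per_shader=2):
    # ~2 flops/clock per shader assumes a multiply-add
    return shaders * ghz * flops_per_shader

print(f"Quad-core 3 GHz CPU    : ~{cpu_gflops(4, 3.0):.0f} GFLOPS")      # ~96
print(f"5470-class budget GPU  : ~{gpu_gflops(80, 0.75):.0f} GFLOPS")    # ~120
print(f"5870-class high-end GPU: ~{gpu_gflops(1600, 0.85):.0f} GFLOPS")  # ~2720

# The budget GPU barely outruns the CPU before you even count the cost of
# moving data to it; the high-end part is more than an order of magnitude ahead.

That is the whole problem in one comparison: on a Fusion-class GPU the headroom over the CPU is small, while on a high-end card the gap is big enough to be worth programming for.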

But that's not really the market for GPGPU. There is no mainstream GPGPU yet. It's mostly Adobe products and 3D rendering. But those people already bought high-end systems anyway, because they needed fast CPUs and GPUs with enough memory for their demanding software. Buying a high-end GPGPU card on top of that doesn't really make a difference to them; they were buying high-end stuff anyway.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Does this mean that Nvidia is in trouble? That they might go under or something?

I only buy Nvidia cards, have for years, so this would greatly affect me. I am a little worried now. AMD cards simply don't have the features I need, and that means I need to start hoarding some Nvidia cards if they are going under.

So um, is this a blip or might it take Nvidia under?

If they are to go under, it won't be because of a market share just a hair below 50%.

But it might make it even more difficult for NVIDIA to push their own proprietary technologies like PhysX.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The problem here is that if the GPU part is too slow, it will not be interesting to try and use it as a GPGPU. It may not be faster than the CPU itself.

Thanks for shedding some light on the situation.

Hmmm.....so Nvidia would need to pair a fairly large GPU with something like an ARM core in order for GPGPU to take off at the mobile level?

As far as AMD goes, they would need to scale back the number of CPU cores on their fusion chips and proportionally increase their GPU die space. (Yeah, even I can see where this starts to cause trouble).

I wonder where the breaking point would be? Maybe one Bulldozer module (i.e., dual-core) and the equivalent of 800 stream processors? Or would the ratio have to be even more biased towards the GPU? (With all of this video processing power running through system memory, no less.)

Wow, well if that is true then it looks like both Nvidia and AMD have a lot of hurdles to overcome if expanding GPGPU is the goal.
 
Last edited:

Scali

Banned
Dec 3, 2004
2,495
0
0
Hmmm.....so Nvidia would need to pair a fairly large GPU with something like an ARM core in order for GPGPU to take off at the mobile level?

Well, ARM is a bit different from Fusion, since x86 is generally a lot faster (even an Atom is faster than most ARMs, and AMD's Fusion processors are faster than Atoms). So your GPU doesn't need to be that large in order to have significant performance gains over the CPU.

On the other hand, it's all relative. ARM processors are smaller than x86 processors as well, so even something like the Ion IGP is already 'fairly large' compared to an ARM.

It's difficult to say where the right balance is.
I just think that GPGPU has to come in from the high-end, not from the budget or mobile end.
The technology needs to be adopted first, and that's easier when you have good performance parts, than when you are trying to fit small form factor and power consumption requirements.

GPGPU is now reasonably common in scientific computing, and it is starting to trickle down into the desktop market.
But I'm not really sure what mobile devices would want with GPGPU at this point. You're not going to run Photoshop or anything on your phone or iPad, I think.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I think their sales will improve with the release of 450 and 460, both of which should help in regaining the lead in the $100-200 price bracket.

http://www.xbitlabs.com/news/video/...h_End_Mainstream_Graphics_Cards_Research.html

Both firms trade spots for #1 every 2-3 years or so.

Of course ATI is bound to answer with SI in late 2010.

But that was because of the channel stuffing nVidia was doing when they launched the GTX series; remember the "buy 3 GTS 250s and 7 G210/GT220s to get a single GTX 480" crap? Because AFAIK, nVidia doesn't have anything below the $200 price point that's competitive with AMD.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Well Nvidia still has ARM and with that possibly a much larger market than x86.

But a lot of important code is x86 only.

Is it true that Nvidia might buy Via to get at their x86 license? Everything I love Nvidia for, I HATE Via for, so the combination of the two kinda scares me. They are like black and white to me...
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Is it true that Nvidia might buy Via to get at their x86 license?

I don't think that would even be possible, legally. The x86 licensing terms are very specific about not being transferable to a third party.
Even AMD-GF had some legal problems with the split-up. They probably got away with it as part of the Intel-AMD settlement, where some of the x86-licensing terms were loosened.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
I don't think that would even be possible, legally. The x86 licensing terms are very specific about not being transferable to a third party.

True, Via cannot sell or transfer their x86 license. If they get bought out, their x86 license becomes invalid.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I don't think that would even be possible, legally. The x86 licensing terms are very specific about not being transferable to a third party.
Even AMD-GF had some legal problems with the split-up. They probably got away with it as part of the Intel-AMD settlement, where some of the x86-licensing terms were loosened.

So, what if Via and nVidia merged?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well, ARM is a bit different from Fusion, since x86 is generally a lot faster (even an Atom is faster than most ARMs, and AMD's Fusion processors are faster than Atoms). So your GPU doesn't need to be that large in order to have significant performance gains over the CPU.

On the other hand, it's all relative. ARM processors are smaller than x86 processors as well, so even something like the Ion IGP is already 'fairly large' compared to an ARM.

It's difficult to say where the right balance is.
I just think that GPGPU has to come in from the high-end, not from the budget or mobile end.
The technology needs to be adopted first, and that's easier when you have good performance parts, than when you are trying to fit small form factor and power consumption requirements.

GPGPU is now reasonably common in scientific computing, and it is starting to trickle down into the desktop market.
But I'm not really sure what mobile devices would want with GPGPU at this point. You're not going to run Photoshop or anything on your phone or iPad, I think.

Imagination Tech does really well in the ARM space and they have a good product for it. When they move to the 28nm process along with everyone else, they will have a great product for that space.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Imagination Tech does really well in the ARM space and they have a good product for it. When they move to the 28nm process along with everyone else, they will have a great product for that space.

According to an anonymous poster in CUDA Zone, the IMG PowerVR SGX is OpenCL capable, but is that company at the point where they have products capable of 1080p playback?

I'd imagine that would really help smartbooks compete against netbooks, especially considering both Intel Cedar Trail and AMD Ontario will be capable of that feat. (Although a big caveat might be the status of Windows 7 Starter and external monitors; something tells me MS will probably remove that restriction if they haven't already done so.)
 
Last edited: