[TPU] Big Swing in Market Share From AMD to NVIDIA: JPR


desprado

Golden Member
Jul 16, 2013
1,645
0
0
I'm not surprised to see people going to Nvidia. I feel like I'm getting cornered into buying a 970.

AC: Unity, FC4, MGS: Ground Zeroes/TPP, Witcher 3 - Nvidia recruited them all, and the titles have features that are either exclusive to Nvidia or perform better on Nvidia cards. Not sure what's going on with GTA V, but if they keep recruiting games of this caliber, I can only guess how many 7xxx/2xx owners will migrate by next summer. The price drops aren't enough; AMD needs superb performance in the upcoming blockbusters to slow down the bleeding, and they need to beat Maxwell soon to recover.

You forgot to mention Evolve, Dying Light, Project Cars, The Crew and Batman: Arkham Knight as Nvidia GameWorks titles.
 

NTMBK

Lifer
Nov 14, 2011
10,487
5,911
136
- The mobile market puts a premium on performance/watt and battery life, with the iGPU being a not-so-relevant factor in the price composition of the CPU in the supply chain. Given that AMD's strength is the iGPU and it sorely lacks both performance/watt and battery life, it has to go to the bottom of the market, but they can't fight there because of the big die.

Ridiculous. The biggest improvement in laptops in the last few years has been the arrival of high DPI displays, and the banishing of 768p to budget craptops. Significantly improved GPU performance is essential to an acceptable laptop experience these days- same reason that Intel is improving the GPU portion of their APUs much faster than the CPU portion.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Ridiculous. The biggest improvement in laptops in the last few years has been the arrival of high DPI displays, and the banishing of 768p to budget craptops. Significantly improved GPU performance is essential to an acceptable laptop experience these days- same reason that Intel is improving the GPU portion of their APUs much faster than the CPU portion.

The most expensive parts aren't king-of-the-hill iGPU notebooks but Ultrabooks, and the guys needing high GPU power have to go discrete with mobile workstations. So I respectfully disagree with your opinion.

I think AMD went overboard with the iGPU power of their mobile chips, especially for the market bracket where they can sell it (bottom of the barrel).
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
Do they? Or will crippling nv performance be enough?:cool:

Neither camp dares, IMO. Now it's done without taking the gloves off, via Mantle, G-Sync, TXAA, ShadowPlay...


You forgot to mention Evolve, Dying Light, Project Cars, The Crew and Batman: Arkham Knight as Nvidia GameWorks titles.

True, though Witcher 3, GTA 5 and MGS 5 get more attention. I'm putting up with Unity for the moment, but if the three I just mentioned have a clear advantage on NV, I'm not going to wait much longer; I'm picking up an EVGA 970.
 

NTMBK

Lifer
Nov 14, 2011
10,487
5,911
136
The most expensive parts aren't king-of-the-hill iGPU notebooks but Ultrabooks, and the guys needing high GPU power have to go discrete with mobile workstations. So I respectfully disagree with your opinion.

I think AMD went overboard with the iGPU power of their mobile chips, especially for the market bracket where they can sell it (bottom of the barrel).

And yet what does the die for those Ultrabooks look like? Oh right, over half of it is devoted to graphics.

[Image: Intel Core M (Broadwell-Y) die map]


I would say that the Kaveri CPU/GPU balance is about right for a laptop frankly. The issue is that their CPU architecture really isn't power-usage competitive, and they are using a completely uncompetitive node compared to Intel's manufacturing prowess.

They also need to bring out a dedicated 2-core, half sized GPU die, so that they can actually be price competitive in low power situations. Having literally half of the die fused off is just ridiculous, and has got to be hurting their margins pretty badly.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
And yet what does the die for those Ultrabooks look like? Oh right, over half of it is devoted to graphics.

Intel, by virtue of their manufacturing advantage, can spend 50% of its die on the iGPU and still stay smaller than AMD. It is something that AMD simply cannot afford because of its foundry partners. Core M is about the size of Bobcat, yet much more powerful.

But just look at what a mismanaged company AMD is. Intel had both the margin to burn and the manufacturing node to give 50% of the die to the iGPU, yet they didn't go for it until 14nm. AMD, OTOH, has been giving 50% of the die to the iGPU since 32nm, but they have neither the node advantage nor the gross margin to burn. The results are in their financials for everyone to see.

I would say that the Kaveri CPU/GPU balance is about right for a laptop frankly. The issue is that their CPU architecture really isn't power-usage competitive, and they are using a completely uncompetitive node compared to Intel's manufacturing prowess.

I don't think Kaveri is the right balance except when talking about budget gamers. It is too much for content consumption and way too much for productivity. Only the most cash-strapped gamer can benefit from the iGPU, and given AMD's lack of scale, sometimes it is cheaper to go Intel + Nvidia.

They also need to bring out a dedicated 2-core, half sized GPU die, so that they can actually be price competitive in low power situations. Having literally half of the die fused off is just ridiculous, and has got to be hurting their margins pretty badly.

And what should they do with the leftovers of their 4C die?
 

NTMBK

Lifer
Nov 14, 2011
10,487
5,911
136
Intel, by virtue of their manufacturing advantage, can spend 50% of its die on the iGPU and still stay smaller than AMD. It is something that AMD simply cannot afford because of its foundry partners. Core M is about the size of Bobcat, yet much more powerful.

Oh, definitely true. AMD's manufacturing disadvantage is a massive problem at this stage, and their margins have got to be tanking ridiculously. There's a reason why Excavator is moving to High Density Libraries, and supposedly halving the L2$ sizes- they knew that they would have to fight 14nm with a 28nm part, and are trying to do what they can with a bad hand. It's going to be another lean year for AMD's CPU business, I expect.

(Core M is still a little bigger than Bobcat btw. But it's still damn impressive!)

I don't think Kaveri is the right balance except when talking about budget gamers. It is too much for content consumption and way too much for productivity. Only the most cash-strapped gamer can benefit from the iGPU, and given AMD's lack of scale, sometimes it is cheaper to go Intel + Nvidia.

That's from a desktop point of view, though, not a laptop point of view. I agree for the most part on the desktop, though there are a few niches where it is viable. But in mobile, where the GPU will be running at much lower clocks (~500MHz) with higher shader counts in order to maximise perf/W, it doesn't seem like "too much" performance. It seems like the right amount of GPU performance to drive a >=1080p laptop screen at acceptable performance levels. (You also aren't hit by the bandwidth bottleneck as hard as you are on the desktop.)
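
A back-of-envelope sketch of why "wide and slow" wins on perf/W, using the textbook dynamic-power relation P ≈ C·V²·f and assuming the voltage has to rise with clock; all numbers are made up for illustration, not taken from any real part:

```python
# Illustrative sketch (not vendor data): why "wide and slow" wins on perf/W.
# Assumes the textbook dynamic-power model P ~ C * V^2 * f, and that a higher
# clock needs a higher supply voltage.

def throughput_and_power(shaders, clock_ghz, volts, cap_per_shader=1.0):
    throughput = shaders * clock_ghz                        # arbitrary work units
    power = cap_per_shader * shaders * volts ** 2 * clock_ghz
    return throughput, power

configs = {
    "256 SPs @ 1.0 GHz, 1.10 V": throughput_and_power(256, 1.0, 1.10),
    "512 SPs @ 0.5 GHz, 0.85 V": throughput_and_power(512, 0.5, 0.85),
}

for name, (tput, power) in configs.items():
    print(f"{name}: throughput {tput:.0f}, power {power:.0f}, perf/W {tput / power:.2f}")

# Same nominal throughput, but the wide/slow configuration can sit at a lower
# voltage, so it burns noticeably less power for the same work.
```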

And what should they do with the leftovers of their 4C die?

I assume you mean ones which have faults in one/two of their cores, and are die harvested down? I don't know, I would hope yields would be good enough that it's not necessary. Obviously that's the approach Intel takes (different dies for different SKUs), though I don't know how yields compare on GloFo 28nm vs. Intel 14nm. I would have thought that the yield improvements from a 50% smaller die for the 2C parts would offset the inability to harvest a handful of failed-but-still-functional-as-a-2C dies from the 4C range. Though obviously I haven't run any sums :)
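
Roughly what those sums look like, using the textbook Poisson yield model Y = exp(-A·D0); the die areas and defect density below are made-up placeholders, not real GloFo or Intel figures:

```python
# Rough yield sketch for "dedicated 2C die vs. harvested 4C die", using the
# textbook Poisson model Y = exp(-area * defect_density). All numbers are
# illustrative placeholders, not actual foundry data.
import math

D0 = 0.4            # defects per cm^2 (made up)
AREA_4C = 2.45      # cm^2, full 4C APU die (roughly Kaveri-sized, illustrative)
AREA_2C = 1.25      # cm^2, hypothetical dedicated 2C / half-GPU die

def die_yield(area_cm2, d0=D0):
    return math.exp(-area_cm2 * d0)

y4c, y2c = die_yield(AREA_4C), die_yield(AREA_2C)
print(f"4C die yield: {y4c:.1%}")   # ~38% with these numbers
print(f"2C die yield: {y2c:.1%}")   # ~61% with these numbers

# The small die also gets roughly twice as many candidates per wafer, so good
# 2C dice per unit of wafer area scale like 2 * y2c, versus relying on the
# occasional partially defective 4C die that still works as a 2C part.
print(f"good 2C dice per 4C-die-sized patch of wafer: {2 * y2c:.2f}")
```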
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Oh, definitely true. AMD's manufacturing disadvantage is a massive problem at this stage, and their margins have got to be tanking ridiculously. There's a reason why Excavator is moving to High Density Libraries, and supposedly halving the L2$ sizes- they knew that they would have to fight 14nm with a 28nm part, and are trying to do what they can with a bad hand. It's going to be another lean year for AMD's CPU business, I expect.

The manufacturing disadvantage was a problem before, and yet AMD doubled down its bet on massively big SKUs. To this day I really cannot understand it; it was a really mind-boggling move. The only plausible reason that comes to mind is that they thought they could charge a significant price premium for their iGPU (the fact that they killed the high-volume, low-margin bottom dGPU parts corroborates this, but still...).

That's from a desktop point of view, though, not a laptop point of view. I agree for the most part on the desktop, though there are a few niches where it is viable. But in mobile, where the GPU will be running at much lower clocks (~500MHz) with higher shader counts in order to maximise perf/W, it doesn't seem like "too much" performance. It seems like the right amount of GPU performance to drive a >=1080p laptop screen at acceptable performance levels. (You also aren't hit by the bandwidth bottleneck as hard as you are on the desktop.)

You can run a 1080p display with 384 SPs, and if you don't care too much about power consumption, even 256 SPs at higher clocks.
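
Rough arithmetic behind those shader counts, using the usual peak-throughput estimate for a GCN-style part (SPs × 2 FLOPs per clock for FMA); the clock speeds are illustrative guesses, not specific SKUs:

```python
# Quick arithmetic behind "384 SPs vs. 256 SPs at higher clocks": peak FP32
# throughput for a GCN-style iGPU is roughly SPs * 2 FLOPs (FMA) * clock.
# Clock speeds below are illustrative, not actual SKU numbers.

def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

print(f"384 SPs @ 600 MHz: {peak_gflops(384, 600):.0f} GFLOPS")  # ~461
print(f"256 SPs @ 900 MHz: {peak_gflops(256, 900):.0f} GFLOPS")  # ~461

# Same paper throughput, but the 256 SP part needs a ~50% higher clock (and a
# higher voltage) to get there, which is exactly the power-consumption caveat.
```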

I assume you mean ones which have faults in one/two of their cores, and are die harvested down? I don't know, I would hope yields would be good enough that it's not necessary. Obviously that's the approach Intel takes (different dies for different SKUs), though I don't know how yields compare on GloFo 28nm vs. Intel 14nm. I would have thought that the yield improvements from a 50% smaller die for the 2C parts would offset the inability to harvest a handful of failed-but-still-functional-as-a-2C dies from the 4C range. Though obviously I haven't run any sums :)

What kind of performance would AMD be able to get with 2C parts? Certainly not enough to compete against a Celeron, and far worse, it wouldn't be able to compete against AMD's own cat cores, so we're looking at an unmitigated failure on steroids. On top of that, it would further erode the cost structure of the 4C parts, because the 2C parts would go to the trash can. Only losses on this move.
 

NTMBK

Lifer
Nov 14, 2011
10,487
5,911
136
The manufacturing disadvantage was a problem before, and yet AMD doubled down its bet on massively big SKUs. To this day I really cannot understand it; it was a really mind-boggling move. The only plausible reason that comes to mind is that they thought they could charge a significant price premium for their iGPU (the fact that they killed the high-volume, low-margin bottom dGPU parts corroborates this, but still...).

The first APUs launched on 32nm while Intel was also still on 32nm- Llano was up against Sandy Bridge, and had a similar 50/50 split between CPU and GPU.

[Image: AMD Llano die shot]


Llano almost made it into the MacBook, but AMD's supply issues (terrible GloFo yields) killed that deal.

You can run a 1080p display with 384 SPs, and if you don't care too much about power consumption, even 256 SPs at higher clocks.

Depends what you want to do with it- for gaming, 384 doesn't really cut it at 1080p. And then at higher resolutions (like the 4k resolutions quite a few laptops are appearing with) you need a lot more GPU just to drive the system satisfactorily for productivity use.

What kind of performance would AMD be able to get with 2C parts? Certainly not enough to compete against a Celeron, and far worse, it wouldn't be able to compete against AMD's own cat cores, so we're looking at an unmitigated failure on steroids. On top of that, it would further erode the cost structure of the 4C parts, because the 2C parts would go to the trash can. Only losses on this move.

This is certainly true, but again this is more to do with their poor CPU architecture as opposed to the basic design. You may be onto something with cat cores, given that Nolan/Amur are getting dual channel memory support, Catamount cores and hopefully higher clocks from the 20nm process- they may well edge up into higher performance levels and eat away at the "big core" market some more.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
What kind of performance would AMD be able to get with 2C parts? Certainly not enough to compete against a Celeron, and far worse, it wouldn't be able to compete against AMD's own cat cores, so we're looking at an unmitigated failure on steroids. On top of that, it would further erode the cost structure of the 4C parts, because the 2C parts would go to the trash can. Only losses on this move.

I'm trying to keep up with you guys...

There are 2C parts sold now. It's a good SKU for a media box, some Flash games, and general Facebook, YouTube and e-mail tasks.

These are sold for peanuts and system builders seem to like it due to easy, misleading marketing - it haz radeonz in it!
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
The first APUs launched on 32nm while Intel was also still on 32nm- Llano was up against Sandy Bridge, and had a similar 50/50 split between CPU and GPU.

That's what I was talking about. AMD did at 32nm what Intel would only be confident enough to do at 14nm.

Depends what you want to do with it- for gaming, 384 doesn't really cut it at 1080p. And then at higher resolutions (like the 4k resolutions quite a few laptops are appearing with) you need a lot more GPU just to drive the system satisfactorily for productivity use.

Forget gaming. The majority of laptop buyers don't game, and neither do the bulk of corporate customers. This is what made Kaveri a bad product: by going wild on the SP count, AMD trapped itself in a niche - too expensive for the mainstream, non-gaming market, too weak for the dedicated mobile gaming market.

This is certainly true, but again this is more to do with their poor CPU architecture as opposed to the basic design. You may be onto something with cat cores, given that Nolan/Amur are getting dual channel memory support, Catamount cores and hopefully higher clocks from the 20nm process- they may well edge up into higher performance levels and eat away at the "big core" market some more.

That move is long overdue, especially because the "big core" line is really zombified.
 

NTMBK

Lifer
Nov 14, 2011
10,487
5,911
136
Forget gaming. The majority of laptop buyers don't game, and neither do the bulk of corporate customers. This is what made Kaveri a bad product: by going wild on the SP count, AMD trapped itself in a niche - too expensive for the mainstream, non-gaming market, too weak for the dedicated mobile gaming market.

True. I'm honestly amazed that AMD haven't tried pushing FirePro Kaveri more aggressively. Professional-grade OpenGL drivers in a non-dGPU laptop seem like an amazing offering for business markets, and would seriously undercut the low-end Quadro offerings: the likes of the K1100M and lower would be basically redundant. Plenty of professional tasks value OpenGL correctness, even when they are not especially performance sensitive, and while AMD are behind Nvidia in their professional drivers, they are still worlds ahead of Intel.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
True. I'm honestly amazed that AMD haven't tried pushing FirePro Kaveri more aggressively. Professional-grade OpenGL drivers in a non-dGPU laptop seem like an amazing offering for business markets, and would seriously undercut the low-end Quadro offerings: the likes of the K1100M and lower would be basically redundant.

Me too. The APU indeed offers an interesting value proposition in some scenarios. If anything, they could get students or universities to buy these systems.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I'm trying to keep up with you guys...

There are 2C parts sold now. It's a good SKU for a media box, some Flash games, and general Facebook, YouTube and e-mail tasks.

These are sold for peanuts and system builders seem to like it due to easy, misleading marketing - it haz radeonz in it!

And how do these parts fare against AMD's cat core line?
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
True. I'm honestly amazed that AMD haven't tried pushing FirePro Kaveri more aggressively. Professional-grade OpenGL drivers in a non-dGPU laptop seem like an amazing offering for business markets, and would seriously undercut the low-end Quadro offerings: the likes of the K1100M and lower would be basically redundant. Plenty of professional tasks value OpenGL correctness, even when they are not especially performance sensitive, and while AMD are behind Nvidia in their professional drivers, they are still worlds ahead of Intel.

The pro market requires good drivers. Even on the desktop, as far as drivers go it's Nvidia > AMD > Intel, in that if you report a bug, Nvidia might actually reply and try to fix it. AMD bug reports basically get ignored, but their drivers are still ahead of Intel's. Hence for business, if you don't care about graphics/compute, just get an Intel machine; if you do, make sure it's got an Nvidia chip. Better to get overcharged by Nvidia and have a working solution than to save a few $ up front only to waste many times that because the software just doesn't work properly.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Me too. The APU indeed offers an interesting value proposition in some scenarios. If anything, they could get students or universities to buy these systems.

I've said this time and time again. AMD's APU really had very few target markets, and the markets it COULD hit it either failed at completely or didn't market to.

The back to school/college kids are the EASIEST market for an AMD APU that is capable of gaming.

Most kids aren't going to get a $1k+ gaming laptop for college (meaning the laptop most likely won't have a discrete GPU).

However, AMD has their APU in all 3 consoles. All they had to do was market it correctly: "AMD APU is in the PS4/Xbox One/Wii U. You're going back to school and you want to be able to play all your favorite games? Get a laptop with an AMD APU inside. Be able to play all your favorite games on the go... blah blah blah."

I just don't get their marketing team. Why don't they try?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
MGS is confirmed.
GTA V isn't confirmed, but there is an article on NV's site, so it's probably another GameWorks title.

As I said in other threads, I would REALLY hate it if PC gaming turned into a game of who throws more money at developers and hands them closed game code. Unfortunately, NV is not going to stop this, and if AMD doesn't do the same aggressively, they will get run over. You cannot overcome an inherent 30-100% disadvantage, like in Unity, when the game code is not optimized for your architecture. I really have no answer for how to make game development fair, other than for developers to stop taking bribes. I don't see how AMD can keep up, as NV simply has more resources and engineers to throw at GW. In the last 15 years of gaming I honestly don't remember the situation ever getting as bad as it has in the last 3-4 years. In the past, if you bought a 6800U or X800XT, for the most part the performance was good across 90% of games. That is not the case at all today.

I said at one point that we will either have to choose a GPU optimized for a specific manufacturer's list of sponsored games, OR, if you play a wide variety of games, soon you might need to run both AMD and NV.
 

NTMBK

Lifer
Nov 14, 2011
10,487
5,911
136
I said at one point that we will either have to choose a GPU optimized for a specific manufacturer's list of sponsored games, OR, if you play a wide variety of games, soon you might need to run both AMD and NV.

Or run Intel integrated and get screwed by both of them :awe:
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
The fault here is not AMD's or Nvidia's; the fault is ours. We as gamers have to stop pre-ordering games. Wait for the game to release, read the reviews, and then IF the game is finished and you like it, buy it. If not, wait for the patches and new GPU drivers to be released, and then IF you are still interested, buy it.

This way we will make sure developers never release an unfinished game again. The last pre-order I made was BF4, and that was the last pre-order I will ever make. It seems that the vast majority of games released since then have been unfinished products with massive pre-orders.

Don't blame others; do what you SHOULD do as a gamer and buy finished products only ;)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The fault here is not AMD's or Nvidia's; the fault is ours. We as gamers have to stop pre-ordering games. Wait for the game to release, read the reviews, and then IF the game is finished and you like it, buy it. If not, wait for the patches and new GPU drivers to be released, and then IF you are still interested, buy it.

This way we will make sure developers never release an unfinished game again. The last pre-order I made was BF4, and that was the last pre-order I will ever make. It seems that the vast majority of games released since then have been unfinished products with massive pre-orders.

Don't blame others; do what you SHOULD do as a gamer and buy finished products only ;)

Review sites would have to wait as well to do their reviews, so that early access for one GPU maker doesn't screw the results. Plus the usual patches and fixes for the cases of bad overall performance.

But otherwise it's entirely true. Vote with your wallet. People just forget that, or hope someone else voted for them...
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Review sites would have to wait as well to do their reviews, so that early access for one GPU maker doesn't screw the results. Plus the usual patches and fixes for the cases of bad overall performance.

But otherwise it's entirely true. Vote with your wallet. People just forget that, or hope someone else voted for them...

I agree; review sites should also re-evaluate the same game a few months later if they want/need to do a review at the time of release.