
AMD Q1 2015 Earnings - 23 cents a share loss, to exit dense server (SeaMicro)

No, it's just good enough or above. But nobody buys IGPs for their performance.

It should be quite clear from the market.

You've been saying the iGPU is "good enough" for the past 5 years. During that time Intel has increased the percentage of the die area allocated to the iGPU on the Core CPUs by several hundred percent!

So what exactly is your definition of "good enough"? Is it the exact percentage of the die area that Intel happens to allocate to the iGPU for every specific CPU generation?

Clearly Intel has been prioritizing the iGPU for the last few CPU generations. Why would they do that if it was already good enough several CPU generations ago? Remember, the GFX workload has not increased that much during this time if you're an average user doing office work or surfing the web.
 
Better double-check the M&A success rate of the companies you do own. I was reading some HBR books a while back, and they cited that a high majority of M&As (75%?) failed to return what the company was expecting.

Yes, I would not be surprised if that is the case. I wonder if it's because the CEOs themselves get a higher salary due to the company growing that way (salary based on number of employees, market cap, and similar). Or maybe they are like generals wanting to conquer ever more land.

But since the outcome is often unsuccessful as you say, the stock owners really should be more wary of M&As. I do not hold enough stock in any company to influence that myself though, unfortunately... 😉
 
Yes, I would not be surprised if that is the case. I wonder if it's because the CEOs themselves get a higher salary due to the company growing that way (salary based on number of employees, market cap, and similar). Or maybe they are like generals wanting to conquer ever more land.

But since the outcome is often unsuccessful as you say, the stock owners really should be more wary of M&As. I do not hold enough stock in any company to influence that myself though, unfortunately... 😉

If you own the companies getting acquired then no wariness required 🙂
 
You've been saying the iGPU is "good enough" for the past 5 years. During that time Intel has increased the percentage of the die area allocated to the iGPU on the Core CPUs by several hundred percent!

So what exactly is your definition of "good enough"? Is it the exact percentage of the die area that Intel happens to allocate to the iGPU for every specific CPU generation?

Clearly Intel has been prioritizing the iGPU for the last few CPU generations. Why would they do that if it was already good enough several CPU generations ago? Remember, the GFX workload has not increased that much during this time if you're an average user doing office work or surfing the web.

The increase is simply an offset of slowly increasing demands and the continued stagnation of software multithreading, leaving 2-4 cores and the rest of the die up for grabs. Hence we got dedicated Quick Sync blocks and so on.

So yes, it's good enough. And no, GPU workloads have increased.

The market believes so as well. Otherwise AMD's APU endeavour wouldn't have been the complete disaster it is. Funnily enough, AMD's best-selling APUs are also the slowest graphics-wise. Consider that for a moment.

Not sure why you have so much against Intel while you keep defending some of the worst business decisions in the industry by AMD.
 
The market believes so as well. Otherwise AMD's APU endeavour wouldn't have been the complete disaster it is. Funnily enough, AMD's best-selling APUs are also the slowest graphics-wise. Consider that for a moment.

#1 CEO of 2016 for sure. AMD should disable more IGP on their APUs, and they will sell more of them this way.
Don't you ever be my boss, please.
 
It's obvious AMD's acquisition of ATI carried a higher risk, mostly because AMD was a smaller company given the size of the deal. But then the potential reward was also much higher. Nobody could know beforehand how it would pan out. But since AMD was a smaller company, it could very well have tripled the value of the company if successful.

Nobody could know how the M&A would pan out, but a lot of people could think of deal structures that didn't wreck AMD's balance sheet the way it did. That's risk management, the discipline whose existence you refuse to acknowledge, and that's what makes AMD's idea of acquiring ATI the way it did a very, very dumb idea.

As for Intel's acquisition of McAfee, it carried far less risk. But the potential reward was also much lower, and it certainly had no potential of tripling the value of the company or anything even close to that. In fact, I don't see how they could ever have counted on getting any sensible ROI from the $7.1B regardless of the outcome. So it was a bad deal no matter what.

Since Intel was already big, it didn't have to triple the size of the company; the acquisition just had to pay for itself (which it didn't). For that, Intel had to deliver an ROI bigger than its WACC, not triple the size of the company or anything like that. But it seems you are coming up with another metric: sensible ROI. So I'm really curious about one thing: what's the definition of Fjodor's metric of sensible ROI, and what would Fjodor's sensible ROI mean for Intel?
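For what it's worth, the "pays for itself" test being argued here is just a net-present-value check: the deal creates value only if the cash flows it generates, discounted at the acquirer's WACC, exceed the purchase price. A minimal sketch, with entirely hypothetical numbers (not Intel's actual figures):

```python
def npv(price_b, annual_cash_flow_b, wacc, years):
    """Net present value of an acquisition, in $B: the sum of the
    discounted annual cash flows minus the purchase price.
    Positive NPV means the deal pays for itself at the given WACC."""
    pv = sum(annual_cash_flow_b / (1 + wacc) ** t for t in range(1, years + 1))
    return pv - price_b

# Hypothetical: a $7.1B purchase returning $0.5B/year for 10 years,
# discounted at a 10% WACC, comes out well underwater.
print(round(npv(7.1, 0.5, 0.10, 10), 2))
```

By this measure, "ROI bigger than WACC" and "sensible ROI" are the same test; the only question is what cash flows the acquirer can realistically credit to the deal.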
 
#1 CEO of 2016 for sure. AMD should disable more IGP on their APUs, and they will sell more of them this way.
Don't you ever be my boss, please.

Imagine AMD selling 180mm2 dies instead of 240mm2, and not having to throttle so extensively under load on both parts.

But again, it's much better to sell 240mm2 for the same price anyway, right? Because that extra IGP part is really paying the bills and is certainly not one of the reasons why AMD loses more money than needed.
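The 180mm2-vs-240mm2 point can be made concrete with the standard back-of-the-envelope dies-per-wafer estimate (square dies, no defect or scribe-line modelling; these are illustrative numbers, not AMD's actual yields or costs):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough candidate-die count on a circular wafer using the classic
    approximation: gross area divided by die area, minus an edge-loss
    term proportional to the wafer circumference."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

small = dies_per_wafer(180)  # 343 candidates
big = dies_per_wafer(240)    # 251 candidates
print(small, big, f"{small / big - 1:.0%} more dies per wafer")
```

Roughly 37% more candidate dies per wafer for the smaller chip, before yield differences (which also favour the smaller die) are even considered.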
 
You've been saying the iGPU is "good enough" for the past 5 years. During that time Intel has increased the percentage of the die area allocated to the iGPU on the Core CPUs by several hundred percent!

So what exactly is your definition of "good enough"? Is it the exact percentage of the die area that Intel happens to allocate to the iGPU for every specific CPU generation?

Clearly Intel has been prioritizing the iGPU for the last few CPU generations. Why would they do that if it was already good enough several CPU generations ago? Remember, the GFX workload has not increased that much during this time if you're an average user doing office work or surfing the web.

AMD's and Nvidia's bottom-of-the-line dGPUs have also been improving over time.

Even low end parts get better with successive generations.
 
Imagine AMD selling 180mm2 dies instead of 240mm2, and not having to throttle so extensively under load on both parts.

But again, it's much better to sell 240mm2 for the same price anyway, right? Because that extra IGP part is really paying the bills and is certainly not one of the reasons why AMD loses more money than needed.
It is take-or-pay; big dies don't really matter for AMD because they have the WSA looming over them.
 
It is take-or-pay; big dies don't really matter for AMD because they have the WSA looming over them.

I notice that WSA keeps getting renegotiated though.

Then there is the idea that bringing the dGPU over to GF will reduce the need for large iGPUs to fill out the remaining wafers.
 
I notice that WSA keeps getting renegotiated though.

Then there is the idea that bringing the dGPU over to GF will reduce the need for large iGPUs to fill out the remaining wafers.

The console chips are more than enough to fulfil the WSA quota, and given the legacy node, it is perfectly suitable for that subpar foundry of theirs.
 
Right. But now you're arguing over something I've not been discussing!

They have? Then please list what mainstream Intel desktop CPUs have an iGPU that is added as a licensed complete IP block from an external partner.

Intel has used licensed GPU technology from PowerVR in multiple products and is even using a licensed GPU block from ARM in X3.
 
Software (current) loves Intel's single thread performance, and we still live in that world.

If / when things change over to HSA and GPU compute on personal computer, then AMD buying ATI was 100% the right thing to do.

Maybe Windows 10 and DirectX 12 will cast the Bulldozer design choice in a more favorable light (compared to the single-CPU desktop configs a lot of us have, i5/i7 post-Conroe chips).

Software will always love single-thread performance. It's true in the desktop market, and it is even true in the server market, where the available parallelism is orders of magnitude higher. Single-thread performance largely defines the latency curves of any software implementation and as such will always be a critical feature. Not to mention that the majority of code is not easy to parallelize due to divergence or indirection.
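The latency-floor argument above is essentially Amdahl's law: the serial fraction of a workload bounds the speedup no matter how many cores you add, while faster single-thread execution speeds up everything. A quick illustration with a hypothetical 90%-parallel workload:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Maximum speedup when only `parallel_fraction` of the work can be
    spread across n_cores; the serial remainder bounds the total."""
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / n_cores)

# Even at 90% parallel, the speedup can never exceed 10x (1 / 0.1),
# regardless of core count; the serial 10% dominates the latency.
for cores in (2, 4, 16, 1024):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
```

That cap is why even highly parallel server software still cares about per-core performance: the serial portion sets the latency that users actually see.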
 
The console chips are more than enough to fulfil the WSA quota, and given the legacy node, it is perfectly suitable for that subpar foundry of theirs.

Then it is a good question why AMD fabs those at TSMC.

P.S. According to the following link the next Xbox One chip will be fabbed on TSMC 20nm --> http://www.dailytech.com/Xbox+One+S...er+Cooler+20+nm+APU+From+AMD/article36813.htm

Even though that is not an official source of info, the previous MS console (Xbox 360) processor received two die shrinks during the product's lifespan: 90nm --> 65nm --> 45nm. So hopefully we see at least one of the shrinks for the Xbox One happen at GF, in addition to the PS4.
 
If Sony and MS need die-shrinks on their APUs, I'm pretty sure GF has a suitable 20nm process (the one they're allegedly using on Nolan and Amur) ready for it. Of course, I'll believe it when I see it.
 
If Sony and MS need die-shrinks on their APUs, I'm pretty sure GF has a suitable 20nm process (the one they're allegedly using on Nolan and Amur) ready for it. Of course, I'll believe it when I see it.


GF 20nm is cancelled. They are moving from 28nm to Samsung 14nm.
 
You sure? Granted I had thought both Nolan and Amur were going to be 20nm, but . . .

http://wccftech.com/amds-x86-nolan-apu-fabricated-28nm-process-20nm/

20nm Amur and 28nm Nolan? Again, it's wccftech, so make of it what you will. Nolan's on its way though; updates for it are showing up in some software like AIDA64.


Well, the SkyBridge slides still list 20nm (but that could be TSMC), and that's about all the evidence I've seen of anything 20nm.

AMD has been saying that 28nm parts (e.g. Carrizo) will be replaced by FinFET parts.

[Image: amd-skybridge-roadmap-2-100266192-orig.png — AMD SkyBridge roadmap slide]
 
In what way have office-type tasks changed in the last 5 years so that they should now require 3-4x the GPU performance compared to 2010?

Try comparing websites back then and now, for example. And that includes the companies' own websites and data products.

Plus, again, it's not just sold to offices. It's also sold to the average Joe.

In that timeframe we also got, for example, 3D-accelerated Flash and hardware-accelerated browsers.
 
You still don't get it. Nobody buys IGPs for their performance.

You still don't get it; nobody buys Celerons/Pentiums for their CPU performance either. Current entry-to-medium-level AMD and Intel CPU performance is good enough for 90% or more of consumers.

The reason Intel sells more has nothing to do with CPU/iGPU performance for those segments.
 
You still don't get it; nobody buys Celerons/Pentiums for their CPU performance either. Current entry-to-medium-level AMD and Intel CPU performance is good enough for 90% or more of consumers.

The reason Intel sells more has nothing to do with CPU/iGPU performance for those segments.

The funny thing is, reading reviews of AMD-powered desktops sold at Best Buy, all the complaints are Windows 8.1 gripes or OEM design choices (can't upgrade the graphics card, no physical PCI-E slot, and no push for SSDs means slow HDDs still).

Intel is bigger than AMD, they have an amazing supply chain and a crazy amount of products targeted for OEM integration.

Looking at all-in-one PCs, sorted by best selling:

http://www.bestbuy.com/site/searchp...abcat0501005&qp=soldby_facet=Sold By~Best Buy

Number 1 is the $1,269.99 iMac

Number 2 is the $679.99 A8 powered HP Pavilion

AMD *is* doing something right, but in a lot of hardcore enthusiasts' eyes they are a complete and utter failure.
 
Try comparing websites back then and now, for example. And that includes the companies' own websites and data products.

Plus, again, it's not just sold to offices. It's also sold to the average Joe.

In that timeframe we also got, for example, 3D-accelerated Flash and hardware-accelerated browsers.

Not nearly enough to justify requiring a 3-4x performance improvement. I.e., iGPU performance has increased far more than the average "office type workload" has during the last 5 years.

So it looks like Intel does not consider the iGPU to be "good enough", nor did they do that 5 years ago.
 
Last edited:
Not nearly enough to justify requiring a 3-4x performance improvement.

So it looks like Intel does not consider the iGPU to be "good enough", nor did they do that 5 years ago.

Intel's iGPU is already good enough for most scenarios outside gaming; what they are doing is extending the performance envelope in order to capture more revenue share from this market and to extend the capabilities of ultrathin devices. Basically, this extra GPU performance doesn't add much value beyond certain market brackets. That said, they are doing it with incremental improvements each generation and without sacrificing their cost structure; that's why you have seen Core i3 chips around 100mm^2 since the Sandy Bridge generation, and only now, at 14nm, is Intel reaching the 50% GPU proportion.

AMD took a completely different approach, devoting 50% of its chips to the iGPU since 32nm and in the meantime trashing its cost structure with this decision.

Intel certainly could use more iGPU power, but by no means does this vindicate AMD's approach. Quite the opposite: the fact that a better iGPU couldn't make up for the poor performance of their processors is a testament to how dumb the decision was to devote 50% of their die area to something that doesn't really matter for the majority of the market.
 