Haswell will rival graphics performance of today's discrete cards!


Idontcare

Elite Member
Oct 10, 1999
That Llano die shot is the cropped version. That has been known for quite some time.

An uncropped die shot shows the GPU portion consuming a considerable amount of die space; see the Llano die at the center of this AMD slide. At first glance it's a good 1/3 of the die, at least.

And it's a good 2/3 of the xtor budget.

(Not that this affects any of the accounting metrics of relevance; I just thought I'd throw that out there because it really speaks to the fact that the core logic circuits represent a rather diminutive portion of the total circuit space on the chip.)

...as there's not much point in >4 cores for mainstream desktop SKUs and single-core IPC is already pretty good.

The number of cores, and the IPC of those cores, are a necessity for competitive reasons. It's not a "nice to have"; it's a "need to have" when your competitor has it.

IPC could be just fine, but if your competitor is fielding a product with 2x the IPC, then you are going to have issues selling your "already pretty good IPC" products.

There is a reason Intel has endeavored to make the IPC of Sandy Bridge as strong as it is. It might be overkill for the vast majority of consumers but consumers have historically been willing to pay for the privilege of owning an overkill CPU.

We do it with everything, be it our TVs or our cars or our refrigerators.

Do I need 16GB of RAM? Not at all, but it costs so little compared to the rest of my investment in the system to go ahead and equip it with 16GB versus 8GB or 12GB that I do it anyway. Total overkill, and yet an entire industry exists on the basis of creating 4GB DIMM densities.

I'm glad Llano is here; it's a great second milestone (Zacate was a great starting point for AMD), but the third and subsequent milestones are going to need to be paired with something that has a lot more horsepower in the CPU department if they hope to stay competitive against whatever Intel has cooking up for 2012 and 2013.
 

GammaLaser

Member
May 31, 2011
The number of cores, and the IPC of those cores, are a necessity for competitive reasons. It's not a "nice to have"; it's a "need to have" when your competitor has it.

Certainly.

IPC could be just fine, but if your competitor is fielding a product with 2x the IPC, then you are going to have issues selling your "already pretty good IPC" products.

Sure, except the competitor doesn't have such a thing, and even if they did, suddenly designing a CPU with 2x the IPC of the previous generation is next to impossible these days without going overboard on power or die area.

There is a reason Intel has endeavored to make the IPC of Sandy Bridge as strong as it is. It might be overkill for the vast majority of consumers but consumers have historically been willing to pay for the privilege of owning an overkill CPU.

We do it with everything, be it our TVs or our cars or our refrigerators.

Do I need 16GB of RAM? Not at all, but it costs so little compared to the rest of my investment in the system to go ahead and equip it with 16GB versus 8GB or 12GB that I do it anyway. Total overkill, and yet an entire industry exists on the basis of creating 4GB DIMM densities.

I'm glad Llano is here; it's a great second milestone (Zacate was a great starting point for AMD), but the third and subsequent milestones are going to need to be paired with something that has a lot more horsepower in the CPU department if they hope to stay competitive against whatever Intel has cooking up for 2012 and 2013.

I just think graphics performance is the new low-hanging fruit when it comes to what needs to be improved in Intel's upcoming architectures.

Intel doesn't own a graphics division selling discrete chips, so they have nothing to lose by trying to bring integrated graphics into the performance categories occupied by the discrete cards of the past.

And it would be a lot easier to convince people to upgrade their CPUs if the difference were much more noticeable (e.g. graphics), not just more cores (which need better software support) or more single-threaded performance (where Sandy Bridge-class CPUs are "good enough" for most apps: web browsing, office, etc.). Where AMD now needs to focus on CPU performance after introducing a nice IGP with Llano, Intel should go the opposite way with its IGPs. A fast, well-balanced all-in-one CPU/GPU combo would be a real winner.

I'm certainly speaking of the general public, who buy OEM PCs primarily on absolute $$$, not the enthusiast crowd, who see little reason to watch their precious CPU die area increasingly consumed by unused IGP logic. :D
 

Lonbjerg

Diamond Member
Dec 6, 2009
I just hope I can still buy CPUs with none of the IGP/APU crap...I'm a dedicated GPU man...my life is too short for onboard "meh"...
 

AnandThenMan

Diamond Member
Nov 11, 2004
I'm sure Intel, with its vast influence, can make pretty much anything happen.
Larrabee is a good example of how unlimited resources are not always a recipe for success.
I don't really get the "they are 3 or so generations behind AMD and Nvidia" argument, as they have access to each new part the day it launches and can dissect and experiment at their pleasure.
Intel is at least 3 generations behind, and it's falling further behind.
Intel has read the writing on the wall; it says "Fusion/APU", and they won't just stand around idly watching AMD have its cake AND eat it too.
So what is Intel waiting for? AMD is moving forward with Fusion; where is Intel's answer? When will Intel have a first-generation DX11 core? By the time they do, AMD and Nvidia will be well into their 3rd or even 4th iterations. If Intel could lead in graphics, they would. To say otherwise is silly.
There is a reason Intel has endeavored to make the IPC of Sandy Bridge as strong as it is. It might be overkill for the vast majority of consumers but consumers have historically been willing to pay for the privilege of owning an overkill CPU.
It's hardly that simple. Intel is #1 in graphics despite having the poorest offerings. The consumer has been "force-fed" Intel products for the better part of a decade (if not longer). While I applaud Intel for continuing to push technology forward, Intel's market share has less to do with product competency and more to do with market dominance through questionable practices. Right now Intel has a very unbalanced platform; 99% of people simply don't need a faster CPU, they just don't.
I'm glad Llano is here; it's a great second milestone (Zacate was a great starting point for AMD), but the third and subsequent milestones are going to need to be paired with something that has a lot more horsepower in the CPU department if they hope to stay competitive against whatever Intel has cooking up for 2012 and 2013.
All the CPU power in the world is pointless without competent graphics, at least for the vast majority of the market. People are so hung up on "AMD's CPU performance is too slow" but give Intel a complete pass when its graphics performance is borderline worthless. Going by history, AMD is capable of taking the performance lead in CPUs; Intel has never done that in graphics. Something has to drastically change at Intel for them to finally take the GPU seriously.
 

Idontcare

Elite Member
Oct 10, 1999
I just hope I can still buy CPUs with none of the IGP/APU crap...I'm a dedicated GPU man...my life is too short for onboard "meh"...

Do you feel that way about the FPU? x87? SSE? AVX?

The APU concept is really little more than a glorified ISA extension. Back in the day, such a monumental advance in technology would have garnered little more outward distinction than a shift of the product name from an "SX" to a "DX"...nowadays similar transitions are heralded from on high as "the future is FUSION!!!!!!!!"

Regardless of the marketing involved, progress has a funny way of intentionally NOT being "we've designed it so that one size will fit all" but rather "one size WILL fit all, regardless of whether or not you like how it fits".
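
To make the "ISA extension" point concrete: the floating-point work that once required a separate co-processor chip is now a single instruction away via ordinary intrinsics. A quick illustrative sketch of my own (it assumes an AVX-capable CPU such as Sandy Bridge, compiled with something like gcc -mavx):

#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    /* Eight single-precision adds in one AVX instruction, work that
     * once belonged to a discrete x87 co-processor chip. */
    __m256 a = _mm256_set1_ps(1.5f);
    __m256 b = _mm256_set1_ps(2.5f);
    __m256 sum = _mm256_add_ps(a, b);

    float out[8];
    _mm256_storeu_ps(out, sum);
    printf("%.1f\n", out[0]); /* prints 4.0 */
    return 0;
}

Nobody brands that as the future of computing; it's just the CPU absorbing another unit, the same way it absorbed the FPU.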
 

AnandThenMan

Diamond Member
Nov 11, 2004
Back in the day, such a monumental advance in technology would have garnered little more outward distinction than a shift of the product name from an "SX" to a "DX"...nowadays similar transitions are heralded from on high as "the future is FUSION!!!!!!!!"
The more I read this, the more I think you must be kidding, right? The inevitable melding of the traditional CPU and GPU is hardly akin to a very minor architectural tweak. Taking the best aspects of the CPU and GPU and combining them into a heterogeneous core is one of the most exciting things to come along since we've been using home computers. There are so many possibilities afforded by doing this; a chip that seamlessly utilizes the strengths of a CPU and GPU for whatever workload you toss at it is fantastic. Very much looking forward to this.
 

bryanW1995

Lifer
May 22, 2007
I just changed the OP to reflect the fact that it was actually DT's claim, not Intel's, that Haswell would be as good as today's discrete offerings.
 

Lonbjerg

Diamond Member
Dec 6, 2009
The more I read this, the more I think you must be kidding, right? The inevitable melding of the traditional CPU and GPU is hardly akin to a very minor architectural tweak. Taking the best aspects of the CPU and GPU and combining them into a heterogeneous core is one of the most exciting things to come along since we've been using home computers. There are so many possibilities afforded by doing this; a chip that seamlessly utilizes the strengths of a CPU and GPU for whatever workload you toss at it is fantastic. Very much looking forward to this.

If only the performance of this combined PR-dud weren't so MEH...you might have a point.
But I see no benefit in IGP/APUs whatsoever...when a dedicated GPU makes IGP/APUs look like a joke.
 

tijag

Member
Apr 7, 2005
If only the performance of this combined PR-dud weren't so MEH...you might have a point.
But I see no benefit in IGP/APUs whatsoever...when a dedicated GPU makes IGP/APUs look like a joke.

If there were a way to write code that would automagically take advantage of the best resource available, whether that be the GPU portion or the CPU portion, then even a 'joke' APU would still offer a kind of flexibility and computing power that isn't available right now.

This is certainly going in baby steps, though. We aren't at the destination yet, but it seems we are on the road to something interesting.
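
For what it's worth, here is a rough sketch of what that can look like today with OpenCL: ask the runtime for a GPU, fall back to the CPU if there isn't one, and run the same kernel source either way. (My own illustrative example, not anything a vendor ships; the "scale" kernel is made up and most error checking is omitted for brevity.)

#include <stdio.h>
#include <CL/cl.h>

/* The same kernel source runs on whichever device we end up with. */
static const char *src =
    "__kernel void scale(__global float *buf, float factor) {\n"
    "    size_t i = get_global_id(0);\n"
    "    buf[i] *= factor;\n"
    "}\n";

int main(void)
{
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);

    /* Prefer the GPU portion; fall back to the CPU if there isn't one. */
    if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS)
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_CPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "scale", NULL);

    /* Double 1024 floats on whichever device was selected. */
    float data[1024];
    for (int i = 0; i < 1024; i++) data[i] = (float)i;

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    float factor = 2.0f;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(factor), &factor);

    size_t n = 1024;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[10] = %.1f\n", data[10]); /* expect 20.0 */

    clReleaseMemObject(buf);
    clReleaseKernel(k);
    clReleaseProgram(prog);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}

Even this still means explicit buffers and queues; the truly "automagic" part, where the runtime picks the best device per workload for you, is exactly the destination we haven't reached yet.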
 

AnandThenMan

Diamond Member
Nov 11, 2004
If only the performance of this combined PR-dud weren't so MEH...you might have a point.
But I see no benefit in IGP/APUs whatsoever...when a dedicated GPU makes IGP/APUs look like a joke.
I find your post very short-sighted.
 

GammaLaser

Member
May 31, 2011
If only the performance of this combined PR-dud weren't so MEH...you might have a point.
But I see no benefit in IGP/APUs whatsoever...when a dedicated GPU makes IGP/APUs look like a joke.

Special attention needs to be paid to the target market referred to in the OP...and that is this new PC segment Intel calls the "ultrabook." Intel wants to reach the people who are looking for laptop performance (i.e. better than netbook/Atom) in a tablet-esque form factor. For this segment to work from a technical standpoint, Intel needs to cram as much functionality into as little space as possible, consume as little power as possible, and be inexpensive, which can only happen with an APU that gives good CPU AND graphics performance, combined with power-saving advancements like 22nm tri-gate transistor tech.

That's where the motivation to get "discrete" class performance in an IGP comes from.
 

Phynaz

Lifer
Mar 13, 2006
The more I read this, the more I think you must be kidding, right? The inevitable melding of the traditional CPU and GPU is hardly akin to a very minor architectural tweak.

Dude, the FPU used to be an optional second chip that cost more than the CPU.

It's exactly the same thing.
 

Madcatatlas

Golden Member
Feb 22, 2010
Do you feel that way about the FPU? x87? SSE? AVX?

The APU concept is really little more than a glorified ISA extension. Back in the day, such a monumental advance in technology would have garnered little more outward distinction than a shift of the product name from an "SX" to a "DX"...nowadays similar transitions are heralded from on high as "the future is FUSION!!!!!!!!"

You do realize where we are now in comparison to where we were "back in the day"?

That's the single most important reason why it is "heralded from on high". Marketing, and "telling people" that you are right and that you know the future and will plan accordingly, is key to all modern business practices.

The APU isn't a concept anymore; it's here, it's working, and it's doing a great job of it, despite some people's obvious disdain for it. Will it be all that AMD "heralds" it will be? Time will tell.
 

Cerb

Elite Member
Aug 26, 2000
If only the performance of this combined PR-dud weren't so MEH...you might have a point.
But I see no benefit in IGP/APUs whatsoever...when a dedicated GPU makes IGP/APUs look like a joke.
Compared to notebooks that have that option, the base price will go up. A better IGP only competes with the lower end of discrete GPUs.

If you see no need for good IGPs, then ignore them. The people who really wanted Atom to improve over the years, who wouldn't know what to do with their CPU power but get annoyed at stuttering videos and jerky windows, and who would prefer to save a buck (or are forced to by their corporate masters), have good use for a good IGP. IGPs are good enough for CAD now, for Christ's sake.
 

jvroig

Platinum Member
Nov 4, 2009
The more I read this, the more I think you must be kidding, right? The inevitable melding of the traditional CPU and GPU is hardly akin to a very minor architectural tweak. Taking the best aspects of the CPU and GPU and combining them into a heterogeneous core is one of the most exciting things to come along since we've been using home computers. There are so many possibilities afforded by doing this; a chip that seamlessly utilizes the strengths of a CPU and GPU for whatever workload you toss at it is fantastic. Very much looking forward to this.

Perhaps you remember the "math co-processors"? And the "minor architectural tweak" that made them "disappear"? How about using your own words to see if they fit:

Taking the best aspects of the CPU and math co-processor and combining them into a heterogeneous core is one of the most exciting things to come along since we've been using home computers. There are so many possibilities afforded by doing this; a chip that seamlessly utilizes the strengths of a CPU and math co-processor for whatever workload you toss at it is fantastic. Very much looking forward to this.
Fits. And lo and behold, it turned out so awesome that you already take it for granted. The FPU in your CPU today used to be a separate chip, was very expensive, and is right now far more useful in general computing than your GPU.

The melding of CPU and GPU, as you call it, is purely marketing at this point, and there is no reason at the moment to herald it as "the best thing ever", much less to relegate every other similar advance in CPUs to "minor architectural tweak" status. I am not saying this APU thing will not amount to anything, but it certainly isn't the best thing to have ever happened.


The APU isn't a concept anymore; it's here, it's working, and it's doing a great job of it, despite some people's obvious disdain for it. Will it be all that AMD "heralds" it will be? Time will tell.
From Intel and AMD, yes, the APU is "here". I would say that all the excitement is still "promises", plus executives at Intel, AMD, and their board partners figuring out how much they are earning and saving in platform costs. What does this APU from Intel/AMD offer that on-board IGPs did not already offer? If Intel/AMD had just made this a better on-board IGP, instead of tacking it onto the CPU, would you have noticed any difference at all? The manufacturers would (their engineers, designers, and accountants), but I doubt anybody else really would.
 

AnandThenMan

Diamond Member
Nov 11, 2004
How about using your own words to see if they fit:
Fits. And lo and behold, it turned out so awesome that you already take it for granted. The FPU in your CPU today used to be a separate chip, was very expensive, and is right now far more useful in general computing than your GPU.
I am not sure I understand your line of thinking. You are applying my words and comparing the APU to the FPU, and then stating how useful and valuable the FPU turned out to be as part of the CPU. Fair enough.

But then you say this:
The melding of CPU and GPU, as you call it, is purely marketing at this point...
The functionality (and importance) of the GPU is not exactly trivial; to call the APU "pure marketing" is highly dubious, IMO. Being in its infancy, naturally the capabilities (and actual abilities of the architecture) are just beginning to show. But to apply my words to extol the virtues of one thing, then use the same words to minimize another, is disingenuous to me. I may have been guilty of the same thing, but not intentionally. I don't see the FPU as unimportant, but I certainly don't see it on the same level as integrating a fully functional, stand-alone processor into the CPU.

You are severely selling short the importance of graphics going forward, IMO. Only time will tell, but feel free to come back to this thread in 2-3 years; I am 100% confident in my assertions. Let's not forget Larrabee, a very ambitious attempt to meld the traditional CPU with graphics capabilities, so don't tell me that a graphics processor is not key to the success of Intel, AMD, and others. Intel knows it; they didn't spend several billion dollars just because they were bored with themselves. An APU-like device IS the future; there is no denying it. If Intel just wanted to get into discrete graphics, they would have made a traditional GPU and been done with it, but they went down the inevitable road of an APU (for lack of a better term; I'm not sure I like that moniker) and came up short, at least for now.
 

jvroig

Platinum Member
Nov 4, 2009
I think you are too caught up in your selectively quoted line. Immediately after that, I qualified it with "at the moment". My real "beef" with your post, which I tried to get across, was that you downplayed everything else (including the disappearance of the math co-processors into the FPU of CPUs) to magnify the importance of the APU, whereas:
a.) This importance has not been shown yet
b.) It currently offers nothing on-board graphics could not have offered
c.) There is no indication beyond marketing (at this point) of how awesome this will really turn out to be, whereas the past advancements that you so easily dismissed as "minor tweaks" have already been proven.

That's my one and only "beef".

It is easy to get swept up in the marketing of the day and believe this is the second coming. It would be good for both Intel and AMD if that were the case, and for us as well, since any advancement is cool for us. But as of this moment, it has not yet produced results so great that other similar advances in the past are rightfully relegated to "minor architectural tweaks".

If you had not so callously dismissed the other advances as "minor architectural tweaks", I would have no beef at all with your post.
 

Blitzvogel

Platinum Member
Oct 17, 2010
I know the focus here is "graphics", but is it possible that Haswell will have GPGPU functions as well? Heterogeneous computing is a big part of why the APU exists and why AMD wants to do more than graphics with the graphics cores.
 

AnandThenMan

Diamond Member
Nov 11, 2004
I know the focus here is "graphics", but is it possible that Haswell will have GPGPU functions as well?
I certainly hope so. The sooner the GPU is supported in software as a general-purpose device, the better. I'm a bit surprised more people are not excited about the APU; I honestly believe it will open up a whole new realm of computing. The downside, as usual, is that the software will lag and it will take longer than I'd like, which is why the more hardware out there that supports it, the better.
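
To make "general purpose device" concrete: the promise is that any data-parallel loop can be rewritten as a kernel that the GPU fans out across its execution units. A minimal sketch using a made-up SAXPY routine, shown both ways (my own example, nothing vendor-specific):

/* Serial CPU version: one element at a time. */
void saxpy_cpu(int n, float a, const float *x, float *y)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

/* The equivalent OpenCL kernel: the loop body becomes the kernel,
 * and the runtime launches one work-item per element on the GPU. */
static const char *saxpy_src =
    "__kernel void saxpy(float a, __global const float *x,\n"
    "                    __global float *y) {\n"
    "    size_t i = get_global_id(0);\n"
    "    y[i] = a * x[i] + y[i];\n"
    "}\n";

The catch is the usual one: somebody has to write and ship that second version for every hot loop, which is exactly the software lag I mean.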
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
I don't understand why folks are pooh-poohing ~GT240-level graphics capabilities.
It's enough to play quite a lot of titles out there, e.g. older but still relevant ones such as Guild Wars, TF2, CS:S, L4D, etc.
 

GammaLaser

Member
May 31, 2011
I certainly hope so. The sooner the GPU is supported in software as a general-purpose device, the better. I'm a bit surprised more people are not excited about the APU; I honestly believe it will open up a whole new realm of computing. The downside, as usual, is that the software will lag and it will take longer than I'd like, which is why the more hardware out there that supports it, the better.

Yep, that AMD slide posted on AT's front page reminded me of this :)

[Image: evolving2.jpg]
 

Madcatatlas

Golden Member
Feb 22, 2010
I don't understand why folks are pooh-poohing ~GT240-level graphics capabilities.
It's enough to play quite a lot of titles out there, e.g. older but still relevant ones such as Guild Wars, TF2, CS:S, L4D, etc.

Agree with this 100%. As a sidetrack, imagine the possibilities of having a "console"-like computer in every home, instead of office machines with a built-in Intel "iworkreallyslow" IGP.

You want to play a certain game with your friend over the internet? Just download it. With the increased power in stuff like Llano and presumably Trinity (and future Intel contributions), you have the option and the possibility.

The days when you HAD to have a discrete graphics card to game seem to be over, and that is just great, IMO. The less you restrict, the more availability you create. This could catapult interest in gaming for many people out there.