
Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand


Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
So I take it there isn't information on what sort of EU setup the mobile Haswell Pentium and i3s will be running? Guess it's too soon, probably won't see these models until winter or later.
 

lagokc

Senior member
Mar 27, 2013
808
1
41

Racan

Golden Member
Sep 22, 2012
1,321
2,403
136
Sooner or later the ROI for discrete cards will be too low to continue.

So Intel will end up destroying PC gaming eventually. How can it survive on Intel's integrated GPUs as the only option left on the market?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I guess "16.8 fps 1280x800 2xAA" technically falls into the category of 'will play' though I don't know many people that would actually attempt to play through the game on a slideshow like that.

http://www.techpowerup.com/reviews/Intel/Core_i7_3770K_Ivy_Bridge_GPU/19.html

If you're happy only playing games released before 2006 then Ivy Bridge's iGPU should be fine.

Clearly 100% of the tablet market is looking for a tablet or ultrabook that can play crysis 3 at max settings :rolleyes:

In fact, it's clear that gaming is the primary reason that macbooks and ipads are not selling well /sarcasm. They can't play crysis 3 at max settings folks! Intel should just call it a day and give up. Getting their chips into the smallest ultra portable devices is a fruitless endeavor, because of crysis 3.
 
Last edited:

Blandge

Member
Jul 10, 2012
172
0
0
I guess "16.8 fps 1280x800 2xAA" technically falls into the category of 'will play' though I don't know many people that would actually attempt to play through the game on a slideshow like that.

http://www.techpowerup.com/reviews/Intel/Core_i7_3770K_Ivy_Bridge_GPU/19.html

If you're happy only playing games released before 2006 then Ivy Bridge's iGPU should be fine.

I have testimonial evidence from a friend who played Skyrim with HD4000 graphics for quite some time, and he said it was very playable (Using auto-detect settings). So much so that he beat the entire game with it without any issue.
 

lagokc

Senior member
Mar 27, 2013
808
1
41
Clearly 100% of the tablet market is looking for a tablet or ultrabook that can play crysis 3 at max settings :rolleyes:

In fact, it's clear that gaming is the primary reason that macbooks and ipads are not selling well /sarcasm. They can't play crysis 3 at max settings folks! Intel should just call it a day and give up. Getting their chips into the smallest ultra portable devices is a fruitless endeavor, because of crysis 3.

You're missing the point which was that the iGPU still can't play games well. If you don't want to play games you're fine with an iGPU, if you want to play modern games you need a real GPU or at the very least an A10, end of story.
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
Haswell will go against Richland (+20% on the GPU side, +10-15% on the CPU side vs. Trinity 4600M/5800K). All Haswell parts below GT3 with eDRAM will most likely end up slower than the mobile 5750's GPU (that's the 4600M's successor in the mobile segment, based on Richland). Even those parts will most likely cost considerably more than the 5750, so AMD will have a noticeable price advantage. GT3e can be faster I suppose, but with a huge price premium, and in the price range of such a notebook one would WANT to get a mobile GeForce or Radeon instead of an iGPU.

When you are about to shell out major $$$$ you want something that is better than GT3e. So my point is that I have no clue where Intel will slot GT3e mobile parts, and who will buy this kind of HW setup if there is no Optimus-style switchable graphics in there. This means that you actually buy only the CPU part of Haswell and GT3e will do nothing for you in anything GPU related (maybe QuickSync can be used, I don't know). AMD will hold a great price advantage for a part that will be better vs. the majority of the iGPU range found in Haswell, plus a more than good enough x86 (CPU) part - a 2.5-3.5GHz Trinity will do anything you throw at it, albeit a lot slower than an 8T Haswell could do it (for a fraction of Haswell's price).
 

lagokc

Senior member
Mar 27, 2013
808
1
41
GT3e seems like a practice part more than anything. Intel can't go straight from absolutely terrible graphics to a winner in one generation so they're putting a lot of effort into improving their architecture and drivers so that sometime around Skymont they might finally catch up to AMD in the graphics department.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You're missing the point which was that the iGPU still can't play games well. If you don't want to play games you're fine with an iGPU, if you want to play modern games you need a real GPU or at the very least an A10, end of story.

The entire point, which is lost on you, is that nobody cares. Intel is intent on putting this technology into ultra portable computing devices, and they're succeeding. Additionally, there are other metrics that matter a lot more than desktop computing and dedicated gaming. Sure, gaming is nice, but Intel is creating these devices to drive higher resolution retina displays and to do so at extremely low power consumption levels. Intel is absolutely slaughtering AMD in this respect; AMD cannot compete on any metric except desktop graphics. That's it. The A10 is a desktop graphics part and Intel does not care about desktop graphics.

So AMD can make a graphics chip that does okay in a mini ITX box. Unless you're completely oblivious to the market, that market is headed to ultra portable devices such as tablets and super slim ultrabooks. In these devices, intel is the superior product, period. Nobody buys a macbook air for the ability to play crysis 3. Nobody buys an ipad for crysis 3. Yet these products sell millions because the average consumer wants an ultra portable product with higher resolution displays - intel is creating technology to be the best for that market. AMD isn't. Intel is creating a product which removes the need entirely for discrete graphics - Apple can now produce macbook pro portables without discrete nvidia chips. AMD cannot do that.

It's time to stress again that nobody cares about how well the A10 does in crysis 3. If intel can get GT650M level graphics in a product with 10 hours of battery life, that is a product which will sell like hotcakes. AMD cannot create a product with that balance of efficiency and graphics performance.

And that is what intel is going after - a delicate balance of efficiency with excellent graphics performance capable of driving high resolution displays. Now, does AMD have such a product which can fit into an AIO or ultrabook with the same power efficiency? No. In fact, I believe that a product of the same efficiency level from AMD would have worse graphics performance than GT3.
 
Last edited:

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
AMD will hold a great price advantage for a part that will be better vs. the majority of the iGPU range found in Haswell, plus a more than good enough x86 (CPU) part - a 2.5-3.5GHz Trinity will do anything you throw at it, albeit a lot slower than an 8T Haswell could do it (for a fraction of Haswell's price).


Do you realize that this "price advantage" is effectively killing AMD?
 

MightyMalus

Senior member
Jan 3, 2013
292
0
0
So my point is that I have no clue where Intel will slot GT3e mobile parts, and who will buy this kind of HW setup if there is no Optimus-style switchable graphics in there. This means that you actually buy only the CPU part of Haswell and GT3e will do nothing for you in anything GPU related (maybe QuickSync can be used, I don't know).

If a 15"-17" notebook with GT3e ends up $150-$300 cheaper than one with a dgpu, I'd be interested in one.

The way I see it GT3>Richland>NewGT2. And considering that AMD keeps saying that Kaveri will be a 2013 chip....AND considering that the GT3e is rumored to be out around October. I'll wait for more info.

If intel can get GT650M level graphics in a product with 10 hours of battery life, that is a product which will sell like hotcakes.

True, but Intel can't. Don't mix the High Performance Quad Core with the Ultrabook chips...which don't have the GT3e.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
If a 15"-17" notebook with GT3e ends up $150-$300 cheaper than one with a dgpu, I'd be interested in one.

The way I see it GT3>Richland>NewGT2. And considering that AMD keeps saying that Kaveri will be a 2013 chip....AND considering that the GT3e is rumored to be out around October. I'll wait for more info.



True, but Intel can't. Don't mix the High Performance Quad Core with the Ultrabook chips...which don't even have the GT3e.

The bottom line is that intel is creating a fine balance between graphics performance with as little power consumption as possible. With all metrics considered, intel has better products than AMD at every power consumption level.

Intel doesn't care about chasing the gaming market - that isn't the intent of GT3E. I'm sure it will game fairly well since it matches the nvidia GT650M, yet that isn't intel's sole intent. They *want* this technology in smaller and more efficient devices with high resolution displays. By the time Broadwell comes around, they will make another leap towards doing that. The main theme I'm getting at is that you can cite gaming performance all you want, but that isn't a driving motivator for Intel. Period.
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
Intel doesn't care about chasing the gaming market - that isn't the intent of GT3E. I'm sure it will game fairly well since it matches the nvidia GT650M, yet that isn't intel's sole intent.
Benchmarks please. No making stuff up.
 

MightyMalus

Senior member
Jan 3, 2013
292
0
0
I'll raise my hand here on something!

Can someone please explain to me how the higher performing Haswell-U part has a lower model number than the lower performing one? --> i7-4650U (15W) vs i7-4558U (28W) <--
Isn't it confusing?

Intel doesn't care about chasing the gaming market - that isn't the intent of GT3E. I'm sure it will game fairly well since it matches the nvidia GT650M, yet that isn't intel's sole intent.

It does not match the GT650M, at least not by 3DMark11 score.
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I'll raise my hand here on something!

Can someone please explain to me how the higher performing Haswell-U part has a lower model number than the lower performing one? --> i7-4650U (15W) vs i7-4558U (28W) <--
Isn't it confusing?

It's very simple. They are putting a premium on lower TDP.

It does not match the GT650M, at least not by 3DMark11 score.

The 47W version should reach 2200 3DMark11 points and the 55W cTDPup version should get 2300. If 3DMark11 is a direct indicator, it'd be in line with GT 650M. I don't think they are aiming for the fastest GT 650M, but the middle. We shall see though.

http://en.wikipedia.org/wiki/Compar...ocessing_units#GeForce_600M_.286xxM.29_Series
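Taking the post's own estimates at face value (2200 points at 47 W, 2300 at 55 W cTDPup; both are projections, not measurements), a quick back-of-the-envelope sketch shows how little the cTDPup headroom actually buys:

```python
# Back-of-the-envelope perf/W using the estimates quoted above:
# 2200 3DMark11 points at 47 W, 2300 at the 55 W cTDPup setting.
parts = {"GT3e 47W": (2200, 47), "GT3e 55W cTDPup": (2300, 55)}

for name, (score, tdp) in parts.items():
    print(f"{name}: {score / tdp:.1f} 3DMark11 points per watt")

# The extra 8 W buys only a small score bump:
gain = (2300 - 2200) / 2200 * 100
print(f"cTDPup score gain: {gain:.1f}%")  # ~4.5%
```

By this (very rough) measure the 47 W configuration is the more efficient of the two; the cTDPup setting trades about 17% more power for under 5% more score.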
 
Last edited:

lagokc

Senior member
Mar 27, 2013
808
1
41
So Intel will end up destroying PC gaming eventually. How can it survive on Intel's integrated GPUs as the only option left on the market?

Fortunately in order for iGPUs to destroy the PCIe card market they have to actually become powerful enough to compete with standalone cards. If they don't become that powerful they can't compete, if they do become that powerful then we have iGPUs that are powerful and PC gaming won't die. Gamers win either way.

Worst case (as in over-the-top doomsday) scenario is Intel's iGPUs eat away at the standalone card market, AMD goes bankrupt and ceases to make chips, and Intel, seeing no competition, drops support for the PCIe x16 slot to establish a monopoly on GPUs and increase their % of the bill of sale. Nvidia responds by selling 64-bit ARM chips with the latest Nvidia graphics for desktops and laptops, and Intel/Nvidia duke it out for a few years. Things get weird when Windows RT/Linux become the dominant PC gaming platforms though...

That reads like a sci-fi horror novel doesn't it?
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Fortunately in order for iGPUs to destroy the PCIe card market they have to actually become powerful enough to compete with standalone cards.

No they don't.

They just have to become powerful enough that the standalone (discrete) card market becomes unsustainably small. Eliminate the low end cards where most of the sales volumes are, and then all the costs for the companies producing the chips and cards have to be spread over a much lower number of cards.

So now the entry point of a decent gaming card isn't $150 anymore, now it's $400. Very few people will spend that much money.
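The volume/fixed-cost squeeze described above can be sketched with toy numbers. The R&D and per-card build-cost figures below are entirely hypothetical, chosen only to show the shape of the curve:

```python
# Hedged sketch of the amortization argument: fixed R&D and tooling costs
# spread over fewer cards push up the break-even price per card.
# All figures are hypothetical, picked only to illustrate the mechanism.

def break_even_price(fixed_costs: float, marginal_cost: float, volume: int) -> float:
    """Minimum per-card price that recovers fixed costs at a given volume."""
    return fixed_costs / volume + marginal_cost

FIXED = 300_000_000   # hypothetical R&D + tooling for one GPU generation, $
MARGINAL = 80         # hypothetical per-card build cost, $

# Healthy market: low-end cards supply most of the volume.
print(break_even_price(FIXED, MARGINAL, 3_000_000))   # 180.0 -> ~$180 per card

# iGPUs eat the low end: same fixed costs, a fraction of the volume.
print(break_even_price(FIXED, MARGINAL, 900_000))     # ~$413 per card
```

The fixed costs don't shrink when the low-end volume disappears, so the break-even price climbs steeply as units fall, which is exactly the $150-to-$400 entry-point jump being argued.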
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
No they don't.

They just have to become powerful enough that the standalone (discrete) card market becomes unsustainably small. Eliminate the low end cards where most of the sales volumes are, and then all the costs for the companies producing the chips and cards have to be spread over a much lower number of cards.

So now the entry point of a decent gaming card isn't $150 anymore, now it's $400. Very few people will spend that much money.

This is correct. Nvidia has a lot to lose because they own most of the market and don't have their own APU. They do still have a chance because x86 is becoming less relevant with each passing day and Android is winning the OS war.

Intel's pricing will mitigate the damage they could do. AMD still has a chance to win with crossfire capability but they must improve vastly in that area.
 

386user

Member
Mar 11, 2013
66
0
16
What desktop GPU does the GT2/GT3/GT3e most closely match?

or is this not available yet...

edit: mobile haswell.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Fortunately in order for iGPUs to destroy the PCIe card market they have to actually become powerful enough to compete with standalone cards.

No they don't.

They just have to become powerful enough that the standalone (discrete) card market becomes unsustainably small. Eliminate the low end cards where most of the sales volumes are, and then all the costs for the companies producing the chips and cards have to be spread over a much lower number of cards.

So now the entry point of a decent gaming card isn't $150 anymore, now it's $400. Very few people will spend that much money.

lagokc, the analogy you need to conceptualize here is one with which Phynaz has direct and personal experience...namely the utter slaughter that low-cost, (relatively) low-performing x86 CPUs from Intel did to the big-iron processor segment in the 90's.

x86 was like a wind-blown wild-fire in that market demographic and it left every legacy big-iron provider in bankruptcy or irrelevance (other than IBM and SUN, although one could argue neither are all that relevant now).

Now the point here is not that x86 was special or made magically disruptive market moves, because it wasn't. x86 was simply disruptive enough to destabilize the tenuous economics of the existing big-iron market segment...and once it tipped that balance past the point where revenue could no longer fund the R&D needed to keep big-iron low-volume CPUs ahead of the performance advances of low-cost, high-volume x86 CPUs, the big-iron market as it was then known was dead.

The same slippery slope can happen in discrete GPUs. Once low-cost iGPU/APU products hit critical volumes and manage to revenue-starve the R&D engines for future discrete GPU products the game will be over for them as their amortized costs will spiral out of control as the volumes sold get lower and lower.

It is a tale that is not unique to x86, it happens the world over in industry and nation alike. See what cheap labor in Asia and open markets via the World Trade Organization did to manufacturing in Europe and N.America. It is a rather generic and universal phenomenon.
 
Aug 11, 2008
10,451
642
126
lagokc, the analogy you need to conceptualize here is one with which Phynaz has direct and personal experience...namely the utter slaughter that low-cost, (relatively) low-performing x86 CPUs from Intel did to the big-iron processor segment in the 90's.

x86 was like a wind-blown wild-fire in that market demographic and it left every legacy big-iron provider in bankruptcy or irrelevance (other than IBM and SUN, although one could argue neither are all that relevant now).

Now the point here is not that x86 was special or made magically disruptive market moves, because it wasn't. x86 was simply disruptive enough to destabilize the tenuous economics of the existing big-iron market segment...and once it tipped that balance past the point where revenue could no longer fund the R&D needed to keep big-iron low-volume CPUs ahead of the performance advances of low-cost, high-volume x86 CPUs, the big-iron market as it was then known was dead.

The same slippery slope can happen in discrete GPUs. Once low-cost iGPU/APU products hit critical volumes and manage to revenue-starve the R&D engines for future discrete GPU products the game will be over for them as their amortized costs will spiral out of control as the volumes sold get lower and lower.

It is a tale that is not unique to x86, it happens the world over in industry and nation alike. See what cheap labor in Asia and open markets via the World Trade Organization did to manufacturing in Europe and N.America. It is a rather generic and universal phenomenon.

That is possible, but what about workstation cards? Will iGPUs ever replace those? If they continue to be made, a big part of the R&D could be amortized across them.

Edit: Again, by the time Kaveri comes out, or close thereafter, AMD should have 20nm discrete GPUs out, which will raise the bar even further. To my mind iGPUs will always be playing catch-up to steadily improving discrete cards.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
That is possible, but what about workstation cards? Will IGPs ever replace those? If they continue to be made, a big part of the R and D could be amortized to that.

Sure, but prices will go up accordingly.

No different than what you see in terms of pricing commanded by Itanium, SUN Sparc, or IBM Power processors now...products which live in that low-volume high-ASP niche already.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
No they don't.

They just have to become powerful enough that the standalone (discrete) card market becomes unsustainably small. Eliminate the low end cards where most of the sales volumes are, and then all the costs for the companies producing the chips and cards have to be spread over a much lower number of cards.

So now the entry point of a decent gaming card isn't $150 anymore, now it's $400. Very few people will spend that much money.

There is only one problem:
Intel will lose to the ARM SoCs. All of them will become powerful enough for casual users. So instead of buying a Haswell CPU, people will go and buy a product with a Tegra X.
If there is no reason to buy discrete GPUs then there is no reason to buy Haswell.