Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand


MightyMalus

Senior member
Jan 3, 2013
292
0
0
That old table does not have the 4770R. 65W is part of the GT2 line.

Intel made a design switch at the last minute? Really?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Ah, okay, I see what you mean. Yeah, I heard the same thing about the GT3e chips--only mobile chips can get the eDRAM. But I think some desktop chips have the GT3, at least (though weaker ones are more common since the enthusiast need for an APU compared to CPU+dGPU is pretty low right now).

No desktop chips in the form most think of have GT3. Only the R models are GT3, and those are BGA.

Personally, with all the news of PCs, and desktops especially, being declared dead nowadays, I'm not sure why Intel needs more focus there (the traditional desktop) at all. If anything, the shift toward mobile should be pursued with even greater intensity. Since nothing is free, that naturally means less and less focus on the desktop.

Charity-like businesses are cool and all, but they don't last. :)
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I wonder what the performance increase of the GT3e would be if it did not have the 128 MB eDRAM? Is it memory-bandwidth bottlenecked to such a degree that adding it was essential to achieve any reasonable performance improvement?

Well, it's a GPU after all. :)

We can also think of it this way. The A10-4600M Trinity is "only" 20% faster on average than the HD 4000, but the flak the HD 4000 gets is because of the times when it performs drastically below that average. That's likely because there's an area of the HD 4000 architecture where it's especially weak against Trinity's iGPU. So if that's due to, say, fillrate, and it needs 3x the fillrate to eliminate those weak scenarios, it would be worth doing.

Likewise, "aiming" it at GT 650M levels without critical spec parity would mean it would be especially weak in those same scenarios.

For example-

HD 4000: http://www.notebookcheck.net/Intel-HD-Graphics-4000.69168+M5381aa68cbb.0.html?&?recache=true
GT 650M: http://www.notebookcheck.net/NVIDIA-GeForce-GT-650M.71887.0.html

Starcraft 2: HotS 1024x768 Low: 16% advantage for the GT 650M
1366x768 Medium: 117% advantage for the GT 650M
1366x768 High: 196% advantage for the GT 650M
1920x1080 Ultra: 229% advantage for the GT 650M

What's only a ~2x advantage in 1366x768 Medium grows to over 3x in 1920x1080 Ultra.
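A quick way to see how the gap widens is to convert the quoted advantages into speed ratios. A small Python sketch, using exactly the percentages listed above:

```python
# GT 650M advantage over the HD 4000 in StarCraft 2: HotS, taken from
# the notebookcheck pages linked above ("advantage" = 650M/HD4000 - 1).
advantages = {
    "1024x768 Low":    0.16,
    "1366x768 Medium": 1.17,
    "1366x768 High":   1.96,
    "1920x1080 Ultra": 2.29,
}

for setting, adv in advantages.items():
    ratio = 1.0 + adv  # how many times faster the GT 650M actually is
    print(f"{setting:16} GT 650M is {ratio:.2f}x the HD 4000")
```

At Medium the GT 650M comes out to about 2.2x the HD 4000; at Ultra it is about 3.3x, which is the "over 3x" figure above.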
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,264
5,116
136
That old table does not have the 4770R. 65W is part of the GT2 line.

Intel made a design switch at the last minute? Really?

That's the rumour that's going around, yes.

EDIT: Although it's more of a marketing switch than a design switch. It's the same chip that's going into the GT3e 4-core notebooks, obviously, just soldered onto a desktop motherboard and allowed to run at significantly higher TDPs. Changing which SKUs they are going to ship is of course more realistic than doing a genuine redesign at the last minute.
 
Last edited:

uribag

Member
Nov 15, 2007
41
0
61
I thought it was common sense that 3DMark performance ≠ real gaming performance.

Until I see game benchmarks and IMAGE QUALITY results ... (2.5 x crap) = crap
 
Last edited:

MightyMalus

Senior member
Jan 3, 2013
292
0
0
In 3DMark11 (this whole post refers to 3DMark11 only!)

My A10-5800K gets a score of P1407: 1275 graphics, 3899 physics, and 1194 combined. Neither the CPU nor the GPU is overclocked; 8GB of RAM at 1863MHz (per CPU-Z) with loose (non-optimized) timings.

Meanwhile, the 3770K's HD 4000 in the guru3d.com review gets a GPU score of 654 and a physics score of 771, running 8GB at 1600MHz, it seems(?)

legitreviews.com also has it with a physics score of 776 and a GPU score of 658(?)...the picture is tiny. And it's running 8GB of RAM at 1866MHz.


The 4770R looks like roughly 2.9x the 3770K, with a GPU score in the 1900s, putting it above estimated Richland scores.

Personally, I find this hard to believe. This embedded cache must be amazing!
But something is fishy...is the 4770K a GT3 chip? No, right? It shows a 75% increase vs the 3770K...by adding just 4 more EUs. That's 25% more hardware converting into 75% more performance?

Was the EU architecture changed that much, if at all? I can't find any data on iGPU changes from Ivy to Haswell. And this is supposed to be a GT2: no cache, no 40 EUs, none of the new "4th gen Intel graphics". Seems a BIT unreal!

Then compare the "HSW-R" with the "HSW-K": the "R" is scoring something like 120% higher! So now we have 100% more hardware but a 120% increase. This makes more sense; the cache might be the extra boost, since, as we all know, memory matters.

But what explains the above? 25% more hardware = 75% increase? Eh...thoughts? What am I missing?
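For what it's worth, the scaling claims can be sanity-checked in a few lines of Python. The 4770K score here is derived from the claimed 75% uplift rather than measured, and 1900 is just the rough leaked figure, so how far above (or below) linear each step lands depends entirely on which scores you plug in:

```python
# Sanity-check the scaling claims using the 3DMark11 GPU scores above.
gpu_3770k = 654                  # HD 4000, guru3d review
gpu_4770k = gpu_3770k * 1.75     # HD 4600, per the claimed 75% uplift
gpu_4770r = 1900                 # Iris Pro, rough "in the 1900s" figure

def gain(new, old):
    """Fractional performance increase of `new` over `old`."""
    return new / old - 1.0

print(f"4770K vs 3770K: 25% more EUs  -> {gain(gpu_4770k, gpu_3770k):.0%}")
print(f"4770R vs 4770K: 100% more EUs -> {gain(gpu_4770r, gpu_4770k):.0%}")
print(f"4770R vs 3770K: {gpu_4770r / gpu_3770k:.1f}x overall")
```

With these particular numbers the R's gain over the K actually comes out below the EU doubling, so the exact leaked score matters a lot here.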
 

mikk

Diamond Member
May 15, 2012
4,168
2,204
136
Was the EU architecture changed that much, if at all? I can't find any data on iGPU changes from Ivy to Haswell. And this is supposed to be a GT2: no cache, no 40 EUs, none of the new "4th gen Intel graphics". Seems a BIT unreal!


25% more EUs on GT2 is not the only change. In the case of the i7-4770K, there is also a 100MHz higher GPU frequency. Furthermore, Intel tweaked the GPU: non-slice unit performance (tessellator, geometry shader, vertex shader, etc.) is doubled over Ivy Bridge, they added a Resource Streamer to minimize CPU/driver overhead, and they improved texture sampler throughput by up to 4x for some sampler modes.
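As a back-of-envelope check, raw ALU throughput (EU count x clock) only explains part of the gap. This sketch assumes the commonly listed max GPU turbo clocks of 1150MHz for the 3770K's HD 4000 and 1250MHz for the 4770K's HD 4600:

```python
# Raw ALU throughput ratio, HD 4600 (i7-4770K) vs HD 4000 (i7-3770K).
# Clocks are the commonly listed max turbo values, so treat this as an
# estimate; the front-end and sampler changes above are not captured.
eus_hd4000, mhz_hd4000 = 16, 1150
eus_hd4600, mhz_hd4600 = 20, 1250

alu_ratio = (eus_hd4600 * mhz_hd4600) / (eus_hd4000 * mhz_hd4000)
print(f"Raw ALU throughput: {alu_ratio:.2f}x")  # ~1.36x
```

Anything beyond that ~36% in a given benchmark would have to come from the doubled non-slice units, the Resource Streamer, or the faster samplers.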
 
Aug 11, 2008
10,451
642
126
No desktop chips in the form most think of have GT3. Only the R models are GT3, and those are BGA.

Personally, with all the news of PCs, and desktops especially, being declared dead nowadays, I'm not sure why Intel needs more focus there (the traditional desktop) at all. If anything, the shift toward mobile should be pursued with even greater intensity. Since nothing is free, that naturally means less and less focus on the desktop.

Charity-like businesses are cool and all, but they don't last. :)

While I get angry every time I read that "PC is dead" garbage, I also don't really understand the GT3e on the desktop. As is evident from a lot of my posts, I am not a fan of iGPs on the desktop at all. If you want better performance, it is much better to just add a discrete card, while the lowliest iGP is good enough for the average user.

The only place I can see a good fit for this would be all-in-ones or HTPCs. The main thing is that I would expect it to be expensive as well.

Honestly, it is depressing in general that the most positive thread so far about Haswell is regarding the stupid iGP. Come on, Intel: if IPC is topped out, at least give us higher clocks or more cores.
 
Aug 11, 2008
10,451
642
126
Exactly this is the target.

Kind of surprising to see Intel target such a niche market, though. I suppose AIOs are one of the few segments of the consumer market that could actually be growing.

HTPC/SFF I can understand the purpose of, but AIOs don't really appeal to me. They look nice in a store, but they are underpowered, hard to repair, difficult to upgrade, and expensive. Pretty much all the disadvantages of a laptop without really being portable.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Exactly this is the target.

Pff, no offense, but you're wrong if you think Intel is targeting any sort of desktop or HTPC with this technology. The intended market is mobile. Intel created all of this for mobile, and it just so happens that there's a slim desktop market where they can sell a few, so why not? But that wasn't the target. That is why Intel is focusing on graphics performance and efficiency. Like it or not, whether you get angry about it or not, it's time to wake up and face the facts: the desktop is dying a slow, prolonged death, with sales dipping nearly 20% every quarter. The first 3 months of 2013 had zero PC growth and the worst sales recorded in two decades. Desktop sales aren't pretty, and Intel knows this.

That said, there are desktop variants (-R) which will have GT3e. I see these chips being used in AIOs like the iMac, which still sells fairly well; AIOs such as the Dell XPS One have also sold decently. Customers like these types of systems because they're portable, light, and the screen doubles for use with their MacBook or Ultrabook. Unfortunately, demand for traditional "desktops" with a case and such isn't nearly what it used to be, except among enthusiasts. Additionally, the BGA versions have the side effect of being far more efficient than their LGA counterparts; some changes were needed to lower the TDP target, and going BGA was part of that.
 
Last edited:

mikk

Diamond Member
May 15, 2012
4,168
2,204
136
Pff, no offense, but you're wrong if you think Intel is targeting any sort of desktop or HTPC with this technology.

They do. The fact that this model is available only as soldered BGA should make it pretty obvious; it is aimed at OEMs building all-in-one systems or HTPCs.
What is new in the desktop space is a i7-4770R aimed at the All-In-One market
http://www.nitroware.net/previews/275-4thgencore-haswell-graphics

The intended market is mobile.

It isn't. The R models are BGA parts intended for the desktop. For mobile GT3e there are the i7-4950HQ and i7-4850HQ; the i7-4770R is for the desktop. Of course, BGA GT3e (or GT3 in general) was primarily designed with mobile in mind, and Intel has now decided to release it for the desktop as well; nobody claimed otherwise.
 

NTMBK

Lifer
Nov 14, 2011
10,264
5,116
136
Kind of surprising to see Intel target such a niche market, though. I suppose AIOs are one of the few segments of the consumer market that could actually be growing.

HTPC/SFF I can understand the purpose of, but AIOs don't really appeal to me. They look nice in a store, but they are underpowered, hard to repair, difficult to upgrade, and expensive. Pretty much all the disadvantages of a laptop without really being portable.

iMac, iMac, iMac. Apple would love to ditch NVidia parts in their iMac and just use integrated graphics to drive a "Retina" display.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The 4770R looks like roughly 2.9x the 3770K, with a GPU score in the 1900s, putting it above estimated Richland scores.

Personally, I find this hard to believe. This embedded cache must be amazing!

I think someone mentioned that the newest drivers raise the score by an additional 5%. It may be able to reach 2000.

With the GT3e on the 4770R, you are talking about a 2.5x increase in EU count, a 2x or greater increase in sampler (fillrate, basically) throughput, a 2x increase in the front-end, 7-8% higher clocks, and 3x the memory bandwidth.
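Compounding those multipliers gives rough ceilings depending on where a given workload bottlenecks. This is a sketch built only from the spec ratios listed above, not a prediction; real games land well below the ceiling for whichever unit limits them:

```python
# Rough per-bottleneck ceilings for the 4770R's GT3e vs the 3770K's
# HD 4000, built from the spec multipliers listed above.
eu_ratio        = 40 / 16   # 2.5x execution units
clock_ratio     = 1.075     # ~7-8% higher GPU clock (midpoint)
sampler_ratio   = 2.0       # sampler/fillrate throughput (2x or more)
bandwidth_ratio = 3.0       # eDRAM + DDR3 vs plain DDR3

print(f"Shader-bound ceiling:    {eu_ratio * clock_ratio:.2f}x")
print(f"Fillrate-bound ceiling:  {sampler_ratio * clock_ratio:.2f}x")
print(f"Bandwidth-bound ceiling: {bandwidth_ratio:.2f}x")
```

That spread (roughly 2.2x to 3x depending on the bottleneck) is consistent with the leaked ~2.9x 3DMark11 figure discussed earlier in the thread.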
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Any info on GPU composition of Haswell mobile Pentium and i3 lines?
 

mikk

Diamond Member
May 15, 2012
4,168
2,204
136
I think someone mentioned that the newest drivers raise the score by an additional 5%. It may be able to reach 2000.


That was compared to the 15.26 or 15.28 drivers. Intel already used the 15.31 driver for their 3DMark comparison.
 

Haserath

Senior member
Sep 12, 2010
793
1
81
So how will discrete graphics do at this rate?

AMD & Nvidia: 60-80% performance increase every 24 months.

Intel: 50-200% increase every 12-14 months.

Haswell could already be close to the HD 7750. What's interesting is that TSMC's 20nm might not even be available for graphics until after Broadwell. I think the deal with Apple will bring the cost down much faster, but it will delay the launch of new products on the new nodes for many others.
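Normalising those cadences to a per-year rate makes the difference in pace clearer. This uses the midpoint of each quoted range, so it is very rough:

```python
# Annualised growth factors implied by the cadences quoted above,
# using the midpoint of each quoted range.
def yearly_factor(gain, months):
    """Per-year growth factor from `gain` (0.7 = +70%) every `months`."""
    return (1.0 + gain) ** (12.0 / months)

amd_nvidia = yearly_factor(0.70, 24)  # midpoint of 60-80% per 24 months
intel      = yearly_factor(1.25, 13)  # midpoint of 50-200% per 12-14 mo
print(f"AMD/Nvidia: ~{amd_nvidia:.2f}x per year")
print(f"Intel:      ~{intel:.2f}x per year")
```

On these assumptions Intel's implied yearly multiple is well above AMD's and Nvidia's, which is exactly why the sustainability question below matters.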

Will Nvidia be the only discrete graphics maker in a few years due to their graphics being better subsidized by the professional market?
 

NTMBK

Lifer
Nov 14, 2011
10,264
5,116
136
So how will discrete graphics do at this rate?

AMD & Nvidia: 60-80% performance increase every 24 months.

Intel: 50-200% increase every 12-14 months.

Intel's rate of increase is impressive, but not sustainable. They improved the performance of GT3e massively over GT2, but only by increasing the die size massively and adding expensive eDRAM and an interposer. Ivy Bridge already used a large part of the die for graphics, and GT3 has ~2.5x that area.

It's a very impressive improvement, but it's more of a short-term readjustment than a long-term trend. Intel's graphics performance will probably settle into the same rate of improvement as AMDVidia (60-80% with each node shrink), although it will likely stay ahead of them thanks to Intel's process advantages.

Will Nvidia be the only discrete graphics maker in a few years due to their graphics being better subsidized by the professional market?

The Xeon Phi is ready to give Tesla a good kicking in the HPC market, so I wouldn't be too confident about NVidia's high-end prospects.