
Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand


ShintaiDK

Lifer
Apr 22, 2012
It was expected with the MacBook Pro. It's amazing to see the pace at which IGPs are killing off discrete GPUs in batches.
 

Sweepr

Diamond Member
May 12, 2006
It was expected with the MacBook Pro. It's amazing to see the pace at which IGPs are killing off discrete GPUs in batches.

It will only get worse with Broadwell (GT1, GT2, GT3 and GT4 SKUs). Have you guys seen this?



Is that Broadwell?
Broadwell GT4: 96 EUs (Gen 8), 2 Teraflops @ 1 GHz
In comparison Haswell GT3/GT3e (40 EUs Gen 7) @ 1.3GHz has 832 Gigaflops of shader performance.
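
For reference, peak FP32 throughput for these parts works out as EUs x FLOPs per EU per clock x clock speed. The 16 FLOPs/EU/clock figure is Haswell's (two SIMD-4 FPUs with multiply-add); carrying it over to Gen 8 is an assumption, so treat this as a back-of-the-envelope sketch:

```python
# Rough peak-FP32 arithmetic for Intel Gen graphics (sketch, not official figures).
# 16 FLOPs/EU/clock is Haswell's per-EU rate; assuming the same for Gen 8 here.
def peak_gflops(eus, clock_ghz, flops_per_eu_clock=16):
    return eus * flops_per_eu_clock * clock_ghz

print(peak_gflops(40, 1.3))   # Haswell GT3/GT3e: 832 GFLOPS, matching the figure above
print(peak_gflops(96, 1.0))   # 96 EUs at the quoted 1 GHz: ~1536 GFLOPS
print(peak_gflops(96, 1.3))   # ~2 TFLOPS (~1997 GFLOPS) would need a clock near 1.3 GHz
```

In other words, at the quoted 1 GHz a 96-EU part would land around 1.5 TFLOPS; the 2 TFLOPS rumour implies either a higher clock or more throughput per EU.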
 

ShintaiDK

Lifer
Apr 22, 2012
It's not fun being a discrete GPU maker, that's for sure.

The death of discrete GPUs might come sooner than some people think or hope. IGPs like Haswell and the upcoming Broadwell are simply pulling teeth out of the discrete makers, revenue-wise.
 

blackened23

Diamond Member
Jul 26, 2011
It's not fun being a discrete GPU maker, that's for sure.

The death of discrete GPUs might come sooner than some people think or hope. IGPs like Haswell and the upcoming Broadwell are simply pulling teeth out of the discrete makers, revenue-wise.

This is not something I'm looking forward to. As much as I think that Intel's current direction makes sense, the desktop PC user in me doesn't like this at all. I want Nvidia and ATI to thrive and continue to develop discrete graphics; I'd hate to see dGPUs become a dying breed. Ugh. Although I don't think they will outright die. If anything, the price of entry will probably go up.

As a mobile consumer, I like the increased battery life from the lack of a dGPU. On the other hand, as a desktop user, I'm not liking this at all. I want Nvidia and AMD to get their chips into mobile products, and I want them to continue developing discrete parts. Ugh.
 

ShintaiDK

Lifer
Apr 22, 2012
This is not something I'm looking forward to. As much as I think that Intel's current direction makes sense, the desktop PC user in me doesn't like this at all. I want Nvidia and ATI to thrive and continue to develop discrete graphics; I'd hate to see dGPUs become a dying breed. Ugh. Although I don't think they will outright die. If anything, the price of entry will probably go up.

Sooner or later you hit a wall revenue-wise, and from that point on it's not profitable to develop discrete GPUs. It's not coming tomorrow, but it is out there in the future.
 

blackened23

Diamond Member
Jul 26, 2011
Sooner or later you hit a wall revenue-wise, and from that point on it's not profitable to develop discrete GPUs. It's not coming tomorrow, but it is out there in the future.

Well, it's an eventuality but I think it will take more than 5 years to get there. Perhaps longer. In the meantime, let's just hope that the PC gaming market remains robust. Currently, it's still doing very well - let's face it, a desktop user playing at 1080p will probably buy discrete. While the iGPU is fantastic for mobile, it just doesn't translate into usability for 1080p gaming.

Either way, I see dGPU prices going up across the board because of this in coming years. Not cool.
 

crashtech

Lifer
Jan 4, 2013
Enthusiast GPUs will become a niche, just like our CPUs are already becoming. So those of us who can still afford the hobby will eventually be running server-derived hardware paired with very expensive GPUs that will likely be derived mostly from HPC parts, like the Titan.
 

ShintaiDK

Lifer
Apr 22, 2012
Enthusiast GPUs will become a niche, just like our CPUs are already becoming. So those of us who can still afford the hobby will eventually be running server-derived hardware paired with very expensive GPUs that will likely be derived mostly from HPC parts, like the Titan.

Unless that dies off to Phi and the like.
 

Sweepr

Diamond Member
May 12, 2006
I wonder how many dies Broadwell will have.
Dual-core or quad-core; GT1, GT2, GT3, GT4; with eDRAM & without eDRAM... Lots of options.
 

blackened23

Diamond Member
Jul 26, 2011
I wonder how many dies Broadwell will have.
Dual-core or quad-core; GT1, GT2, GT3, GT4; with eDRAM & without eDRAM... Lots of options.

If I had to guess, I'd say that there will be more variants than even the 4th generation offers. Now Intel isn't only making SKUs to differentiate CPU performance, but GPU performance as well... and Broadwell will have even *more* GPU variants.

Hopefully Intel will make the naming scheme as confusing as possible. ;)
 

sushiwarrior

Senior member
Mar 17, 2010
Unless that dies off to Phi and the like.

Phi is essentially incapable of rendering any kind of meaningful graphics. It's an x86 processor with lots of cores, not a GPU.

I think Intel can hurt Nvidia and AMD's bottom line with better IGPs, because as we can see now, $75-100 cards aren't looking like great value compared to APUs and better integrated graphics. I think this will lead to less of the low-end spam we see from AMD and Nvidia, with a more concrete focus on mid- to high-end cards. dGPUs are far from dead; the low-end ones just don't make as much sense now.
 
Mar 10, 2006
Phi is essentially incapable of rendering any kind of meaningful graphics. It's an x86 processor with lots of cores, not a GPU.

He was talking about HPC accelerators. Xeon Phi is going to make life much, much harder for Nvidia :/

I think Intel can hurt Nvidia and AMD's bottom line with better IGPs, because as we can see now, $75-100 cards aren't looking like great value compared to APUs and better integrated graphics. I think this will lead to less of the low-end spam we see from AMD and Nvidia, with a more concrete focus on mid- to high-end cards. dGPUs are far from dead; the low-end ones just don't make as much sense now.

Agreed. dGPUs will continue to be very profitable - say hello to price increases and a mix shift upward, though.
 

sushiwarrior

Senior member
Mar 17, 2010
He was talking about HPC accelerators. Xeon Phi is going to make life much, much harder for Nvidia :/

Well, Phi is a separate topic, but that very much remains to be seen. There is a lot yet to be proven about Phi.

Agreed. dGPUs will continue to be very profitable - say hello to price increases and a mix shift upward, though.

Well, I think it will just change the range of dGPUs, not necessarily the price. Of course, we could still have Titan 2.0s and other silly-expensive cards, but they would hopefully offer the performance to match their price tags, given a more concrete baseline for the bottom end.
 
Jun 8, 2013
Agreed. dGPUs will continue to be very profitable - say hello to price increases and a mix shift upward, though.

It has already started on the AMD side at retail with the 7000 series, which has no 7450, 7550, or 7650 cards. AMD uses its GPU tech in its APUs, though, so it won't be affected as badly. It is Nvidia that will be squeezed the most.
 

blackened23

Diamond Member
Jul 26, 2011
Agreed. dGPUs will continue to be very profitable - say hello to price increases and a mix shift upward, though.

Sure, but then again, HPC products are profitable as well. That doesn't make for great consumer-level products; this will likely put dGPUs out of reach of the average PC gamer, as the typical PC gamer spends $200-300 max on a GPU. If pricing trends upwards due to the iGPU squeeze, that will just shift users over to consoles. There aren't many gamers who are truly excited about spending $600-800 on a single GPU. I would rather Nvidia and AMD get mobile dGPU contracts to provide R&D funding for cheaper and better dGPUs. Sadly, it appears that won't happen; 5 years from now dGPUs will probably be very expensive.

Which is why I'm not looking forward to this at all, although it will take 2-3 years to really settle in.
 

blackened23

Diamond Member
Jul 26, 2011
Phi is essentially incapable of rendering any kind of meaningful graphics. It's an x86 processor with lots of cores, not a GPU.

I think Intel can hurt Nvidia and AMD's bottom line with better IGPs, because as we can see now, $75-100 cards aren't looking like great value compared to APUs and better integrated graphics. I think this will lead to less of the low-end spam we see from AMD and Nvidia, with a more concrete focus on mid- to high-end cards. dGPUs are far from dead; the low-end ones just don't make as much sense now.

The real problem is that Intel iGPUs eat into Nvidia/AMD mobile dGPU revenue. Mobile dGPU sales are arguably more important than desktop; it has been a very profitable segment in past years. This means that AMD/Nvidia get less revenue with which to fund desktop dGPU R&D, which means an eventual slowdown in technological progress and higher prices across the board. We don't want this happening, but it probably will in 5 years or so.
 

crashtech

Lifer
Jan 4, 2013
The real problem is that Intel iGPUs eat into Nvidia/AMD mobile dGPU revenue. Mobile dGPU sales are arguably more important than desktop; it has been a very profitable segment in past years. This means that AMD/Nvidia get less revenue with which to fund desktop dGPU R&D, which means an eventual slowdown in technological progress and higher prices across the board. We don't want this happening, but it probably will in 5 years or so.

This is why I'm guessing that future enthusiast dGPUs may be made with dual purpose silicon, fused one way for HPC and another for GPU. The HPC side will presumably help fund R&D.
 

mrmt

Diamond Member
Aug 18, 2012
This is why I'm guessing that future enthusiast dGPUs may be made with dual purpose silicon, fused one way for HPC and another for GPU. The HPC side will presumably help fund R&D.

HPC doesn't have the scale to fund anything. Nvidia's bread and butter in the professional market is in workstations, not in HPC; Tesla is just a blip. Given the decoupling of the consumer product line from the computing line, I'm really interested in seeing Nvidia's next moves.
 

Nothingness

Diamond Member
Jul 3, 2013
If I had to guess, I'd say that there will be more variants than even the 4th generation offers. Now Intel isn't only making SKUs to differentiate CPU performance, but GPU performance as well... and Broadwell will have even *more* GPU variants.

Hopefully Intel will make the naming scheme as confusing as possible. ;)
Add into the mix some castrated, er, segmented variants, plus the Apple models. I hope everyone is familiar with ark.intel.com; it'll be sorely needed when choosing one :biggrin:
 

erunion

Senior member
Jan 20, 2013
It will only get worse with Broadwell (GT1, GT2, GT3 and GT4 SKUs). Have you guys seen this?



Is that Broadwell?
Broadwell GT4: 96 EUs (Gen 8), 2 Teraflops @ 1 GHz
In comparison Haswell GT3/GT3e (40 EUs Gen 7) @ 1.3GHz has 832 Gigaflops of shader performance.

Wow. No wonder Intel made their eDRAM chip so big. More than doubling the EUs from HSW GT3.

Does this mean Broadwell GT2 will be 24 EUs or 48?
 

Sweepr

Diamond Member
May 12, 2006
Wow. No wonder Intel made their eDRAM chip so big. More than doubling the EUs from HSW GT3.

Does this mean Broadwell GT2 will be 24 EUs or 48?

Yes, I wonder what the size of this 96-EU GT4 GPU would be at 14nm (assuming it is Broadwell).
~2.5x the shader performance of Haswell GT3e (2 teraflops) would be insane for an IGP.
A 48-EU GT2 would be great, but I'm guessing 12, 24, 48 and 96 EUs for GT1/GT2/GT3/GT4.
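
A quick sanity check on the ~2.5x figure and the guessed line-up; this simply takes the rumoured 2 TFLOPS and the guessed EU counts at face value, so it's a sketch rather than confirmed SKUs:

```python
# Haswell GT3e reference point: 40 EUs x 16 FLOPs/EU/clock x 1.3 GHz.
HASWELL_GT3E_GFLOPS = 40 * 16 * 1.3        # 832 GFLOPS
RUMOURED_GT4_GFLOPS = 2000.0
print(RUMOURED_GT4_GFLOPS / HASWELL_GT3E_GFLOPS)   # ~2.4, i.e. the "~2.5x" above

# Clock-independent scaling of the guessed line-up relative to the 96-EU GT4 rumour.
guessed_eus = {"GT1": 12, "GT2": 24, "GT3": 48, "GT4": 96}
for sku, eus in guessed_eus.items():
    print(sku, eus, "EUs ->", round(RUMOURED_GT4_GFLOPS * eus / 96), "GFLOPS at GT4's clock")
```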
 

erunion

Senior member
Jan 20, 2013
24 EUs make the most sense. I don't think I understand the terminology in the slide. It says GT2 is the right half. Half of what? Oh, the right half of GT3, maybe?
 

crashtech

Lifer
Jan 4, 2013
I'm still trying to figure out if Iris Pro is gonna be on the desktop at all, hopefully in the form of a mini-ITX board with BGA i5 onboard. That would be excellent for the build I'm planning.
 

Sweepr

Diamond Member
May 12, 2006
24 EUs make the most sense. I don't think I understand the terminology in the slide. It says GT2 is the right half. Half of what? Oh, the right half of GT3, maybe?

I suppose so... and assuming we have 4 graphics sub-slices (24 EUs each), then GT1 would have 12 EUs (half a sub-slice). According to Anand, Haswell GT3 implements 4 graphics sub-slices (4x 10 EUs).
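
A small sketch of how the EU counts fall out of that sub-slice reading. The Haswell entries follow the 10-EU sub-slice description from the AnandTech article; the Broadwell column is purely the 24-EU-per-sub-slice guess above, with GT1 as half a sub-slice:

```python
# EU count = sub-slices x EUs per sub-slice (fractional sub-slices for the cut-down guess).
def eu_count(sub_slices, eus_per_sub_slice):
    return int(sub_slices * eus_per_sub_slice)

# Haswell per AnandTech: GT2 = 2 sub-slices of 10 EUs, GT3 = 4 sub-slices of 10 EUs.
haswell = {"GT2": eu_count(2, 10), "GT3": eu_count(4, 10)}

# Speculative Broadwell with 24-EU sub-slices (not confirmed).
broadwell_guess = {"GT1": eu_count(0.5, 24), "GT2": eu_count(1, 24),
                   "GT3": eu_count(2, 24), "GT4": eu_count(4, 24)}

print(haswell)          # {'GT2': 20, 'GT3': 40}
print(broadwell_guess)  # {'GT1': 12, 'GT2': 24, 'GT3': 48, 'GT4': 96}
```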