
Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand


blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
There is only one problem:
Intel will lose to the ARM SoCs. All of them will become powerful enough for casuals. So instead of buying a Haswell CPU, they'll go and buy a product with Tegra X.
If there is no reason to buy discrete GPUs, then there is no reason to buy Haswell.


I was just thinking this type of response was on its way. Of course Tegra 4 is better than Haswell. Of course. Only from you, sontin.
 
Last edited:

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
There is only one problem:
Intel will lose to the ARM SoCs. All of them will become powerful enough for casuals. So instead of buying a Haswell CPU, they'll go and buy a product with Tegra X.
If there is no reason to buy discrete GPUs, then there is no reason to buy Haswell.
I feel like "good enough for casuals" is a moving goalpost though. I mean, try running any modern app on the original iPhone. It would crawl. There's a reason why games started out looking like PapiJump and were only able to look like Infinity Blade 2, NOVA 3, or Real Racing 3 recently (and even then, NOVA 3 pushes the phone so hard that it cuts the battery life down significantly. My 4S actually heats up with extended play beyond what my body heat might contribute. The phone basically turbos the whole time).
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
There is only one problem:
Intel will lose to the ARM SoCs. All of them will become powerful enough for casuals. So instead of buying a Haswell CPU, they'll go and buy a product with Tegra X.
If there is no reason to buy discrete GPUs, then there is no reason to buy Haswell.
They can buy an Atom instead and maintain full backwards compatibility with all their software.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
The same slippery slope can happen in discrete GPUs. Once low-cost iGPU/APU products hit critical volumes and manage to revenue-starve the R&D engines for future discrete GPU products, the game will be over for them, as their amortized costs will spiral out of control as the volumes sold get lower and lower.

It already happened.

If we look at the volumes, they are smaller than they were years ago. Nvidia and AMD have lengthened the release cycles for GPU architectures (24+ months from now on), AMD had to resort to a GPGPU-optimized architecture in order to dilute the costs with the CPU R&D, and dGPU prices are going up. Everything points to the kind of R&D squeeze you and Phynaz described.
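
A rough back-of-the-envelope sketch of the amortization argument, with entirely made-up R&D and volume figures (neither number comes from the thread or from any vendor):

```python
# Illustrative only: how the fixed R&D cost carried by each card grows as
# dGPU volumes shrink. All figures are hypothetical placeholders, not
# actual AMD/Nvidia numbers.

rd_cost = 500_000_000  # assumed R&D spend for one dGPU generation, in dollars

for units in (40_000_000, 20_000_000, 10_000_000, 5_000_000):
    per_unit = rd_cost / units
    print(f"{units:>12,} units -> ${per_unit:,.2f} of R&D to recover per card")
```

Halving the volume doubles the R&D burden each card has to carry, which is the squeeze being described.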

If anything, Maxwell and GCN 2 are the last hurrah of the mainstream dGPU market. They are going niche after that.
 
Last edited:

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
The same slippery slope can happen in discrete GPUs. Once low-cost iGPU/APU products hit critical volumes and manage to revenue-starve the R&D engines for future discrete GPU products, the game will be over for them, as their amortized costs will spiral out of control as the volumes sold get lower and lower.

AMD's discrete GPUs share the same tech/designs as their integrated GPUs. As long as AMD is selling APUs, they'll have tech they can turn into discrete cards with relatively little R&D. Of course, that also means Intel could someday turn their IGPs into discrete GPUs if they really wanted to.

In fact, the day when low-end discrete GPUs die out is the day that AMD's APUs suddenly become incredibly attractive, if they can maintain a graphics advantage over Intel (not as sure a thing now as it was a few years ago; AMD is nearly 3 years behind on their original roadmaps for CPUs/APUs, a 2-generation slip when they were already half a generation behind).
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
AMD's discrete GPUs share the same tech/designs as their integrated GPUs. As long as AMD is selling APUs, they'll have tech they can turn into discrete cards with relatively little R&D. Of course, that also means Intel could someday turn their IGPs into discrete GPUs if they really wanted to.

Discrete still has some life left in it, as long as there is a performance discrepancy compared to iGPUs and a large installed base of desktop users. The only unfortunate thing is that there are fewer gateways for new purchasers of discrete GPUs, as an increasing number of consumers are buying tablets instead of desktop PCs as their gateway or first computing device. In prior years, gamers would upgrade their Dell or HP PCs with a discrete card for gaming capability. There are far fewer of these types of users these days.

There's a large user base right now, so I think discrete is sustainable for the next 5 years or so. That assumes the performance disparity remains -- if Intel keeps making the incredible gains that they are, the day when discrete is obsolete may arrive sooner. But it will happen either way IMO. Maybe not in 5 years, but eventually. Perhaps sooner if Intel pushes hard enough.
 
Last edited:

lagokc

Senior member
Mar 27, 2013
808
1
41
lagokc, the analogy you need to conceptualize here is one that Phynaz has direct and personal experience with...namely the utter slaughter that low-cost, (relatively) low-performing x86 CPUs from Intel inflicted on the big-iron processor segment in the '90s.

x86 was like a wind-blown wildfire in that market demographic, and it left every legacy big-iron provider in bankruptcy or irrelevance (other than IBM and Sun, although one could argue neither is all that relevant now).

Now, the point here is not that x86 was special or made magically disruptive market moves, because it wasn't. x86 was simply disruptive enough to destabilize the tenuous economics of the existing big-iron market segment...and once it tipped that balance past the point where revenue could no longer fund the R&D needed to keep low-volume big-iron CPUs ahead of the performance advances of low-cost, high-volume x86 CPUs, the big-iron market as it was then known was dead.

The same slippery slope can happen in discrete GPUs. Once low-cost iGPU/APU products hit critical volumes and manage to revenue-starve the R&D engines for future discrete GPU products, the game will be over for them, as their amortized costs will spiral out of control as the volumes sold get lower and lower.

It is a tale that is not unique to x86; it happens the world over, in industry and nation alike. See what cheap labor in Asia and open markets via the World Trade Organization did to manufacturing in Europe and North America. It is a rather generic and universal phenomenon.

The difference is that those big-iron CPUs required their own R&D. In the case of iGPU/standalone, any research in one inherently improves the other.

As long as AMD makes APUs, their APU graphics component is their GPU research. The only difference is that one has fewer stream processors and is built into a CPU, while the other has a lot more stream processors and its own memory controller (granted, in AMD's case they have different manufacturing processes to worry about for each, but that's unique to them). Nvidia is in the same situation once Tegra gets Kepler.

Any company that makes a DX11 iGPU has the technology to make a PCIe GPU if it wanted to, but no one does, because they know it would be too difficult to break into a market dominated by Nvidia and AMD. If those two failed to continue, another company would step in to fill the gap.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The difference is that those big-iron CPUs required their own R&D. In the case of iGPU/standalone, any research in one inherently improves the other.

As long as AMD makes APUs, their APU graphics component is their GPU research. The only difference is that one has fewer stream processors and is built into a CPU, while the other has a lot more stream processors and its own memory controller (granted, in AMD's case they have different manufacturing processes to worry about for each, but that's unique to them). Nvidia is in the same situation once Tegra gets Kepler.

Any company that makes a DX11 iGPU has the technology to make a PCIe GPU if it wanted to, but no one does, because they know it would be too difficult to break into a market dominated by Nvidia and AMD. If those two failed to continue, another company would step in to fill the gap.

But will there be an appreciable installed user base of desktop PCs 10 years from now to keep this model sustainable? It's questionable. New purchasers don't really buy desktops as their first computing device. I got my start with a store-bought PC from Best Buy many, many years ago -- eventually, I moved into the DIY world. I'm guessing that most of us entered the world of PCs in a similar fashion. Things just don't work like that anymore for new computer users.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I suspect one of the reasons Intel cancelled Larrabee (the graphics-oriented MIC) and repositioned it as an HPC product is that they didn't see themselves getting much revenue from making discrete graphics parts.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Discrete still has some life left in it, as long as there is a performance discrepancy compared to iGPUs and a large installed base of desktop users. The only unfortunate thing is that there are fewer gateways for new purchasers of discrete GPUs, as an increasing number of consumers are buying tablets instead of desktop PCs as their gateway or first computing device. In prior years, gamers would upgrade their Dell or HP PCs with a discrete card for gaming capability. There are far fewer of these types of users these days.

There's a large user base right now, so I think discrete is sustainable for the next 5 years or so. That assumes the performance disparity remains -- if Intel keeps making the incredible gains that they are, the day when discrete is obsolete may arrive sooner. But it will happen either way IMO. Maybe not in 5 years, but eventually. Perhaps sooner if Intel pushes hard enough.

Desktops are already a small market; most PC gamers are likely on laptops. Those can have discrete graphics, but generally not upgradeable ones.

But the discrete card is almost assuredly dead for anything under $200. The 7790 is roughly on par with a 7850 (basically the GPU in the PS4) and has a die size of 160mm^2. A Core i3 Ivy Bridge has a die size of 94mm^2, and that already includes a GPU. The (recent historical) upper range of Intel processors goes up to 300mm^2 for under $300, so Intel could put together a good-enough CPU and, in theory, a GPU on par with anything under $200.

Trinity comes in at 246mm^2 but has a substantial GPU already, probably about equal in size to a Cape Verde chip. On the downside, even discounting memory speed, Cape Verde has way more shaders and a higher clock speed than even the desktop HD 7660D.
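
A minimal sketch of the die-area arithmetic above, using the sizes quoted in this post and a purely assumed ~300mm^2 budget for a mainstream-priced chip (this ignores memory bandwidth and power entirely):

```python
# Sketch of the die-budget argument: could a CPU plus a sub-$200-class GPU
# fit in one mainstream die? Die sizes are the ones quoted in the post; the
# 300 mm^2 budget is an assumption for illustration.

die_budget_mm2 = 300   # assumed upper bound for a sub-$300 mainstream chip
core_i3_mm2    = 94    # Ivy Bridge Core i3, iGPU already included
hd7790_mm2     = 160   # Bonaire-class dGPU silicon
trinity_mm2    = 246   # AMD Trinity APU, for comparison

combined = core_i3_mm2 + hd7790_mm2
verdict = "fits within" if combined <= die_budget_mm2 else "exceeds"
print(f"i3 + 7790-class GPU: {combined} mm^2, which {verdict} the {die_budget_mm2} mm^2 budget")
print(f"Trinity, for reference: {trinity_mm2} mm^2")
```

The catch, as the reply below points out, is power rather than raw area.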
 
Aug 11, 2008
10,451
642
126
Desktops are already a small market; most PC gamers are likely on laptops. Those can have discrete graphics, but generally not upgradeable ones.

But the discrete card is almost assuredly dead for anything under $200. The 7790 is roughly on par with a 7850 (basically the GPU in the PS4) and has a die size of 160mm^2. A Core i3 Ivy Bridge has a die size of 94mm^2, and that already includes a GPU. The (recent historical) upper range of Intel processors goes up to 300mm^2 for under $300, so Intel could put together a good-enough CPU and, in theory, a GPU on par with anything under $200.

Trinity comes in at 246mm^2 but has a substantial GPU already, probably about equal in size to a Cape Verde chip. On the downside, even discounting memory speed, Cape Verde has way more shaders and a higher clock speed than even the desktop HD 7660D.

Look at any Microcenter ad or Newegg. There are plenty of pretty competent dGPUs under 200 dollars. For 200 dollars you can get an HD 7850. That alone has a TDP of over 100 watts. The most power-efficient quad-core CPUs are the i5s, at 77 watts.

Do you really think you can combine a 100+ watt GPU onto the same die as a 77 watt CPU?
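
The power side of the same argument, as a rough sketch using the wattages quoted above; the 125 W "socket budget" is an assumption for illustration, not a real platform limit:

```python
# Rough sum of the TDPs quoted in the post versus an assumed ceiling for a
# single mainstream desktop socket/cooler. Note the 100+ W figure is the
# board TDP (it includes memory and VRM losses), as the reply below asks.

cpu_tdp_w       = 77    # quad-core i5, as quoted
dgpu_tdp_w      = 100   # HD 7850-class card, as quoted ("over 100 watts")
socket_budget_w = 125   # assumed single-socket thermal budget

combined = cpu_tdp_w + dgpu_tdp_w
print(f"Combined: {combined} W vs an assumed {socket_budget_w} W budget "
      f"-> over by {combined - socket_budget_w} W before any binning or downclocking")
```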
 

erunion

Senior member
Jan 20, 2013
765
0
0
As long as AMD makes APUs, their APU graphics component is their GPU research. The only difference is that one has fewer stream processors and is built into a CPU, while the other has a lot more stream processors and its own memory controller (granted, in AMD's case they have different manufacturing processes to worry about for each, but that's unique to them). Nvidia is in the same situation once Tegra gets Kepler.

AMD's and Nvidia's CPU/SoC divisions lose money, and that's with their GPUs being subsidized by discrete.

How could AMD's and Nvidia's CPUs possibly fund anything when they aren't making money?
 

erunion

Senior member
Jan 20, 2013
765
0
0
Look at any Microcenter ad or Newegg. There are plenty of pretty competent dGPUs under 200 dollars. For 200 dollars you can get an HD 7850. That alone has a TDP of over 100 watts. The most power-efficient quad-core CPUs are the i5s, at 77 watts.

Do you really think you can combine a 100+ watt GPU onto the same die as a 77 watt CPU?

Isn't that TDP for the whole card?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Isn't that TDP for the whole card?

Probably. That's one advantage of an iGPU. Opportunistic sharing of resources means less power can be used. Also, integration brings another key power advantage: circuitry is physically much closer together, which cuts the power required to move data around. You also don't need redundant circuitry for some functions.

Of course, there are downsides, like performance. But there are techniques that can be used in iGPUs that are yet to come, like a much lower-latency, higher-bandwidth connection to the CPU, or sharing of memory via Intel's Instant Access and AMD's counterpart, hUMA. The latter, console developers say, is an inherent advantage over PC setups.

So there may be a day when, with equal setups and die size, the iGPU has some advantages. Of course, the fastest dGPUs will win due to the sheer number of transistors in a larger die.
 
Last edited:

MightyMalus

Senior member
Jan 3, 2013
292
0
0
Now, the point here is not that x86 was special or made magically disruptive market moves, because it wasn't. x86 was simply disruptive enough to destabilize the tenuous economics of the existing big-iron market segment...and once it tipped that balance past the point where revenue could no longer fund the R&D needed to keep low-volume big-iron CPUs ahead of the performance advances of low-cost, high-volume x86 CPUs, the big-iron market as it was then known was dead.

As pointed out, it's happening with ARM.

There is only one problem: Intel will lose to the ARM SoCs. All of them will become powerful enough for casuals. So instead of buying a Haswell CPU, they'll go and buy a product with Tegra X. If there is no reason to buy discrete GPUs, then there is no reason to buy Haswell.

ARM already has won for the casuals. But what's really stopping ARM is software that can be used for work (and you can already use ARM for some work, even more so now with the whole web as the environment for tools). And gaming, yes, but that's improving, and fast.

But dGPUs will always be needed IF the software is made to use them. And software will always use them. But they will be niche, like always. Remember Doom 3? Crysis? Game developers sell cards; NVIDIA and AMD need to make software to show that there IS value in buying a dGPU.
 

erunion

Senior member
Jan 20, 2013
765
0
0
ARM already has won for the casuals.

ARMH has won nothing.

They existed in the ultra-low-power space that no one else wanted, then moved into smartphones virtually unchallenged. They've conquered nothing.

Now that the smartphone market is getting bigger, competition is moving in, and ARMH has zero track record.
 
Mar 10, 2006
11,715
2,012
126
ARMH has won nothing.

They existed in the ultra-low-power space that no one else wanted, then moved into smartphones virtually unchallenged. They've conquered nothing.

Now that the smartphone market is getting bigger, competition is moving in, and ARMH has zero track record.

Bingo.
 

MightyMalus

Senior member
Jan 3, 2013
292
0
0
ARMH has won nothing. They existed in the ultra-low-power space that no one else wanted, then moved into smartphones virtually unchallenged. They've conquered nothing. Now that the smartphone market is getting bigger, competition is moving in, and ARMH has zero track record.

True, they won nothing. They OWN it. And now Intel, AMD, and NVIDIA are desperate to get a hold of it. And I don't see that realistically happening in 2013, and probably not in 2014 either. And by then, ARM will have something they don't: quick improvements and a bunch of apps that the new generation of millions upon millions are used to.

ARM will dominate because it is inexpensive and everywhere. x86 is not and will not be.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
True, they won nothing. They OWN it. And now Intel, AMD, and NVIDIA are desperate to get a hold of it. And I don't see that realistically happening in 2013, and probably not in 2014 either. And by then, ARM will have something they don't: quick improvements and a bunch of apps that the new generation of millions upon millions are used to.

ARM will dominate because it is inexpensive and everywhere. x86 is not and will not be.

"Never will be" Okay we'll re-visit this in a year. The only advantage ARM SOCs have is efficiency, and intel hasn't made headway in this space because their efficiency wasn't very good, and the only part WITH good efficiency was pretty pathetic (eg Atom). Those issues are largely fixed with their upcoming products, and Haswell/silvermont will make a small dent - while Broadwell will make a huge dent.

This is my opinion and I could be wrong. Everything depends on intel's pricing, and they KNOW what they need to do. Trust me, intel is not stupid.

One thing i've learned from computing is that stating "never" are famous last words. Of course anything can happen, yet when you consider the sole reason for intel's problems in the tablet space were efficiency related - consider that Haswell ULV matches ARM SOCs, while Broadwell will obliterate ARM SOCs in efficiency. This is ALL WHILE performing much, much better. We can take examples like the upcoming Tegra 4 with 96Gflops of performance. That is, quite frankly, god-awful. Intel can easily perform worlds better than even the best ARM SOCs while having the same or better efficiency with their newest products. So the only advantage ARM SOCs had is gone, while intel SOCs perform in the order of 10-20x better. I hope you can see the ramifications of this. Never say never. Those are famous last words. Intel will have products in 2014 that beat ARM SOCs by every metric imaginable, and intel knows what they have to do. Price them to destroy ARM SOCs. If intel wants to do it, and they're not stupid - they will.

Stating "never" is just outright arrogance which has earned some silicon companies irrelevancy status. I wouldn't doubt intel and what they can do by throwing tons of R+D cash at a problem - intel has more R+D cash than anyone else in the business. "Intel will never catch out Athlon x64 architecture". It happened, and AMD is still paying for their hubris and laziness. Certainly, I can be wrong about all of this but I really do think intel will put up a serious fight especially in 2014.

The key points here are that intel knows what needs to be done, and they now have architectures that beat ARM SOCs in every metric. As soon as the price is right (and intel knows they must do this), that can certainly shake things up.
 
Last edited:

MightyMalus

Senior member
Jan 3, 2013
292
0
0
"Never will be" Okay we'll re-visit this in a year. The only advantage ARM SOCs have is efficiency, and intel hasn't made headway in this space because their efficiency wasn't very good, and the only part WITH good efficiency was pretty pathetic (eg Atom). Those issues are largely fixed with their upcoming products, and Haswell/silvermont will make a small dent - while Broadwell will make a huge dent. This is my opinion and I could be wrong. Everything depends on intel's pricing, and they KNOW what they need to do. Trust me, intel is not stupid. One thing i've learned from computing is that stating "never" are famous last words. Of course anything can happen, yet when you consider the sole reason for intel's problems in the tablet space were efficiency related - consider that Haswell ULV matches ARM SOCs, while Broadwell will obliterate ARM SOCs in efficiency. This is ALL WHILE performing much, much better. So the only advantage ARM SOCs had is gone, while intel SOCs perform in the order of 10-20x better. I hope you can see the ramifications of this. Never say never. Those are famous last words.

I'll reply backwards. I actually never said "never".

But I will say this: do you seriously think all tablet companies, all TV companies, and all phone companies will support x86? Or better yet, do you think Intel would allow the use of x86 by everyone? That won't happen.

ARM has an advantage. A big advantage. And it's not performance. Every device has its usage, and ARM is more than enough for mass consumption. x86 is used for work, high-end gaming, compute, and things that consumers don't care about.

Apple will not use x86, Qualcomm will not use x86 and NVIDIA will not use x86.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I'll reply backwards. I actually never said "never".

But I will say this: do you seriously think all tablet companies, all TV companies, and all phone companies will support x86? Or better yet, do you think Intel would allow the use of x86 by everyone? That won't happen.

ARM has an advantage. A big advantage. And it's not performance. Every device has its usage, and ARM is more than enough for mass consumption. x86 is used for work, high-end gaming, compute, and things that consumers don't care about.

Apple will not use x86, Qualcomm will not use x86 and NVIDIA will not use x86.

Nobody cares about the instruction set. You're in, like, 1999 mode if you're primarily concerned about the instruction set. As far as real consumers go, all they care about is the operating system and experience. Actually, let me correct that: they only care about the experience. FYI, Intel can now run Android and iOS like everything else can. And to state "NVIDIA will not use x86": no joke, Nvidia won't use x86. Intel denied them an x86 license when they practically begged for it (and went to court over it).

I don't think you quite understand how meaningless the underlying instruction set is. Intel chips run Android just like ARM SoCs do. Intel SoCs can do anything ARM SoCs do. They can run any OS that ARM SoCs do. So your comment being focused on the instruction set is absolutely ridiculous. Nobody cares about that. Like I said, Intel will have better performance and efficiency and the same capability as all ARM SoCs. If the price is right, Intel will disrupt that entire market.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
AMD HD 7750/7770 die size = 123mm^2 on the 28nm process.
Prices start at $110/$150.

Intel Core i7-4770R = ~200-220mm^2 on the 22nm process.
Price starts at $350+.

You can buy a 4570K + HD 7770 and still have double the performance at the same price as the 4770R. Why do people believe that $100 discrete GPUs will be phased out? And we haven't even touched the 20nm discrete GPUs that are coming in 2014. Oh, and don't forget that the HD 7770 was released in February 2012, 18+ months before the Core i7-4770R.
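
A quick sketch of the price/performance comparison being made here, with the prices quoted in this post; the $230 CPU price is an assumed placeholder and the 2x GPU-performance factor is the post's own claim, not a benchmark result:

```python
# Price comparison from the post: Core i7-4770R (Iris Pro) versus a cheaper
# quad-core CPU plus an HD 7770. The CPU price is an assumed placeholder and
# the 2x factor is the post's claim, used only to illustrate the ratio.

iris_pro_price  = 350   # Core i7-4770R, as quoted ("$350+")
combo_cpu_price = 230   # assumed price for the quad-core CPU in the combo
hd7770_price    = 110   # as quoted
gpu_perf_factor = 2.0   # the post's claim: roughly double the GPU performance

combo_price = combo_cpu_price + hd7770_price
print(f"Combo: ${combo_price} for ~{gpu_perf_factor:.0f}x the GPU performance "
      f"of the ${iris_pro_price}+ Iris Pro part")
```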

iGPUs will never catch discrete GPUs in performance (at the time of release), simply because discrete GPUs are not standing still; they evolve as well.
 

NTMBK

Lifer
Nov 14, 2011
10,456
5,843
136
I don't get why people keep forecasting the death of discrete... the bar gets shifted, not chopped.

It's a gradual shift. Take a look at what happened in laptops. First, with Sandy Bridge, Intel made integrated graphics good enough to kill the market for lower-powered laptop graphics parts. Then they improved again with Ivy Bridge, and again with Haswell, and now even pretty strong midrange graphics parts like the GT 650M are becoming irrelevant. Broadwell is coming with another process shrink next year, and you can bet that they'll use a lot of those new transistors to bulk out the graphics again. What's next to go? The 660M? And all of a sudden, Nvidia is left with a market for only the Alienware-level parts -- a tiny fraction of the market.

How many more chunks of the market can Nvidia stand to lose to Intel before their market share is so small that they can't fund their R&D anymore?