Do you think Intel could destroy AMD in making video cards?


exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Could they do it? Sure.
Will they make enough money to justify the cost and/or opportunity cost for something else? Probably not, or they would have done it already.
 

Imperium

Junior Member
Sep 15, 2014
5
0
16
Intel can beat everybody, they just allow others to survive


Pretty much true. It has reached a point where if Intel wanted to really do discrete graphics, it is simply easier to buy Nvidia or AMD out of petty cash. Nvidia is tempting, as it complements Intel. AMD might get regulatory flak, unless AMD sells ATI alone for cash to fix its financial problems.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
Pretty much true. It has reached a point where if Intel wanted to really do discrete graphics, it is simply easier to buy Nvidia or AMD out of petty cash. Nvidia is tempting, as it complements Intel. AMD might get regulatory flak, unless AMD sells ATI alone for cash to fix its financial problems.

No, I do not think it is that cut and dried. When you look at how poorly their integrated graphics have performed, they are either not cut out for it or simply don't care.

Also, it is very hard to be a third wheel. Look at the console market: it has always ended up with the top two doing very well and the third looking like a land-bound fish.

And while there is merit to saying APUs are the next big thing and will tie up the mainstream market and make billions, the PC gamer market isn't like that. Non-gamers will still buy cheap computers with underpowered CPUs and APUs, and PC gamers will seek out upgradeable and more powerful solutions. Just because you make a great-performing APU does not mean people will buy it.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Close?

A GTX 770 or R9 285 class card is now mid-range, or even towards low-end, due to prices at <$200.

How can you possibly say those cards are close to low-end? Price is not indicative of the level of performance one gets. If it were, something like a Titan Z would be many times faster than a 780.

Going from a 770 to the fastest single GPU out there only gets you about twice the performance at most.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Because it doesn't make sense. Same reason why AMD's own APUs lose in cost/performance to its own CPU+GPU options.

Iris Pro is for situations where low power consumption and small size are the key.
Laptops, bro. If it can handle games out right now at native res (laptop res) and medium settings, I would get one for sure.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
No need. Intel, along with AMD and NV, needs to focus on HSA along with what they are already good at. There's room for two, and maybe all three. Right now I have this awesome Titan and an awesome 5960X in my system, but they can't work together without the massive slowdown of transferring data between the two. HSA is going to make everything better.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,597
6,076
136
That's a very good insight. Thanks for your smart input! And checking your claims, you are right: the mobile/handheld market is booming compared to discrete GPUs!

And will only continue to grow.
 

DA CPU WIZARD

Member
Aug 26, 2013
117
7
81
APUs are the future. Once they can run games at 1080p on medium settings, they will be mainstream for sure = market share.

Most APUs can run games at low settings in 1080p now, right?


The APU is designed to nullify the ultra-low-end video card market. The average mainstream user hasn't needed video cards for years now. The only people who benefit from APUs are those with ultra-low budgets or those who only play outdated games such as WoW.

By the time APUs are advanced enough to play current modern games at 1080p medium settings at 60 FPS, 1080p will have been replaced by 1440p / 4K, and games will have advanced so much that "medium" is the equivalent of today's "high" graphical settings. Essentially, APUs will never catch up to an acceptable level for modern gamers, but they are beneficial for the mainstream market.
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
Larrabee 2.0 could be interesting, especially if Intel can consistently maintain a two-node process advantage over AMD and Nvidia. Imagine a giant 600mm² GPU on 14nm, their current IGP tech blown up, with high memory bandwidth and texture fill... I mean, I'm sure it would be at least as good as a GTX 770.

I see what you did there, lol
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
Pretty much true. It has reached a point where if Intel wanted to really do discrete graphics, it is simply easier to buy Nvidia or AMD out of petty cash. Nvidia is tempting, as it complements Intel. AMD might get regulatory flak, unless AMD sells ATI alone for cash to fix its financial problems.

Maybe Intel buys ATI and Nvidia buys AMD?

That wouldn't be too bad. Both Intel and Nvidia could improve on what they would be buying.

Now with APUs in the middle though, it would be a mess to defuse the Fusion!
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
AMD can't be bought out and have the CPU IP license stay in place, so Nvidia would be buying IP it can't use. The graphics IP can't be separated from the x86 stuff at this point anyway. Intel should just buy AMD; it makes a whole lot of sense: Intel wouldn't have to worry anymore about licensing AMD64, and it would get all the GPU stuff. Imagine a Radeon built on Intel's fab process, I'd buy it.
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
They already tried and failed. That experiment, though, led to them creating better internal GPUs.

I think entry would be too costly for what the profit is. If the annual profit pie is, say, 2 billion, it's currently divided between AMD and Nvidia; if Intel comes in it would be divided three ways, with Nvidia likely still highest, then AMD, and Intel likely to hold a small 5-10% share of the market.

Not worth enough.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Intel's iGPUs are actually pretty good. If they chose to scale them upwards onto their own discrete GPU die, I'd wager they'd do very well, a whole heck of a lot better than Larrabee did. That is from a performance perspective. Now, Intel is all about 50% or better margins, so who knows but them whether they'd be able to make that happen.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Reopening thread. Posts here need to be focused on the discussion topic, and not on attacking one another. It won't be a warning the next time I see it.
-- stahlhart
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
We had a very similar thread about this not too long ago.

Although the financial reasoning sounds pretty sound, it wasn't very long ago that Intel fully intended to enter the dGPU market. Not only was the intent there, Intel actually invested heavily in what would end up being a massive blunder. The truth is that Intel had real drive and planned to take the dGPU market all for themselves.

But what went wrong?

There is truth to the IP roadblocks some people have brought up. But surely Intel has enough talented engineers that they should be able to outclass everyone else? Well, it is much more complicated.

There were some major issues that made building a graphics chip without using existing IP almost impossible. The graphics market was built over time, and it was built along with DX. HW was built around DX, which makes creating a whole new graphics design without infringing very, very tricky. This HW must work inside very established boundaries...
or so you would think.

This brings me to the biggest fault of the Larrabee project: ambition that exceeded rationality. Intel decided they would completely and radically convert the entire graphics industry using a 100% programmable GPU built from x86 cores. If they had actually succeeded with this insane plan, they would have cut Nvidia out forever. Nvidia had no access to x86.

It was a command from higher management: build a GPU from x86 cores. This GPU would work unlike anything ever seen in the graphics space, but that wouldn't stop Intel. They lobbied hard to sway the entire industry towards this new direction. Not only was this a wildly ambitious and impractical plan, the truth is that GPUs built from x86 cores were not well suited to the existing GPU workload at all. Sure, they were able to show off high FLOPS and even ray tracing. They even compiled a game (Quake, I think) that could run on a Larrabee chip. I mean, the performance in certain tasks was good. It's just that these chips made really, really bad GPUs, as in the GPU as we knew it. Emulating everything with x86 was extremely slow and ineffective. They never became DX or OpenGL compliant. This is the paradox of Larrabee.

It was a very bad idea from the start. A radical shift is one thing, but this shift didn't even look like it had any real payoff. You cannot expect the entire industry to drop everything it was doing and head into the unknown, especially given the progress that had happened since the dawn of fixed-function HW. Larrabee was not the future of gaming; it wasn't even the future of Intel graphics. On that, Intel management had a complete and total change of heart.

It should be no secret that Larrabee grew out of a huge falling-out between Intel and Nvidia, one that was bitter and fierce. This fight started behind closed doors, but I believe the rumors of Intel looking to buy Nvidia are where it all started. There were very early rumors leaking out before AMD bought ATI. In this very thread people were posting about AMD's purchase of ATI and whether it was the right move. But if we look at the back stories, things look a lot different. Many might remember the rumors of AMD looking to buy Nvidia, supposedly well before AMD bought ATI. AMD actually got nowhere with Nvidia, but most people don't realize that Intel was looking into these companies after AMD had been trying to work out a deal. This could be how ATI managed to get such a hefty sum out of the deal. AMD was afraid they would lose their planned edge and went all out to make it happen. AMD bought ATI in 2006. Most people don't realize that this purchase had an immediate effect on Intel/ATI dealings pertaining to chipsets and IGPs (Intel announced SiS IGP chipsets).

It wasn't too long after that that Intel set out to stop Nvidia from making chipsets and IGPs for its chips. Nvidia was cut out. I believe the relationship was under strain because of Intel's attempt to strong-arm Nvidia. What would Intel want with Nvidia? Because at the time they took AMD's Fusion vision very, very seriously. So in the middle of all this drama, Larrabee became a top priority and a real headliner in the media. There was a real battle happening, a power struggle... or better, Intel threw a power fit.

But Larrabee became "laughabee," and Intel upper management had a real change of heart. They were forced into a cross-licensing deal with Nvidia, and that deal gives them the specialized IP and technology found inside Intel's integrated graphics chips today. After the deal with Nvidia, Intel was able to put a GPU on the same die as the CPU much faster than even AMD could. Intel rushed to get ahead, not realizing that AMD was actually well behind them. Nonetheless, Intel beat AMD to the punch, and AMD's Fusion never became the huge advantage it could have been.

Often the back story is very important in conversations about technology. Now, one thing should be clear even if this version of history is my own perception as I remember it: with all their wealth from CPUs, Intel did go after the dGPU market in the past. It was an expensive effort that failed. So this market was worth it to them at one point; they wanted to corner it.

Intel has since created a very efficient IGP. It is very respectable. But many don't seem to realize that Intel is paying Nvidia a very hefty sum for IP and technologies. The fine details are not public, but at the time of the agreement Nvidia was into very little besides GPUs. After Intel failed with Larrabee, this license agreement with Nvidia emerged. Then, after the cross-license agreement, Intel IGPs drastically improved. I do not see how people cannot accept that these things are connected; personally it is very, very clear to me. Nvidia has spoken some on this cross-licensing and has verified that although Intel's IGP isn't an Nvidia design, Nvidia technologies exist inside Intel IGPs such as the one in Sandy Bridge. We also know that as part of this deal, Nvidia is not allowed to make a CPU that can run x86 code. This is important because, if you know how the Denver CPU works, it could have been used for x86 the way it is used for ARM today.

I remember when the sum of the Intel/Nvidia cross-license deal went public, people were saying that AMD should have fought for a better deal. But AMD agreed to settle an antitrust lawsuit, while Nvidia and Intel signed a large cross-license agreement. Nvidia of course dropped its lawsuit as well, but the two are different scenarios.

The Nvidia/Intel cross-licensing totally prevents Nvidia from pursuing an x86-capable CPU; does anyone think Nvidia would be so dumb as to not have provisions that keep Intel from making dGPUs with its technologies? Both of these companies are very arrogant and fierce. Nvidia protects its technologies and holds them very, very close to its heart. The Intel IGP as we see it today will not be made into a dGPU; there is almost no chance of this happening.

I just want to end with something I think both Intel and AMD have learned since they started dabbling in the GPU business: GPUs are very, very different from CPUs. They are anything but simple. I think both of them (from a management standpoint) completely underestimated the complexity and were totally blind to how advanced GPUs really were. There is little chance Intel could build a dGPU that works in current games without using existing technologies. Heck, Nvidia and AMD are the backbone behind DX; it is an extension of their HW. Sure, nothing is impossible, but Intel really, really tried with Larrabee. They put billions into the project. They went back to the drawing board many times and completely redesigned Larrabee many times. They had massive engineering teams around the world working on it. It drastically changed from a pile of CPUs to something more like an x86 GPU. Many engineers walked off the project; even high-profile people just left. There were many, many issues that were extremely difficult to work out, and after billions of dollars they never were able to.

See, GPUs as we know them were built over time: extremely complex designs built on top of even more complex designs, years and years in the making. There used to be GPU makers in the double digits, and this filtered down to the few we have today. Current designs are on an unimaginable scale, built on foundation after foundation. We are so far along now that starting from scratch on a completely different route is next to impossible, especially if you want the result to work within the boundaries graphics cards must work in today (DX, OGL). It is just not a reasonable quest. Intel at least proved that much with Larrabee.

So yeah, sure, Intel could make a dGPU...
But look at how we got here and see if you think it's a real possibility.
 

rootheday3

Member
Sep 5, 2013
44
0
66
disclaimer: I work at Intel in the graphics team, but my opinions are my own...

@ocre

You are correct that Larrabee was overambitious and that for it to have succeeded it would have had to deliver top-notch perf and perf/W running DX and OGL at competitive prices vs AMD/NVidia cards AND then offer the fully programmable rendering pipeline as a bonus. That was an unachievable dream.

However, your assertion that Intel's progress in IGP is due to an infusion of NVidia IP is incorrect. Intel's IGP is 100% internally developed (I am talking about the "Gen" architecture used in Sandy Bridge, Ivy Bridge, Haswell, and Bay Trail, not the phone chips like Merrifield that use Imagination PowerVR).

Yes, Intel is making payments to NVidia but that is basically "protection" money - Intel develops its own GPUs and, as you say, it may independently reinvent something that NVidia already has patents for. To avoid having the risk of the $50B PC business held hostage to NVidia lawsuits all the time for patent infringement, Intel pays money to NVidia.

All of the improvement you have seen in Intel graphics in the last several years has been because Intel management continues to take graphics, media, and parallel computing very seriously and has made sizeable investments in headcount/R&D on HW and SW as well as the die area allocated to graphics.

While I can't disclose Intel roadmaps, you can expect to see sizable jumps in graphics performance at each power envelope in coming years.

As others have said, I don't expect Intel to enter the dGPU market for both business reasons and technical reasons. The business reasons:

1) Trying to break into the dGPU business would take a lot of work; the three-way battle for market share would put downward pressure on margins for all three companies. With Intel being the new entrant and having to prove itself, it would face the most price pressure. Also, remember that margins on dGPU dies are lower than on CPUs, and the dGPU dies have to be integrated into the cards + GDDR5 + connectors, fan, etc. Intel's portion of the profits would just be on the die, and it would have to build partnerships with the card makers - an effort parallel in scope/complexity to the work it does today with OEMs/ODMs on motherboards/laptops for CPUs. Where would all the extra headcount for that ecosystem work come from?

2) Intel's market share in PC graphics is steadily going up without dGPUs. There are two reasons: form factor and the "good enough" effect. Adding a dGPU to a laptop makes it thicker and makes the board design and cooling more complicated. Ditto for small-form-factor desktops (NUC) and all-in-ones. Traditional desktops are a small and shrinking part of the overall market. Adding a dGPU to a tablet or phone is simply not possible.

Re "Good enough" - People say "iGPUs will never be good enough" - well they may not be for the ultra enthusiast gamer, but for the vast bulk of corporate and consumer users, they are already. For casual and mainstream gamers, remember that the bar on GPU perf required for resolution/image quality is generally set by console ports - it would only take ~2-3x jump from current Intel iGP performance to match current gen consoles. Wanna bet that happens well before the current console generation ends?

What about the rise of 4K? We may get 4K displays for really sharp text, movie viewing (mostly fixed-function decoders), etc., but you could choose to run games at 1920x1080 without undue scaling issues. And many users would be happy to lower the image quality settings if it let them run games at reasonable frame rates on their laptop without having to own a separate gaming desktop.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
There are a certain few posters this will shut up, if you are who you say you are. I guess the evil mods will have to verify?
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
Yes, Intel is making payments to NVidia but that is basically "protection" money - Intel develops its own GPUs and, as you say, it may independently reinvent something that NVidia already has patents for. To avoid having the risk of the $50B PC business held hostage to NVidia lawsuits all the time for patent infringement, Intel pays money to NVidia.
Intel is paying Nvidia as part of the out-of-court settlement, correct? Nvidia sued Intel because Intel prohibited NVDA from making chipsets for the Intel platform, yes? So the protection money, as you call it, is not something Intel did willingly or by design, is that fair to say? Put another way, would Intel have paid Nvidia anything resembling "protection money" if it were not for the lawsuit?

...BTW, excellent post; thanks for taking the time, truly very informative.
 

rootheday3

Member
Sep 5, 2013
44
0
66
The lawsuit was about chipsets, yes. The settlement basically sets the chipset issue aside without Intel admitting any wrongdoing. The only chipset-related terms of the settlement are that Intel has to offer PCIe, which a dGPU could attach to, on certain form factors for some number of years. Frankly, with Intel putting the memory controller in the CPU and the CPU already having a GPU in it, there really wasn't a viable business for chipsets with graphics in them after Core 2 Duo, beginning with Arrandale/Clarkdale/Sandy Bridge.

The ongoing payments are for licensing graphics IP, but that is essentially a preventive measure to avoid patent lawsuits.
 

rootheday3

Member
Sep 5, 2013
44
0
66
In my earlier post I forgot one technical reason - unified memory. For heterogeneous compute, synchronizing data over PCIe is too slow and more complicated than the IGP and CPU sharing the same cache, fabric, physical memory, and page tables. Broadwell already has shared memory, including cache coherency. It would take a lot of extra work to do that for a dGPU as well, and performance would likely be worse...

Stated differently, for fine-grained heterogeneous compute, a really fast dGPU + CPU may be slower than a smaller iGPU + the same CPU, due to the bus transfer latency and bandwidth penalty in the dGPU case. For reference, the on-die bandwidth on the Haswell ring (including LLC) is something like 200GB/s with ~30 clk latency.
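
To put rough numbers on that, here is a back-of-envelope sketch. Only the ~200 GB/s and ~30-clock on-die figures come from the paragraph above; the buffer size, core clock, and PCIe 3.0 x16 latency/bandwidth values are assumptions for illustration, not Intel data.

```python
# Back-of-envelope model of one CPU<->GPU hand-off per compute step.
# Assumed: 64 KB working set, ~3.5 GHz core clock, PCIe 3.0 x16 at ~12 GB/s
# effective with ~1.5 us round-trip latency. Only the on-die numbers
# (~200 GB/s, ~30 clocks) are taken from the post above.

BUF_BYTES   = 64 * 1024          # hypothetical per-step working set
CPU_CLK_HZ  = 3.5e9              # assumed core clock

ONDIE_BW    = 200e9              # bytes/s, shared LLC/ring (from the post)
ONDIE_LAT_S = 30 / CPU_CLK_HZ    # ~30 clocks (from the post)

PCIE_BW     = 12e9               # bytes/s, assumed effective PCIe 3.0 x16
PCIE_LAT_S  = 1.5e-6             # assumed round-trip latency

def xfer_time(latency_s, bw, nbytes):
    """Simple latency + size/bandwidth model for one transfer."""
    return latency_s + nbytes / bw

ondie = xfer_time(ONDIE_LAT_S, ONDIE_BW, BUF_BYTES)
pcie  = 2 * xfer_time(PCIE_LAT_S, PCIE_BW, BUF_BYTES)  # copy out and back

print(f"on-die sharing : {ondie * 1e6:6.2f} us per step")
print(f"PCIe round trip: {pcie * 1e6:6.2f} us per step (~{pcie / ondie:.0f}x slower)")
```

With these assumed numbers the PCIe hand-off costs roughly 40x more per step. For one big bulk transfer the gap is only the bandwidth ratio, which is why a dGPU still wins on throughput-bound work but loses on fine-grained sharing.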
 

Ryanrenesis

Member
Nov 10, 2014
156
1
0
I'm pretty shocked by the level of knowledge and quality information on this forum compared to others...*cough*tom*cough*.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Stated differently, for fine-grained heterogeneous compute, a really fast dGPU + CPU may be slower than a smaller iGPU + the same CPU, due to the bus transfer latency and bandwidth penalty in the dGPU case. For reference, the on-die bandwidth on the Haswell ring (including LLC) is something like 200GB/s with ~30 clk latency.

No matter the latency, you will not reach the performance of a high-end dGPU like a GTX 780 Ti + Core i3 Haswell with any iGPU constrained to a sub-100W TDP.

To make it clear, let's say you only need a GTX 770-class iGPU + Core i7 Haswell to reach the same performance as a GTX 780 Ti + Core i3 Haswell.
By the time you can manufacture a Core i7 + GTX 770-class iGPU at a sub-100W TDP, the dGPUs of that day will be 4-6 times faster than the current GTX 780 Ti.
Not only that, dGPUs of that time will also share memory with the entire system and have many more features aimed directly at lowering data-communication latencies, etc.

I could see a 200W TDP APU being directly competitive against a 55W CPU + 250W TDP dGPU, but sub-100W APUs will not have a chance, not now and not in the future (5-10 years).
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
No matter the latency, you will not reach the performance of a high-end dGPU like a GTX 780 Ti + Core i3 Haswell with any iGPU constrained to a sub-100W TDP.

To make it clear, let's say you only need a GTX 770-class iGPU + Core i7 Haswell to reach the same performance as a GTX 780 Ti + Core i3 Haswell.
By the time you can manufacture a Core i7 + GTX 770-class iGPU at a sub-100W TDP, the dGPUs of that day will be 4-6 times faster than the current GTX 780 Ti.
Not only that, dGPUs of that time will also share memory with the entire system and have many more features aimed directly at lowering data-communication latencies, etc.

I could see a 200W TDP APU being directly competitive against a 55W CPU + 250W TDP dGPU, but sub-100W APUs will not have a chance, not now and not in the future (5-10 years).

Nvidia's GTX 970, which is a high-end GPU, is only 145W. If you make it mid-range and add in DirectX 12 and Intel's manufacturing advantage, especially if that advantage grows even further, then you have a 95W APU (with an i3) that is good enough for most of the market.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Nvidia's GTX 970, which is a high-end GPU, is only 145W. If you make it mid-range and add in DirectX 12 and Intel's manufacturing advantage, especially if that advantage grows even further, then you have a 95W APU (with an i3) that is good enough for most of the market.

We are not talking about good enough here.