Why can't Intel compete with AMD in the GPU sector?

john5220

Senior member
Mar 27, 2014
551
0
0
Why exactly? It doesn't add up; we are talking about the biggest chip giant in the world here. AMD is much more handicapped than they are, yet AMD runs the show in the GPU sector. In fact, not long ago they had Nvidia beaten pretty badly in the GTX 400 and 500 era; those chips were just too hot and power hungry. In GPUs, AMD is sometimes like the Intel of CPUs, so to speak.

Intel has a LOT of money, but they cannot make a video card? How come? Apart from AMD always beating them in iGPUs, Intel also has no discrete video card to this day.

I was hoping to hear something about their Larrabee project again. I thought it was going to be a third player in the GPU market, but I was wrong.

I heard of it once and never again. Why did they give up on video cards?

Do you all think that if Intel invested all their money, they could make video cards like the R9 290X, etc.?

[Image: INTC_KnightsFerry_Board2_68.jpg - Intel Knights Ferry board]
 

Ryanrenesis

Member
Nov 10, 2014
156
1
0
Intel hardly needs to invest "all their money" in video cards to compete with AMD. It's probably that start-up costs, like additional infrastructure, are too high at the moment, plus the additional management required, which would spread their dominance and focus on CPUs too thinly.

Still, Intel has so much money to spend and so many assets to secure debt financing against that I just don't believe Intel lacks the capital to invest in the GPU market.

There must be other reasons.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
The Larrabee project changed its focus to general-purpose SPMD performance once it was pretty clear to Intel that it wouldn't have been competitive in the graphics segment in terms of performance. The reason graphics performance wasn't competitive in the first place was that it lacked fixed-function units like triangle setup engines, tessellation units, rasterizers, and blending units to accelerate parts of the graphics pipeline the way a conventional GPU would.

Its successor, Intel MIC, inherits many traits of Larrabee, such as its 512-bit-wide SIMD units and the coherent L2 cache.
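
To give a rough idea of what "general-purpose SPMD on a 512-bit SIMD unit" means in practice, here's a minimal sketch (my own illustration with an assumed build line, not anything from Intel): software alpha blending over a scanline, 16 float lanes per iteration. Blending is one of the fixed-function jobs a conventional GPU does in hardware and Larrabee would have had to do in code like this. It uses AVX-512F intrinsics, the ISA of the later Knights Landing parts; Larrabee/Knights Corner used a different, but similarly 512-bit-wide, vector instruction set.

Code:
/* Illustrative sketch only: software alpha blending over a scanline,
 * 16 pixels per iteration, in the SPMD-on-wide-SIMD style Larrabee was
 * built around. AVX-512F intrinsics; build line is an assumption:
 *   gcc -O2 -mavx512f -c blend.c
 */
#include <immintrin.h>

/* dst = src*alpha + dst*(1-alpha), one float channel per pixel */
void blend_scanline(float *dst, const float *src, const float *alpha, int n)
{
    int i = 0;
    for (; i + 16 <= n; i += 16) {                 /* 16 lanes per 512-bit register */
        __m512 s   = _mm512_loadu_ps(src + i);
        __m512 d   = _mm512_loadu_ps(dst + i);
        __m512 a   = _mm512_loadu_ps(alpha + i);
        __m512 inv = _mm512_sub_ps(_mm512_set1_ps(1.0f), a);
        __m512 out = _mm512_fmadd_ps(s, a, _mm512_mul_ps(d, inv)); /* s*a + d*(1-a) */
        _mm512_storeu_ps(dst + i, out);
    }
    for (; i < n; i++)                             /* scalar tail for leftover pixels */
        dst[i] = src[i] * alpha[i] + dst[i] * (1.0f - alpha[i]);
}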
 

john5220

Senior member
Mar 27, 2014
551
0
0
I personally feel that Intel has the capability to do this. It might cost some money, but they have it; if you look at the Iris Pro, it's a pretty badass GPU.

Imagine beefing it up for a desktop setup, where power and heat are less of an issue. I bet they could do something similar to a GTX 750 Ti.

A GTX 980 or R9 295X2 I am not so sure about. But this is big business; look how rich Nvidia has gotten at it. It makes no sense for Intel to sit idly by. AMD is much poorer and more handicapped than they are, yet AMD pulls this off so easily?
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
I personally feel that Intel has the capability to do this. It might cost some money, but they have it; if you look at the Iris Pro, it's a pretty badass GPU.

Imagine beefing it up for a desktop setup, where power and heat are less of an issue. I bet they could do something similar to a GTX 750 Ti.

A GTX 980 or R9 295X2 I am not so sure about. But this is big business; look how rich Nvidia has gotten at it. It makes no sense for Intel to sit idly by. AMD is much poorer and more handicapped than they are, yet AMD pulls this off so easily?

It's not about having more capability to execute ...

It's a question of TRUST! A lot of consumers aren't willing to give Intel the time of day when it comes to a discrete GPU.

How do you propose that Intel can gain a solid foothold in the discrete graphics market when they have no following ?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Plus, Intel's IGP / GPU drivers are legendary... for poor quality and support. (No Haswell IGP drivers for XP, for example.)
 

john5220

Senior member
Mar 27, 2014
551
0
0
^^ But Intel has lots of fans who swear by the i5 and i7. I have seen people on this forum swear by anything Intel to the very end.

I would have assumed this would be enough to woo in the fanboys, etc.?

I am not sure what Intel is doing about Qualcomm's massive foothold in smartphones; that company now has half as much money as Intel.
I think they did a piss-poor job with Atom in smartphones and tablets; they had the chance but had no idea what they were doing.

I'd imagine we will see a discrete GPU from Intel in the coming years; it would most likely stem from their experience with Iris Pro, etc. You know, Iris Pro is actually a brilliant invention that runs in super-slim Retina display MacBook Pros and packs a good punch, yet you could NEVER buy a desktop i7 with it. And then, when you consider someone buying an i7, why would they even use the iGPU in a desktop? Iris Pro seems so useless for desktops.
 
Aug 11, 2008
10,451
642
126
Discrete GPUs are a declining market as integrated graphics becomes stronger. I don't accept the death of the dGPU like some here propose, but for sure it is not a growing market. Considering there are two already strongly established makers, I think they want to focus resources elsewhere.
 

SunburstLP

Member
Jun 15, 2014
86
20
81
I'll never forgive Intel for the i740, but their Linux drivers are really good.

If Intel had a wild hair up their butt one day and decided to, I'm fully confident that they could not only afford the talent, but they could also afford the associated R&D. I'm not sure how to feel about that, tbh.
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
Plus, Intel's IGP / GPU drivers are legendary...for poor quality and support.

I doubt you are aware of Intel's driver status.

(No Haswell IGP drivers for XP, for example.)

This is the worst example you could bring up.


They don't have ambitions to compete in the discrete market. When we talk about integrated iGPUs, I expect that even Intel's desktop GT2 for Skylake will be able to compete with the fastest desktop Kaveri. Their Gen7 is 2.5-year-old technology with flaws, and Haswell is based on it. For desktop, Skylake-S will bring the first real GPU update since 2012's Ivy Bridge.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Broadwell-K is the next big iGPU update, isn't it? At one point, at least, it looked like that might be the Iris Pro/i7 combination sort of desktop chip mentioned earlier.

Not that anything around Broadwell is clear right now :)

By the way, 750 Ti-level performance is surely a very conservative target for iGPU performance. Once HBM comes in, memory bandwidth will be much less of an issue, so it will 'just' be limited by power draw/die space.

OK, their base architecture is likely not as efficient as Maxwell, but the process difference should make a 60 W-equivalent iGPU very comfortable. Even the power draw of the 970/980 isn't in an insane sort of range.
(I imagine that AMD is more likely to really push the limits than Intel.)
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
A few comments from an Intel CPU architect:

http://www.reddit.com/r/IAmA/comments/15iaet/iama_cpu_architect_and_designer_at_intel_ama/c7mqkld

http://www.reddit.com/r/IAmA/comments/15iaet/iama_cpu_architect_and_designer_at_intel_ama/c7mpdki:

On-die, are you willing to pay for the die area? I suggest you look the perf/mm2 and perf/W of our Gen graphics. We're working very hard to improve Windows and Linux drivers to compliment the hardware. If you're expecting discrete graphics, then you'll be disappointed.

http://www.reddit.com/r/IAmA/comments/15iaet/iama_cpu_architect_and_designer_at_intel_ama/c7mpg8v
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
They don't have ambitions to compete in the discrete market. When we talk about integrated iGPUs, I expect that even Intel's desktop GT2 for Skylake will be able to compete with the fastest desktop Kaveri. Their Gen7 is 2.5-year-old technology with flaws, and Haswell is based on it. For desktop, Skylake-S will bring the first real GPU update since 2012's Ivy Bridge.

Kaveri's iGPU tech is 3-year-old GCN, not to mention that Kaveri will be a more-than-one-year-old product when Skylake is released. Skylake will have to compete against Carrizo, and something tells me Skylake GT2 will have a hard time competing against Carrizo. And if that turns out to be true, then Intel's iGPU tech will look even more pathetic, because Carrizo will still be a 28nm product.
 

nenforcer

Golden Member
Aug 26, 2008
1,779
20
81
Intel doesn't have to compete with either ATI (AMD) or Nvidia in the discrete graphics sector because they already dominate total graphics market share through their iGPUs, with something like 40% of the market and possibly more.

The vast majority of desktop systems (think businesses) don't ship with discrete graphics, which is where Intel is strongest.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Intel doesn't have to compete with either ATI (AMD) or Nvidia in the discrete graphics sector because they already dominate total graphics market share through their iGPUs, with something like 40% of the market and possibly more.

The vast majority of desktop systems (think businesses) don't ship with discrete graphics, which is where Intel is strongest.

more like 70%
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Why exactly? It doesn't add up; we are talking about the biggest chip giant in the world here. AMD is much more handicapped than they are, yet AMD runs the show in the GPU sector. In fact, not long ago they had Nvidia beaten pretty badly in the GTX 400 and 500 era; those chips were just too hot and power hungry. In GPUs, AMD is sometimes like the Intel of CPUs, so to speak.

The history of the dGPU market (PC, professional, and HPC) is AMD and Nvidia fighting over the performance crown, but Nvidia laughing all the way to the bank with the profits. AMD never could make a significant amount of money from its GPU business. Nvidia will sell around 3.5 billion dollars in GPUs this year and end up with roughly 1 billion in operating profit from this business. Is it worth it for Intel to invest to break into a market whose revenue has been largely flat since 2008, or should they focus on a potentially high-growth segment (HPC) while salting the earth for the other two players by increasing the die share and performance of their iGPU?

I personally thought that once Intel reached a certain performance threshold in their iGPU efforts they would launch a dGPU, because the marginal cost for them to do so would be small, but given the latest developments (HMC), they might just beef up their iGPU and the integration between CPU and GPU and reduce the dGPU to a very small niche market.

Don't get me wrong, Nvidia will still make plenty of money until that happens, and they are developing an SoC themselves that should make use of stacked memory; this is just the calculus Intel should be making on whether or not to enter the business.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
There are 2 reasons.

1. The dGPU is a dying breed. The JPR numbers speak for themselves: Intel now sits on 71.8% of the graphics market and is still growing rapidly.
2. ASPs on dGPUs are too low. The only place with high ASPs is the professional/HPC market, and the Xeon Phi is fantastic in HPC.

Nvidia is really the only graphics company making money.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Intel can compete with AMD in the GPU sector, but they do not need to have the better GPU. They already have a good (good, not just decent) iGPU.


I personally think that Intel spending a lot of R&D resources to compete with AMD on the iGPU is a waste of money. They already have competitive performance with their GPUs, and the dGPU market is a bit too small for them to enter.
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
Kaveri's iGPU tech is 3-year-old GCN, not to mention that Kaveri will be a more-than-one-year-old product when Skylake is released. Skylake will have to compete against Carrizo, and something tells me Skylake GT2 will have a hard time competing against Carrizo. And if that turns out to be true, then Intel's iGPU tech will look even more pathetic, because Carrizo will still be a 28nm product.


For an APU it's pretty new. You can't compare a CPU + integrated GPU on a whole new platform with a dedicated standalone GPU, because validation times are much longer. There is a reason why AMD's APUs lag behind their standalone GPUs. I doubt Carrizo will bring a big improvement over Kaveri, by the way, and its desktop existence is still unclear as well.
 

john5220

Senior member
Mar 27, 2014
551
0
0
The dGPU is a dying breed?

Everything is shifting to high-pixel-density displays now, ever since the launch of the original "Retina display."

Has anyone seen how a dGPU runs at 4K with high anti-aliasing? It can't even manage 1 FPS properly. Even the GTX Titan, which is so advanced, can't pull that off unless we are talking a 1400 W power supply and 3-way SLI.

It would probably take something like 70 years for iGPUs to catch up with the power of a 3-way SLI GTX Titan combo.

I don't think any serious gamer is going to be using an iGPU to run at 4K with AA.
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
For an APU it's pretty new. You can't compare a CPU + integrated GPU on a whole new platform with a dedicated standalone GPU, because validation times are much longer. There is a reason why AMD's APUs lag behind their standalone GPUs. I doubt Carrizo will bring a big improvement over Kaveri, by the way, and its desktop existence is still unclear as well.

So your whole argument is that Intel can improve and AMD can't?
And what is this whole new platform with a dedicated GPU you speak of?
And what do you mean, desktop existence is unclear?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I'm going to try to cover every single reason, OP, and I'll let the more technical people elaborate on them.

First though, it wasn't AMD vs. Intel vs. Nvidia. Since we're going back in time to talk about something like Larrabee, you have to realize it was AMD vs. Intel for CPUs and Nvidia vs. ATI for GPUs.

In 2006, AMD acquired ATI with a new idea: the APU! Great integrated graphics performance.
Before this, AMD had just released its most competitive architecture yet, which wiped the floor with the NetBurst Pentium 4s.
So in 2006, AMD acquired ATI and focused on graphics performance.
At the same time, Intel focused on CPU performance and released Conroe.

Subsequently, we got one of the most revolutionary processors ever, and it destroyed AMD while AMD's focus was on graphics (HUGE MISTAKE). Since then, AMD has never recovered or caught up.

Now ask yourself: having seen how, in the past, a CPU maker that focused on graphics dug its own grave, why would Intel follow that same path?

Now let's move on to actual performance. Does performance matter when selling a product? In some respects yes, but mostly it doesn't. Just ask Intel again: they sold the Pentium 4 and held, I believe, around 75% of the market at the time with a HORRENDOUS architecture. AMD had around 25%, I believe, with a far better-performing processor. Did it matter? No.

Now let's fast-forward to today and look at AMD vs. Nvidia.
For some quick and dirty stats (not representative of the whole market, just of gamers):
http://store.steampowered.com/hwsurvey/videocard/
AMD has 30%, Nvidia 50%, and Intel 20%. Yes, 20% among GAMERS (how can you even game on an iGPU?). So despite AMD having a VASTLY superior product to Intel for GPUs, they are only 10 points ahead.
But the real question is: why is Nvidia at 50% of the market? Simple: brand-name recognition. Plain and simple, AMD has released better GPUs than Nvidia multiple times, and yet Nvidia almost ALWAYS outsells them. It's not even just that they outsell them; it's that Nvidia can charge more than AMD for the same performance. The only way for AMD to compete with Nvidia is on price-to-performance. Consumers don't purchase based off graphs on a hardware site (very few do, anyway); they purchase based off brand-name recognition.

Now let's jump to Intel's move into the mobile market, specifically tablets. AtenRa loves to laugh at Intel's contra-revenue scheme, but it's the ONLY way Intel can get into the market. Even though Intel does have a competitive processor for tablets, they almost have to give it away for free to get it used. Why? Because even when the customer is a business, they don't want an unknown. They want something that is known, and they will pick a Snapdragon processor 95 times out of 100 over Intel's Bay Trail, Clover Trail, etc., because they know it.

Conclusion:
Intel focusing on GPUs would be a massive mistake.
Cost: It would take a TON of cash for Intel to get the facilities, research, and marketing needed to bring a GPU to market.
Benefit: The revenue AMD makes from GPUs is a drop in the bucket compared to the ~$15 billion in revenue per QUARTER Intel makes on CPUs, hence why they focus their efforts there (Nvidia, which sells more GPUs than AMD, hits around $1 billion a quarter in total revenue, for reference).
Performance: Even if you have the best-performing product, brand loyalty matters. Just look at the number of users in the VC forum who upgrade YEARLY to the new SLI cards. Nvidia sold a single card for $1000 with no issues... have AMD or even Intel try that.

Etc....

In short, not worth it.....

Edit: In 2013, Intel pulled in $52 billion in revenue. AMD pulled in $5 billion. Focus all their money on a market that isn't even 10% of their revenue? No thanks.....
 