Was thinking of Haswell; what's keeping Intel from a discrete GPU now?

Unoid

Senior member
Dec 20, 2012
461
0
76
Thinking about the GT3e graphics Haswell will have: a 40 execution unit part that's solid and is improving performance by leaps and bounds with each generation.

What is keeping Intel from designing a discrete GPU?
They have all the resources to do it.

Release a GDDR5 card with 1000 EUs and call it a day!

No need for the Knights Ferry crap.
 

el-Capitan

Senior member
Apr 24, 2012
572
2
81
You make it sound too easy.
Why would they dedicate so many resources to a market that is already fully carved up between two competitors, with margins close to starvation, when Intel itself believes the future lies in on-die or soldered-on graphics?

There is too little to gain
 

alcoholbob

Diamond Member
May 24, 2005
6,390
469
126
Probably afraid it will either bomb or the margins won't be worth it after the Larrabee disaster.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's basically a step backwards. The future is integration. Why put all of those resources into dying tech? It'll just take resources from something else.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
It's just too late in the game to enter the market at this point; the opportunity cost is large, while the gains would be short-lived at best. Discrete is a declining market, especially when integrated parts are what most consumers demand, and Intel has been making big strides there over the past few years. Also consider that computing is largely going mobile for the mass consumer market, which ties in with the decline of desktop and discrete. So Intel won't enter the market; the profits are limited at best.

Besides which, Intel already has the majority of the overall graphics market with their integrated graphics.
 

(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
Aren't they using Nvidia tech?

Maybe that would have something to do with it as well?

All the GPU makers use each other's tech. They all have cross-licensing agreements with each other, as it's pretty much impossible to make a GPU without using AMD's, Nvidia's, Intel's, and others' patents.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
The competition is too tough. AMD/Nvidia would slaughter any attempt on power and price margins.

Intel is a node shrink ahead of AMD and Nvidia, so they actually have some room for error if they wanted to jump into the GPU market: their GPU wouldn't even have to be as good from a technological standpoint starting out, since they could just pack more transistors into the same die space.
 

NTMBK

Lifer
Nov 14, 2011
10,466
5,852
136
Intel aren't interested in dGPUs at all. They want to steal that part of the BoM back from the graphics card providers! They want to provide a range of APUs which will satisfy anyone, so if you want high performance graphics you buy the Core i7 with a HD 9000 (or whatever it's called by then).
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Not worth it to them. If it were, they would do it. They already own like 60-65% of the total GPU market.
 
Feb 19, 2009
10,457
10
76
The competition is too tough. AMD/Nvidia would slaughter any attempt on power and price margins.

Nope. The difference is Intel actually owns many fabs, with more advanced tech than TSMC or GloFo.

If Intel bothered to push into the discrete GPU market, they would devastate NV and AMD in the consumer space and push NV into the HPC-only sector.

People need to give Intel credit: they are always ahead of the game when it comes to semiconductors; they only lack the direction to head this way.
 
Feb 19, 2009
10,457
10
76
It's basically a step backwards. The future is integration. Why put all of those resources into dying tech? It'll just take resources from something else.

There's always a need for discrete, because there's always a small population of gamers who want to push their graphics to the max. You simply cannot do that with an iGPU given die space and thermal limits.
 

Eureka

Diamond Member
Sep 6, 2005
3,822
1
81
Nope. The difference is Intel actually owns many fabs, with more advanced tech than TSMC or GloFo.

If Intel bothered to push into the discrete GPU market, they would devastate NV and AMD in the consumer space and push NV into the HPC-only sector.

People need to give Intel credit: they are always ahead of the game when it comes to semiconductors; they only lack the direction to head this way.

I think you're being a little too optimistic. Just because they have fabs with better tech doesn't mean they have designs to print on those fabs.

AMD and NV both have two decades of designs to work on, and the engineers who have that experience.

Unless Intel starts poaching engineers from AMD/NV, or has been secretly working on large GPUs in a black R&D lab, it won't be cost effective to jump into this game.

Not to mention, just because gamers want a discrete GPU doesn't mean we'll get one. If the day comes that this market is no longer profitable, discrete cards will just go the way of the dodo.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is a chance that in the next 10 years the discrete GPU market will just be gone. Everyone will need some amount of graphics capability, but once the integrated parts get good enough, as more and more transistors and capabilities find their way onto the CPU, we may well find that discrete cards don't offer much advantage.

Intel could build a discrete card; they have certainly tried, albeit via a somewhat odd direction. But Intel doesn't just lean on its fabrication technology, it also leans on its x86 IP, and combining the two takes them down a route that none of their competitors can really follow. Graphics is a special mass-market case of highly parallel computation, depending on a lot of floating point, various fused instructions, and some quite specific operations. It's certainly possible to write a graphics pipeline as a CPU program, and given enough compute resources it should be possible to create a generic version of a GPU that has a lot more applications. If you have ever tried OpenCL/CUDA programming, I can tell you it's not pretty: it's very specific, and you have to change your software a lot to make it work.
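To make that last point concrete, here is a minimal sketch (my illustration, not something from the post itself) of the kind of restructuring involved: on the CPU the work is a one-line loop, while the CUDA version needs a kernel, explicit memory management, and a launch configuration. The kernel name, sizes, and values are arbitrary placeholders.

Code:
#include <cstdio>
#include <cuda_runtime.h>

// CPU version of the same work is a single trivial loop:
//   for (int i = 0; i < n; ++i) y[i] = a * x[i] + y[i];
//
// The GPU version has to be rewritten as a kernel that runs across
// thousands of threads, plus explicit buffer and launch management.

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Allocate memory the GPU can see (unified memory keeps the example short).
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Pick a grid/block shape and launch; none of this exists in the CPU loop.
    int block = 256;
    int grid  = (n + block - 1) / block;
    saxpy<<<grid, block>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 3*1 + 2 = 5
    cudaFree(x);
    cudaFree(y);
    return 0;
}

None of this maps back onto ordinary CPU code without a rewrite, which is exactly why "just run your software on the GPU" never works out of the box.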

I think Intel is on the right long-term path: integrated graphics for the usual desktop world, and massively parallel compute with their current Phi research projects, both of which have the potential to completely change the discrete GPU market. They don't just want to compete in the market; they are attempting to put themselves years ahead of any potential competition.
 

Unoid

Senior member
Dec 20, 2012
461
0
76
There is a chance that in the next 10 years the discrete GPU market will just be gone. Everyone will need some amount of graphics capability, but once the integrated parts get good enough, as more and more transistors and capabilities find their way onto the CPU, we may well find that discrete cards don't offer much advantage.

Intel could build a discrete card; they have certainly tried, albeit via a somewhat odd direction. But Intel doesn't just lean on its fabrication technology, it also leans on its x86 IP, and combining the two takes them down a route that none of their competitors can really follow. Graphics is a special mass-market case of highly parallel computation, depending on a lot of floating point, various fused instructions, and some quite specific operations. It's certainly possible to write a graphics pipeline as a CPU program, and given enough compute resources it should be possible to create a generic version of a GPU that has a lot more applications. If you have ever tried OpenCL/CUDA programming, I can tell you it's not pretty: it's very specific, and you have to change your software a lot to make it work.

I think Intel is on the right long-term path: integrated graphics for the usual desktop world, and massively parallel compute with their current Phi research projects, both of which have the potential to completely change the discrete GPU market. They don't just want to compete in the market; they are attempting to put themselves years ahead of any potential competition.

Thanks for the informative IDC-esque post :)

I'm thinking of how Intel could better utilize their fabs, instead of running them at 50%.

They have entire GPUs, albeit attached to northbridges or integrated into CPUs. On a simplistic view of what they have done and can do, one would assume Intel could just SCALE the GT3 die up to a size on par with Nvidia and AMD, and hopefully it would be close to competitive (drivers aside).
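As a rough illustration of that back-of-the-envelope scaling (every number below except the 40 EU figure from the thread is a made-up placeholder, not a real die measurement), the naive arithmetic looks like this:

Code:
#include <cstdio>

int main() {
    // Placeholder assumptions, purely for illustration:
    const float gt3_eus      = 40.0f;   // EUs in a Haswell GT3 slice (from the thread)
    const float gt3_area_mm2 = 90.0f;   // assumed area of that slice (made-up figure)
    const float big_die_mm2  = 350.0f;  // assumed budget for a mid/high-end discrete die
    const float overhead_mm2 = 80.0f;   // assumed memory controllers, display, uncore, etc.

    // Naive linear scaling: how many EUs would fit in the remaining area?
    float area_per_eu = gt3_area_mm2 / gt3_eus;
    float eus = (big_die_mm2 - overhead_mm2) / area_per_eu;
    printf("Naive estimate: ~%.0f EUs\n", eus);  // ignores interconnect, power, yield...
    return 0;
}

Of course that ignores interconnect, bandwidth, power, and yield, which is why "just scale it up" is easier said than done.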

Discrete GPUs aren't going anywhere, especially when we see a push for more ray tracing, which will require 300-watt powerhouses.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Intel can see the writing on the wall that dGPUs are a dying market. The PS4 is an indicator of things to come: an SoC capable of next-gen gaming.
 

ghost03

Senior member
Jul 26, 2004
372
0
76
IMO, Intel uses GPUs as a way to sell their CPUs.

Not only are they not interested in GPUs for the sake of GPUs, but a discrete GPU would actually go against their intentions.

We might see a GPGPU specific chip from them sometime, but I am personally doubtful.
 

Jimzz

Diamond Member
Oct 23, 2012
4,399
190
106
Besides what has already been said, look at how much people complain about drivers from AMD/Nvidia. Intel's drivers are much worse. So take awful drivers and try to support a high-end card/chip line where people will test every inch of your hardware and drivers.

If Intel can't even get their basic drivers in line, what do you think will happen when they have to support even more?