Was thinking of Haswell; what's keeping Intel from a discrete GPU now?

Unoid

Senior member
Thinking about the GT3e graphics Haswell will have: 40 execution units, which is solid, and performance is increasing by leaps and bounds with each generation.

What is keeping Intel from designing a discrete GPU?
They have all the resources to do it.

Release a GDDR5 card with 1000 EUs and call it a day!

No need for that Knights Ferry crap.
 
You make it sound too easy.
Why would they dedicate so many resources to a market that is already fully carved up between two competitors, with margins close to starvation, when Intel itself believes the future lies in on-die or soldered-on graphics?

There is too little to gain.
 
It's basically a step backwards. The future is integration. Why put all of those resources into dying tech? It'll just take resources from something else.
 
It's just too late in the game to enter the market at this point; the opportunity cost is large while the gains would be short-lived at best. Discrete is a declining market, especially when integrated parts are what most consumers demand, and Intel has been making big strides there in recent years. Also consider that computing is largely going mobile for the mass consumer market, which ties in with the decline of desktop and discrete. So Intel would not enter the market; the profits are limited at best.

Besides which, Intel already has the majority of the overall graphics market share through its integrated graphics.
 
Aren't they using Nvidia tech?

Maybe that would have something to do with it as well?

All the GPU makers use each other's tech. They all have cross-licensing agreements with each other as it's pretty much impossible to make a GPU without using AMD, Nvidia, Intel, and other's patents.
 
The competition is too tough. AMD/Nvidia would slaughter any attempt on power and price margins.

Intel is a node shrink ahead of AMD and Nvidia, so they actually have some room for error if they wanted to jump into the GPU market: their GPU wouldn't even have to be as good from a technological standpoint starting out, since they could just pack more transistors into the same die space.
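As a rough back-of-the-envelope sketch of that density argument (assuming ideal area scaling with feature size, which real designs never fully achieve; the node figures are just the ones relevant at the time):

```python
# Ideal-scaling sketch: transistor density grows with the square of the
# feature-size ratio. 22 nm (Intel) vs. 28 nm (TSMC/GloFo foundry node).
def density_advantage(node_a_nm: float, node_b_nm: float) -> float:
    """Ideal transistor-density ratio of process A over process B."""
    return (node_b_nm / node_a_nm) ** 2

ratio = density_advantage(22.0, 28.0)
print(f"Ideal density advantage of 22 nm over 28 nm: {ratio:.2f}x")  # ~1.62x
```

So even under this idealized math, a one-node lead buys roughly 60% more transistors in the same area, which is the "room for error" being described.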
 
Intel isn't interested in dGPUs at all. They want to steal that part of the BoM back from the graphics card providers! They want to provide a range of APUs that will satisfy anyone, so if you want high-performance graphics you buy the Core i7 with an HD 9000 (or whatever it's called by then).
 
The competition is too tough. AMD/Nvidia would slaughter any attempt on power and price margins.

Nope. The difference is Intel actually owns many fabs, which have more advanced tech than TSMC or GloFo.

If Intel bothered to push into the discrete GPU market, they would devastate NV and AMD in the consumer space, and push NV into the HPC-only sector.

People need to give Intel credit; they are always ahead of the game when it comes to semiconductors. They only lack the direction to head this way.
 
It's basically a step backwards. The future is integration. Why put all of those resources into dying tech? It'll just take resources from something else.

There's always a need for discrete, because there's always a small population of gamers who want to push their graphics to the max. You simply cannot do that with an iGPU, given die-space and thermal limits.
 
Nope. The difference is Intel actually owns many fabs, which have more advanced tech than TSMC or GloFo.

If Intel bothered to push into the discrete GPU market, they would devastate NV and AMD in the consumer space, and push NV into the HPC-only sector.

People need to give Intel credit; they are always ahead of the game when it comes to semiconductors. They only lack the direction to head this way.

I think you're being a little too optimistic. Just because they have fabs with better tech doesn't mean they have designs to print using those fabs.

AMD and NV both have two decades of designs to work on, and the engineers who have that experience.

Unless Intel starts poaching engineers from AMD/NV, or has been secretly working on large GPUs in a black R&D lab, it won't be cost-effective to jump into this game.

Not to mention, just because gamers want a discrete GPU doesn't mean we'll get one. If the day comes when this market is no longer profitable, then discrete cards will just go the way of the dodo.
 
There is a chance that in the next 10 years the discrete GPU market will just be gone. Everyone will need some amount of graphics capability, but once the integrated parts get good enough, as more and more transistors and capabilities make their way onto the CPU, we may well find that discrete cards don't offer much advantage.

Intel could build a discrete card; they have certainly tried, albeit via a somewhat odd direction. But Intel doesn't just lean on its fabrication technology; it also leans on its IP with x86, and combining the two takes them down a route that none of their competitors can really follow. Graphics is a special mass-market case of highly parallel computation, depending on a lot of floating point, various fused instructions, and some quite specific computations. It's certainly possible to write a graphics pipeline as a CPU program, and given enough compute resources it should be possible to create a generic version of a GPU that has many more applications. If you have ever tried OpenCL/CUDA programming, I can tell you it's not pretty; it's very specific, and you have to change your software a lot to make it work.

I think Intel is on the right long-term path: integrated for the usual desktop world, and massive parallelism with their current Phi research projects, both of which have the potential to completely change the discrete GPU market. They don't just want to compete in the market; they are attempting to put themselves years ahead of any potential competition.
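To illustrate the restructuring point, here is a plain-Python sketch (not real OpenCL/CUDA; the kernel and the simulated launch are my own stand-ins) contrasting a serial CPU loop with the kernel-per-index style a GPU API forces on you:

```python
# Serial CPU version: one loop, implicit ordering, easy to write.
def saxpy_serial(a, x, y):
    return [a * xi + yi for xi, yi in zip(x, y)]

# GPU-style version: the same math rewritten as a "kernel" that handles
# exactly one index, with the loop replaced by a launch over a global
# index range -- the restructuring OpenCL/CUDA demands.
def saxpy_kernel(gid, a, x, y, out):
    out[gid] = a * x[gid] + y[gid]

def launch(kernel, global_size, *args):
    # Stand-in for a GPU runtime dispatching one work-item per index.
    for gid in range(global_size):
        kernel(gid, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * len(x)
launch(saxpy_kernel, len(x), 2.0, x, y, out)
print(out)  # same result as saxpy_serial(2.0, x, y)
```

Even for something this trivial, the GPU formulation forces you to invert control (the runtime owns the loop) and to pre-allocate output buffers, which is a taste of why porting real software is so invasive.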
 
There is a chance that in the next 10 years the discrete GPU market will just be gone. Everyone will need some amount of graphics capability, but once the integrated parts get good enough, as more and more transistors and capabilities make their way onto the CPU, we may well find that discrete cards don't offer much advantage.

Intel could build a discrete card; they have certainly tried, albeit via a somewhat odd direction. But Intel doesn't just lean on its fabrication technology; it also leans on its IP with x86, and combining the two takes them down a route that none of their competitors can really follow. Graphics is a special mass-market case of highly parallel computation, depending on a lot of floating point, various fused instructions, and some quite specific computations. It's certainly possible to write a graphics pipeline as a CPU program, and given enough compute resources it should be possible to create a generic version of a GPU that has many more applications. If you have ever tried OpenCL/CUDA programming, I can tell you it's not pretty; it's very specific, and you have to change your software a lot to make it work.

I think Intel is on the right long-term path: integrated for the usual desktop world, and massive parallelism with their current Phi research projects, both of which have the potential to completely change the discrete GPU market. They don't just want to compete in the market; they are attempting to put themselves years ahead of any potential competition.

Thanks for the informative IDC-esque post 🙂

I'm thinking of how Intel could better utilize their fabs, instead of running them at 50%.

They have entire GPUs already, albeit attached to northbridges or integrated into CPUs. Taking a simplistic approach, knowing what they have done and can do, one would assume Intel could just scale the GT3 die up to a size on par with Nvidia and AMD, and hopefully it would come close to competing (drivers aside).
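The "just scale it" idea is simple proportional arithmetic. A purely illustrative sketch (the area figures below are made-up placeholders, not real die measurements; only the 40-EU count comes from the thread):

```python
# If a graphics block with N execution units occupies A mm^2, a discrete
# die budget of B mm^2 could in principle hold roughly N * B / A EUs.
# This ignores uncore, memory controllers, and scaling inefficiencies.
def scaled_eu_count(base_eus: int, base_area_mm2: float,
                    target_area_mm2: float) -> int:
    return int(base_eus * target_area_mm2 / base_area_mm2)

# 40 EUs (GT3) in an assumed 80 mm^2 block, scaled to an assumed
# 400 mm^2 discrete die:
print(scaled_eu_count(40, 80.0, 400.0))  # -> 200
```

Of course, real GPUs don't scale linearly like this (interconnect, memory bandwidth, and power all intervene), which is part of why "just scale GT3" is easier said than done.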

Discrete GPUs aren't going anywhere, especially when we see a push for more ray tracing, which will require 300-watt powerhouses.
 
Intel can see the writing on the wall that dGPUs are a dying market. The PS4 is an indicator of things to come: an SoC capable of next-gen gaming.
 
IMO, Intel uses GPUs as a way to sell their CPUs.

Not only are they not interested in GPUs for the sake of GPUs, but a discrete GPU would actually go against their intentions.

We might see a GPGPU specific chip from them sometime, but I am personally doubtful.
 
Besides what has been said, look at how much people complain about drivers from AMD/Nvidia. Intel's drivers are much worse. So take awful drivers and then try to support a high-end card/chip line where people will test every inch of your hardware and drivers.

If Intel can't even get their basic drivers in line, what do you think will happen when they have to support even more?
 