Was thinking of Haswell; what's keeping intel from discrete GPU now?


el-Capitan

Senior member
Apr 24, 2012
simple answer:

Vertically integrated GPU manufacturer that has their own fabs = higher margins than Nvidia or AMD can compete with.

You need to do more thinking. Just having fabs doesn't mean profit - it's not like you flick a switch and go from Pentiums to GPUs. See AMD, who had their own fabs (or did they sell them all off by now?).
 

Eureka

Diamond Member
Sep 6, 2005
On a related note, if Intel isn't using up all of their fab capacity, and they have access to better technology, why aren't AMD and Nvidia using Intel fabs, especially with troubles at GF and TSMC?
 

Cloudfire777

Golden Member
Mar 24, 2013
Nope. The difference is Intel actually owns many fabs, with more advanced tech than TSMC or GloFo.

If Intel bothered to push into the discrete GPU market, they would devastate NV and AMD in the consumer space and push NV into the HPC-only sector.

People need to give Intel credit: they are always ahead of the game when it comes to semiconductors; they simply lack the direction to head this way.

Yeah right. Since when has Intel had expertise in building graphics cards? Designs come first, fabs come second.
 

Unoid

Senior member
Dec 20, 2012
On a related note, if Intel isn't using up all of their fab capacity, and they have access to better technology, why aren't AMD and Nvidia using Intel fabs, especially with troubles at GF and TSMC?

Intel is already opening their fabs to third parties. They're in the early stages of ramping it up, but it's happening. I kind of doubt Intel will produce AMD or Nvidia chips.
 

blackened23

Diamond Member
Jul 26, 2011
simple answer:

Vertically integrated GPU manufacturer that has their own fabs = higher margins than Nvidia or AMD can compete with.

I mean no offense by this, but you're nuts if you think Intel wants to enter a sharply declining market. Discrete desktop has declined several years in a row now because of mobile. Mobile is where consumers are, and Intel goes where consumers are. Yes, I realize a few outlier enthusiasts spend $4000 on desktops, but they're the exception, not the norm.

Anyway, discrete requires a significant R&D outlay. Sorry, as much as it pains both of us, discrete will be dead in several years. It could be 5 or 10, but make no mistake, it's a declining market and Intel will not enter it. Mobile is where consumers are going, not desktop. You could argue enterprise, but they would still get slaughtered there as a newcomer.
 

blackened23

Diamond Member
Jul 26, 2011
Yeah right. Since when has Intel had expertise in building graphics cards? Designs come first, fabs come second.

Yeah, really. These guys act like creating a wondrous new GPU is simple just because of fabs; give me a break. R&D costs are significant, and it takes 3-4 years after design for a product to hit the market. Anyway, if fabs were the sole cause of success, there would be a million copycat GPU makers.
 

blackened23

Diamond Member
Jul 26, 2011
On a related note, if Intel isn't using up all of their fab capacity, and they have access to better technology, why aren't AMD and Nvidia using Intel fabs, especially with troubles at GF and TSMC?

The answer is obvious, isn't it? I remember an Intel exec being asked this question in an interview (I'll try to find it), and he laughed at the mere thought. The question was whether Nvidia could ever use Intel's fabs; this was after Nvidia's CEO said he wanted to use them. He laughed at the interviewer and (short version) basically stated, "We don't enable competitors; that won't happen."

Think about that for a moment. Intel will never allow it. Intel wants to make mobile chips; Nvidia makes mobile chips. Why on earth would Intel give Nvidia an advantage by letting them use a 14nm process for their ARM chips? Intel wants that for itself. In other words, Intel won't enable competitors. That's why those companies rely on TSMC and GlobalFoundries: TSMC and GlobalFoundries don't design chips, they're just fabs, so there is no conflict of interest. Intel doing the same would be a huge conflict of interest and would basically enable a competitor to better compete against them. It's basically shooting themselves in the foot: if Intel allowed it, Nvidia would have 14nm ARM chips on the market, and that is a market Intel is seeking to dominate.

Short version: Huge conflict of interest and enabling competitors makes it a no-go.
 
Feb 19, 2009
I think you're being a little too optimistic. Just because they have fabs with better tech doesn't mean they have designs to print in those fabs.

AMD and NV both have two decades of designs to work on, and the engineers who have that experience.

Unless Intel starts poaching engineers from AMD/NV, or has been secretly working on large GPUs in a black R&D lab, it won't be cost-effective to jump into this game.

Not to mention, just because gamers want a discrete GPU doesn't mean we'll get one. If the day comes that this market is no longer profitable, then discrete will just go the way of the dodo.

I don't see future gamers demanding less graphics; it's just not the trend. Almost every new game is expected to wow more in the graphics category, more gamers are going higher-res, and the trend is toward more pixels per inch.
 

desura

Diamond Member
Mar 22, 2013
I recall that back in the 1990s Intel did come out with a discrete graphics card to compete against the likes of the Voodoo2. The i740 or something.

They must have had a good reason not to try again. I mean, we have GPUs selling for $1k now, while the highest-end mainstream Intel processor goes for like $300, right?
 

blackened23

Diamond Member
Jul 26, 2011
I don't see future gamers demanding less graphics; it's just not the trend. Almost every new game is expected to wow more in the graphics category, more gamers are going higher-res, and the trend is toward more pixels per inch.

The problem is, there are fewer gateways into PC gaming for new customers. Existing PC gamers are well established and loyal, but new consumers aren't buying desktop PCs nearly as much these days. I still think discrete has maybe 5-10 years of life, but we'd be kidding ourselves to think it's a good long-term bet.

I mean, back in our day we started with a junk Dell computer and slowly upgraded it for gaming. Then we moved on to building our own systems. Nowadays, people don't even start with a PC as a "gateway"; they'll just get an iPad or a tablet, if that makes sense. Therefore most of these folks will never be PC gamers and hence won't be in the market for a future discrete card.
 

ghost03

Senior member
Jul 26, 2004
I still think discrete has maybe 5-10 years of life, but we'd be kidding ourselves to think it's a good long-term bet.

I see where you're coming from, but this has been said constantly since the very first 3D cards 20 years ago (which, interestingly, often still required a 2D card to be in the system!).

Yes, it's declining, but that decline will likely taper off to some constant level of demand. I wouldn't expect yearly updates with huge performance benefits long term, but I expect discrete cards to be around for some time in one form or another.
 

blackened23

Diamond Member
Jul 26, 2011
I see where you're coming from, but this has been said constantly since the very first 3D cards 20 years ago (which, interestingly, often still required a 2D card to be in the system!).

Yes, it's declining, but that decline will likely taper off to some constant level of demand. I wouldn't expect yearly updates with huge performance benefits long term, but I expect discrete cards to be around for some time in one form or another.

No, it really wasn't, and I don't agree with you. People have claimed various flavors of PC gaming dying because of consoles, but discrete was alive and kicking because desktops were alive and kicking. Discrete sales did very well because every new customer had to buy a desktop as their gateway computing device.

That just isn't the case now for most new consumers...

New customers were actually BUYING desktops 10 years ago. Now they aren't; they're buying tablets - and it's easy to see why. The typical computing experience consists of media consumption, something you hardly need a $4000 PC for. I don't agree with you at all. I do think discrete has life left, though, because PC gamers tend to be loyalists and repeat customers - but it just isn't attracting as many new customers as a gateway device.
 

Bobisuruncle54

Senior member
Oct 19, 2011
No, it really wasn't, and I don't agree with you. People have claimed various flavors of PC gaming dying because of consoles, but discrete was alive and kicking because desktops were alive and kicking. Discrete sales did very well because every new customer had to buy a desktop as their gateway computing device.

That just isn't the case now for most new consumers...

New customers were actually BUYING desktops 10 years ago. Now they aren't; they're buying tablets - and it's easy to see why. The typical computing experience consists of media consumption, something you hardly need a $4000 PC for. I don't agree with you at all. I do think discrete has life left, though, because PC gamers tend to be loyalists and repeat customers - but it just isn't attracting as many new customers as a gateway device.

Those buyers were never PC gamers. The overall PC market is declining and discrete graphics cards are no longer required for basic use, but PC gaming is growing.