What if Intel sells its future desktop microprocessors ONLY with an IGP included?

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,226
126
Just thinking about how Intel could conceivably corner the market on graphics. What if they ONLY released their newer CPUs with an IGP integrated, thus cutting AMD/ATI and NV out of the market totally?

Of course, there would be a "Larrabee port" on the CPU to support a Larrabee co-processor, but nothing equivalent for AMD/ATI or NV, of course.

Intel has already split the CPU socket into high-end and low-end lines and chosen a proprietary system interface for both, which locks out competitors. What is to say they won't finally do the same with graphics?

 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
How about removing the improperly used apostrophe from your title?

;)

Sort of a related question - since Larrabee is supposed to be x86, could you potentially use one of those GPU cards to accelerate the whole system? In other words, the GPU functions as another CPU when you're not pushing heavy graphics through it. Sound possible?
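Something like this, if the OS ever saw those Larrabee cores as plain logical processors. A minimal Python sketch of the idea, assuming (hypothetically) nothing Larrabee-specific at all, just ordinary code spreading work over however many x86 cores the OS reports:

    # Minimal sketch of Denithor's idea, assuming (hypothetically) that Larrabee's
    # x86 cores showed up to the OS as ordinary logical processors. Nothing below is
    # Larrabee-specific; it just spreads CPU-bound work over every core the OS reports,
    # so any extra x86 cores would get used automatically.
    import multiprocessing as mp

    def crunch(n):
        # stand-in for any CPU-bound work (physics, encoding, AI, etc.)
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        cores = mp.cpu_count()  # would simply report a bigger number with extra cores online
        with mp.Pool(cores) as pool:
            pool.map(crunch, [2_000_000] * cores)
        print(f"finished on {cores} workers")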
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
ATI doesn't produce Intel chipsets anymore, so it is pretty much just Nvidia that will suffer. I think the Nvidia chipset division could easily die out. They have already lost the AMD market to superior offerings from AMD, and if Intel cuts them off then they won't have anywhere to go, except maybe ARM.
 

aigomorla

CPU, Cases & Cooling Mod, PC Gaming Mod, Elite Member
Super Moderator
Sep 28, 2005
21,126
3,653
126
Originally posted by: VirtualLarry
Just thinking about how Intel could conceivably corner the market on graphics. What if they ONLY released their newer CPUs with an IGP integrated, thus cutting AMD/ATI and NV out of the market totally?

No...

People like me will disable the IGP in the BIOS and use a dedicated video controller regardless.

Because I highly doubt something that small can equal the processing power of a dedicated video card.

I bet the IGP will be okay... probably on par with a mid-level video card, even...
But there are people here who like to play games at resolutions larger than 1680x1050 with all settings maxed.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Integrating the IGP has nothing to do with removing PCIe. It may cut third-party integrated graphics out of the market, but Intel has pretty much already done that. Nvidia won't be making Nehalem chipsets at all, so it's not like they "lost" their shot at the IGP, since they never had a shot at a chipset. If you want a lower-power GeForce with your Westmere, there is no problem getting a PCIe 9400 GS, or a Radeon 4350 to do the same job. Of course, it's silly to do that since you have a GMA5500 already hooked up.

IGPs were never meant as replacements for the discrete super GPUs, and moving them from the northbridge to the CPU doesn't change this. Even the tiny Larrabee controllers that'll be integrated into Haswell won't come near the performance of a full-sized Larrabee or Radeon. Those chips will be ~4 billion transistors by that time, with their own power line, their own memory bus, their own card, just like the fast cards have today.
 

Cogman

Lifer
Sep 19, 2000
10,286
147
106
Originally posted by: aigomorla
Originally posted by: VirtualLarry
Just thinking about how Intel could conceivably corner the market on graphics. What if they ONLY released their newer CPUs with an IGP integrated, thus cutting AMD/ATI and NV out of the market totally?

No...

People like me will disable the IGP in the BIOS and use a dedicated video controller regardless.

Because I highly doubt something that small can equal the processing power of a dedicated video card.

I bet the IGP will be okay... probably on par with a mid-level video card, even...
But there are people here who like to play games at resolutions larger than 1680x1050 with all settings maxed.

I agree. I doubt that it will have the performance of a dedicated card. It just doesn't have enough die space (and the TDP alone is going to be higher than anyone wants it to be).

This looks to me to be targeted specifically at the mobile space and the low-end desktop space, not high-end desktops.
 

imported_Lothar

Diamond Member
Aug 10, 2006
4,559
1
0
Originally posted by: drizek
ATI doesn't produce Intel chipsets anymore, so it is pretty much just Nvidia that will suffer. I think the Nvidia chipset division could easily die out. They have already lost the AMD market to superior offerings from AMD, and if Intel cuts them off then they won't have anywhere to go, except maybe ARM.

They're pretty much already dead.
Intel still hasn't given them the Nehalem license.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Originally posted by: Lothar
Originally posted by: drizek
ATI doesn't produce Intel chipsets anymore, so it is pretty much just Nvidia that will suffer. I think the Nvidia chipset division could easily die out. They have already lost the AMD market to superior offerings from AMD, and if Intel cuts them off then they won't have anywhere to go, except maybe ARM.

They're pretty much already dead.
Intel still hasn't given them the Nehalem license.

Even if Nvidia makes one, they'll still lose out in some way with Nehalem, because:

1) A big loss in memory bandwidth and performance due to having to communicate through the DMI link, which is only 2 GB/s (rough numbers sketched after this list)

OR

2) Power consumption and cost disadvantages of creating another memory controller, and/or having to add a much more robust TurboCache to compensate for the lack of memory bandwidth through DMI
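
To put point 1 in perspective, here's a quick back-of-the-envelope in Python. The 2 GB/s DMI figure is from above; the 1680x1050 scanout case and the comparison card (a generic 64-bit DDR2-800 board) are just illustrative assumptions, not specs for any particular product:

    # Rough bandwidth comparison for point 1 above.
    # Assumptions: DMI at 2 GB/s (as stated), a 1680x1050 32-bit desktop refreshed at
    # 60 Hz, and a hypothetical low-end discrete card with a 64-bit DDR2-800 memory bus.

    dmi_bw = 2e9                        # DMI link, bytes/s

    # Bandwidth consumed just scanning the framebuffer out to the display
    scanout = 1680 * 1050 * 4 * 60      # ~0.42 GB/s before any rendering happens

    # Local memory bandwidth of the hypothetical 64-bit DDR2-800 card
    discrete_bw = (64 // 8) * 800e6     # 6.4 GB/s

    print(f"DMI link:              {dmi_bw / 1e9:.1f} GB/s")
    print(f"Scanout alone:         {scanout / 1e9:.2f} GB/s ({scanout / dmi_bw:.0%} of DMI)")
    print(f"64-bit DDR2-800 card:  {discrete_bw / 1e9:.1f} GB/s")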
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Yes, but their 9400 chipsets for C2D have been very successful in midrange and high-end systems. Virtually all Macs ship with them now.
 

ilkhan

Golden Member
Jul 21, 2006
1,117
1
0
As has been mentioned, adding the onboard GPU doesn't disable the ability to add a discrete GPU. Why would Intel even WANT to remove the ability to add a discrete GPU? They make money off the CPUs, and Larrabee is going to work over a standard PCIe link anyway.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: VirtualLarry
Just thinking about how Intel could conceivably corner the market on graphics. What if they ONLY released their newer CPUs with an IGP integrated, thus cutting AMD/ATI and NV out of the market totally?

OEM computers have been like this for more than a decade. Intel is already the top seller of graphics chips, and HP and Compaq computers will black-screen crash if you plug in a video card. An eMachines computer I bought didn't even have an AGP slot. It's such a rarity to be able to add your own video card that "open AGP" or "open PCI Express" is advertised as a feature. To me that sounds as stupid as advertising a car that includes free doors, but that's the way OEM computers work.

AMD seems more aggressive with this approach. Even if you build your own AMD system, it will probably have integrated graphics.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
I like the integrated graphics on my AMD board. It gives me SurroundView if I ever need to hook up 4 monitors simultaneously (probably not), but also the ability to move my video card to a new computer without rendering this one useless.
 

Ika

Lifer
Mar 22, 2006
14,264
3
81
Originally posted by: Lothar
Originally posted by: jordanclock
I think that would cause a whole mess of anti-trust violations, so it's a very moot point.

Why would that be an anti-trust violation?

Seems just like the EU filing anti-trust complaints against Microsoft for bundling IE8.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: drizek
Yes, but their 9400 chipsets for C2D have been very successful in midrange and high-end systems. Virtually all Macs ship with them now.

I thought Apple was dumping them due to graphics chip failures.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Of course they will do this eventually, except that the "IGP" portion will be the minimal amount of hardware needed on-die to support display functions, with all of the 3D rendering done by a massive Larrabee-based vector unit, either exposed as another core or as an extension available to all x86 cores (i.e. SSE6 or something similar).
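If it did land as an ISA extension, software would probably pick it up the same way SSE levels get picked up today: probe the CPU's feature flags and dispatch. A small sketch of that pattern - "sse6" here is just aka1nas's hypothetical name, and the flag check is Linux-specific:

    # Sketch of the runtime-dispatch pattern an "SSE6"-style extension would slot into.
    # Reads the CPU feature flags from /proc/cpuinfo (Linux only); "sse6" is hypothetical,
    # the other flag name is one that actually appears today.

    def cpu_flags():
        try:
            with open("/proc/cpuinfo") as f:
                for line in f:
                    if line.startswith("flags"):
                        return set(line.split(":", 1)[1].split())
        except OSError:
            pass
        return set()

    flags = cpu_flags()
    if "sse6" in flags:        # hypothetical Larrabee-style wide vector extension
        print("dispatch: Larrabee vector path")
    elif "sse4_1" in flags:    # real, present on current Intel CPUs
        print("dispatch: SSE4.1 path")
    else:
        print("dispatch: scalar fallback")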
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
I have a bit of an update that relates to this. I tried to put a GeForce 7950 GT in a friend's Compaq computer and found out it doesn't fit. The computer uses standard-size mounting brackets and it has a standard-size gigabit ethernet card in it, but I cannot put a standard-size video card in it. The PCIe slot is positioned in such a way that the video card does not fit into it. I'm not even making this shit up.

I tried to make a video of it but it doesn't show too much. link.
You can also check Google to see who else has bitched about this exact problem: "Compaq video card doesn't fit".

So there you have it. Intel already has a complete stranglehold on shitty OEM computers. Intel graphics or nothing. GeForce 7950GT simply does not fit.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: alyarb
8x slot?

I thought so too, since I had that problem with AGP slots, but it's actually a 16x slot. The slot is too close to the edge of the case, so the mounting bracket keeps the card tilted at an angle. If I try to put the card in straight (like every other computer I have), then the pins don't line up.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Heh, looks like you'll have to take the I/O shield off that 7950. Can't blame Intel though. I've had OEM PCs with no expansion slots at all. My sister's Dimension 4600 comes to mind (a single 32-bit PCI slot with no "opening" for I/O).
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: alyarb
Heh, looks like you'll have to take the I/O shield off that 7950. Can't blame Intel though. I've had OEM PCs with no expansion slots at all. My sister's Dimension 4600 comes to mind (a single 32-bit PCI slot with no "opening" for I/O).

I don't blame Intel for Compaq's screw-ups. I'm just saying that this possibility of Intel having total video card dominance in Intel systems is already true, but for different reasons. There's nothing in the chipset that says an AMD or Nvidia card won't work, but the cards simply do not work in most OEM computers. My friend's Compaq has the slot design problem, your sister's computer has the same basic problem, and my old eMachines had a 120W PSU and would black-screen crash when I had a Radeon 7200 PCI plugged in. This is pathetic. If you need to upgrade an OEM computer's video card because you think the Intel Extreme Graphics is too extreme, you need to buy a new case, a new PSU, and possibly a new motherboard. You basically need a whole new computer.

I know PC fans like myself tend to attack Apple computers for being throw-away toys that can't be upgraded, but a standard OEM PC is exactly the same.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I believe this will be the case in some regard. But I am sure that, like current IGPs, it will be possible to disable it in favor of a discrete graphics processor.