Anyone else excited over the prospect of flip chip GPU designs?

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Well, according to AT (http://www.anandtech.com/video/showdoc.aspx?i=2570), nVidia is working on a flip-chip design and a motherboard that will incorporate the GPU. Their vision is to make the GPU the center of the system and the CPU a complementary component - just how it should be. If done correctly, this could be awesome for consumers and PC gaming as a whole. The many people who can't afford $200-$250 midrange cards would be in a position to shell out, say, $75-$100 for a GPU core and be done with their upgrade. This would raise the bar on the minimum requirements for PC games, since the lowest common denominator would be pretty close in features and performance to the high end at a fraction of the cost. I really hope nVidia can pull this off; if they do, it will be one of the biggest breakthroughs for this industry in a long, long time.
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
Of course, ATI is working on something similar. Don't get your hopes up for a GPU core that can be switched out like a CPU anytime soon.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
While it may not happen overnight for high-end cards, it is certainly feasible for midrange and lower-end cards, which typically require less sophisticated memory layouts.
 

DrZoidberg

Member
Jul 10, 2005
171
0
0
I don't think putting a GPU socket on the motherboard is a good idea, because today's video cards require video memory at speeds greater than current RAM DIMMs can deliver.
I read that GPU RAM like GDDR3 is only able to run at 1200MHz-1500MHz because it is soldered onto the video card's board rather than being detachable like system RAM.

However, if nVidia or ATI were able to come up with a universal PCB, that would be cool: board makers like eVGA, Winfast, etc. could just reuse the same PCB from a 6800 and drop in a 7800 or 8800 GPU. They wouldn't need to research and design a new PCB for each new GPU core, which would save costs and mean cheaper video cards for consumers.

 

Snakexor

Golden Member
Feb 23, 2005
1,316
16
81
Why make it sound like nVidia is the only one doing this? There is a thread on the front page of the video forum pointing to ATI doing it as well.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Snakexor
Why make it sound like nVidia is the only one doing this? There is a thread on the front page of the video forum pointing to ATI doing it as well.


From the AT article about the ATi flipchip:

However, your ATI video card will most likely come with a socket flip chip from ATI. Unlike the NVIDIA socket prototype we commented about yesterday, the R580 socket is geared specifically for a PCIe graphics adaptor, rather than a motherboard-housed GPU socket.

 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I've got a feeling that the socketed GPU will be coupled with some slow motherboard RAM, which will heavily cripple performance, unless nVidia/ATI integrates the video RAM onto the chip itself, which is possible but unlikely.
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
I doubt anything becomes of this... even Intel and AMD change their sockets constantly. Every new video card I've owned has had double the memory of the previous one. Too many things change every year with GPUs, and this presents their board partners with a real nightmare... allowing the user to mess with the GPU and hoping they get the cooler on right. Not to mention passing the socket expense on to the board companies.

BTW, all video card companies already have a handful of socketed boards that they use for chip testing during bring-up.
 

stnicralisk

Golden Member
Jan 18, 2004
1,705
1
0
It will be good for mid and low-end performance. It could make gaming more mainstream on PCs, as it would cut out some of the throwaway costs for these companies. The motherboard could come with RAM of a specific speed dedicated to the GPU; even a 256MB stick of DDR2 running at 500MHz (pretty standard) wouldn't cripple a more common card such as an X600 or 6600.
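As a rough sanity check on that claim (a sketch only - the clocks and bus widths below are typical 2005-era figures assumed for illustration, not numbers from this thread, and `bandwidth_gb_s` is just an illustrative helper):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# All figures below are assumed, typical-for-2005 values.

def bandwidth_gb_s(bus_bits: int, mt_per_s: float) -> float:
    """Peak bandwidth in GB/s for a bus of bus_bits at mt_per_s megatransfers/sec."""
    return (bus_bits / 8) * mt_per_s * 1e6 / 1e9

print(bandwidth_gb_s(64, 500))   # one 64-bit DDR2-500 DIMM:         4.0 GB/s
print(bandwidth_gb_s(128, 500))  # dual-channel DDR2-500:            8.0 GB/s
print(bandwidth_gb_s(128, 550))  # 128-bit bus on a 6600-class card: ~8.8 GB/s
```

On those assumed numbers, a single DIMM would roughly halve an X600/6600's bandwidth, but a dual-channel setup lands in the same ballpark.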
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
Originally posted by: jasonja
I doubt anything becomes of this... even Intel and AMD change their sockets constantly. Every new video card I've owned has had double the memory of the previous one. Too many things change every year with GPUs, and this presents their board partners with a real nightmare... allowing the user to mess with the GPU and hoping they get the cooler on right. Not to mention passing the socket expense on to the board companies.

BTW, all video card companies already have a handful of socketed boards that they use for chip testing during bring-up.
For once, I completely agree with you. The idea might seem OK at first, but think of all the changes that have to be made each generation (an outdated socket) and the RAM having to stay the same between generations... that doesn't happen often enough to make this feasible. Plus, you increase the number of RAM + GPU combinations that already plague the market and make it confusing for the consumer. The only decent use of this would be some sort of internal HyperMemory/TurboCache card that could plug directly into the mobo and replace bottom-end graphics with something that could run at least a few games. Other than that, keep RAM + GPU together on one board.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
a) Flip chip doesn't have much to do with whether the chip can be socketed or not. Flip chip really refers to how the die is mounted: face-down on top of the substrate, connected by solder bumps rather than wire bonds. Graphics chips are already made this way. You could have socketed regular PGA chips or socketed DIP chips or whatever.

b) Sounds lame. As was pointed out in the comments and the ATI blurb, graphics parts are only capable of using RAM as fast as they do because the RAM is soldered directly to the board. Even if you could upgrade the graphics chip itself, you'd still be stuck with the same RAM type as before. While the 2-2-2 DDR400 Winbond RAM you bought for your nForce2-based Athlon XP system back in 2002 still works just fine, I doubt you'd get anywhere near that longevity with a graphics part. So you'd probably have to upgrade your RAM to get any increase in performance. And because you won't be able to get the same speed out of the RAM, you'd have to make it even more parallel. Each channel of system RAM is only 64 bits, IIRC, so even dual channel only gets you 128 bits - what a midrange graphics card already has - and high-end parts run 256-bit buses, so you'd need four channels just to match the bus width, let alone the clock speed. That increases motherboard complexity and cost to incredible levels. No, a far more cost-effective solution is discrete graphics.
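To put numbers on that parallelism argument (again a sketch: the DIMM speeds and the 48 GB/s target are assumed, typical 2005-era figures, and `channels_needed` is an illustrative helper, not anything from the article):

```python
from math import ceil

# How many 64-bit DIMM channels would a socketed GPU need to match the
# bandwidth of soldered graphics RAM? All figures are assumed examples.

def channels_needed(target_gb_s: float, dimm_mt_s: float, channel_bits: int = 64) -> int:
    """Channels required to reach target_gb_s with DIMMs at dimm_mt_s megatransfers/sec."""
    per_channel_gb_s = (channel_bits / 8) * dimm_mt_s * 1e6 / 1e9
    return ceil(target_gb_s / per_channel_gb_s)

# High-end card: 256-bit GDDR3 at 1500 MT/s effective -> 48 GB/s target.
target = (256 / 8) * 1500e6 / 1e9

print(channels_needed(target, 400))  # DDR400 DIMMs:   15 channels
print(channels_needed(target, 667))  # DDR2-667 DIMMs:  9 channels
```

Even with fast DDR2, routing that many channels is exactly the motherboard complexity and cost problem being described.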
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
I will only like this idea if it makes graphics solutions cheaper. To me it sounds like another way for them to charge us more, since complex memory subsystems would now need to be on the motherboard.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
I like the idea, but I think it should be coupled with a dual-slot solution for GDDR3/4 sticks so you can decide on the amount of memory it has. Two small but efficient memory slots next to it could give it high performance and replace current IGP designs.

This isn't just a case of flip-chip GPUs, but of actually having a flip-chip GPU on the motherboard itself.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: ZobarStyl
Originally posted by: jasonja
I doubt anything becomes of this... even Intel and AMD change their sockets constantly. Every new video card I've owned has had double the memory of the previous one. Too many things change every year with GPUs, and this presents their board partners with a real nightmare... allowing the user to mess with the GPU and hoping they get the cooler on right. Not to mention passing the socket expense on to the board companies.

BTW, all video card companies already have a handful of socketed boards that they use for chip testing during bring-up.
For once, I completely agree with you. The idea might seem OK at first, but think of all the changes that have to be made each generation (an outdated socket) and the RAM having to stay the same between generations... that doesn't happen often enough to make this feasible. Plus, you increase the number of RAM + GPU combinations that already plague the market and make it confusing for the consumer. The only decent use of this would be some sort of internal HyperMemory/TurboCache card that could plug directly into the mobo and replace bottom-end graphics with something that could run at least a few games. Other than that, keep RAM + GPU together on one board.

But what is the point of that? If you need to upgrade your integrated graphics, you buy a video card. Something like 90% of those systems will go to businesses that will never really need 3D or play games on them, so it's just adding complexity and cost for a feature few if any will ever use in the real world.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Well, I was speaking of when NV had done it, not who was first. I think we all remember when the 5600U came out and was disappointing, and then the flip-chip version came out and was a lot better.

I guess the 5800U was, too.
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
Might be useful for home systems to upgrade from a poor onboard graphics chip to a very slightly improved one with extra little optimisations like HDTV decoding or something.

(I'm thinking along the lines of people turning their current desktop PCs with onboard graphics into a movie player or TV system. Improvements like decoding abilities that load the GPU rather than the CPU may be useful in that case - presuming this style of GPU is cheaper than normal expansion-card designs.)
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Making the GPU the central component of a system? Maybe in a game console, but I happen to use my system for a lot more than gaming, and such a move would be 1) unnecessary, and 2) possibly trade off CPU performance for the GPU, which is the last thing I need.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Munky
Making the GPU the central component of a system? Maybe in a game console, but I happen to use my system for a lot more than gaming, and such a move would be 1) unnecessary, and 2) possibly trade off CPU performance for the GPU, which is the last thing I need.

I like ATI's implementation much better... on the PCIe card... seems more flexible than on the MB.
 

jasonja

Golden Member
Feb 22, 2001
1,864
0
0
Originally posted by: ZobarStyl
Originally posted by: jasonja
I doubt anything becomes of this.... even Intel and AMD change their sockets constantly. Every new video card I've owned has had double the memory of the previous one. Too many things change every year with GPU's and this presents their board partners with a real nightmare... allowing the user to mess with the GPU and hope they get the cooler on right. Not to mention passing on the socket expense to the board companies.

BTW, All video card companies already have a handful of socketed boards that the use for chip testing during chip bringups.
For once, I completely agree with you. The idea might seem ok at first, but when you think of all the changes you have to make each generation (outdated socket) and then the RAM having to be the same between gens...that doesn't happen enough to make this feasible. Plus, you increase all of the combinations of RAM + GPU that already plague the market and make it confusing for the consumer. The only decent use of this would be some sort of internal HyperMemory/TurboCache card that could directly plug into the mobo and replace bottom end graphics with something that could run at least a few games. Other than that, keep the functionality of RAM + GPU on one chip.


I don't see the real low-end space ever using sockets for this either, because the sockets would cost almost as much as the low-end GPU ASIC they'd put in them. PC builders will do almost anything to save a few nickels here and there... solder will always be cheaper, and therefore they will probably use it. Let's not forget that PC builders don't really want you upgrading... they'd much rather you just bought a new computer (hopefully from them) in a year or two.
 

Cheesetogo

Diamond Member
Jan 26, 2005
3,824
10
81
Originally posted by: lifeguard1999
Originally posted by: apoppin
I like ATI's implementation much better... on the PCIe card... seems more flexible than on the MB.

Agreed.

I think ATi's solution is better too - I don't like the idea of the graphics card being part of the motherboard.
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Originally posted by: Cheesetogo

I think ATi's solution is better too - I don't like the idea of the graphics card being part of the motherboard.

Either way, it's prolly more of a feature geared to each company's customers rather than end-users. I recall Rendition worked on such a thing, but it was cancelled some time after being acquired by Micron. Or sumfing.

 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: ElFenix
a) Flip chip doesn't have much to do with whether the chip can be socketed or not. Flip chip really refers to how the die is mounted: face-down on top of the substrate, connected by solder bumps rather than wire bonds. Graphics chips are already made this way. You could have socketed regular PGA chips or socketed DIP chips or whatever.
This is a very important point. The writer of the original AnandTech article seems not to know his terminology. Video cards are already flip chip and have been for a long time.