GeForce 8950GX2 (D8E) on ABIT IP35 Pro - PCI-E 2.0 Concerns

AuDioFreaK39

Senior member
Aug 7, 2006
As most rumors have it, Nvidia's new D8E card is expected to launch this January as two G92 chips sandwiched together (basically the next-gen 7950GX2).

http://www.fudzilla.com/index....view&id=4299&Itemid=34

Since this card will essentially be SLI on a single card, and the G92 chip supports PCI-Express 2.0, would it make any performance difference to have an X38 motherboard (PCI-E 2.0) for this card? Right now I have an ABIT IP35-Pro (PCI-E 1.0a?), and I just bought an eVGA 8800Ultra a week ago for Crysis, so I can do a Step-Up to the new card in January. Anyway, please let me know your opinions. I really don't want to find out too late that I don't have enough bandwidth for this new card and that my 8800Ultra would have performed just as well. Thanks in advance.
 

nitromullet

Diamond Member
Jan 7, 2004
I'm pretty sure that you'll be fine with the IP35 Pro, as I don't think PCIe 2.0 has any real-world benefits over PCIe 1.1 yet. You and I are both going to be curious to see how this D8E dual-G92 card performs against a single GTX or Ultra, especially at higher resolutions (1920x1200 and up) with some AA thrown into the mix.
 

AuDioFreaK39

Senior member
Aug 7, 2006
Originally posted by: lehpron (evga.com forums)
Since the last 7950GX2 was made from dual high-end laptop GPUs, not desktop G71 cores from the 7900GTX, I feel the dual-G92 video card will be made from two of the recently launched GeForce 8800M GTX GPUs, each slapped on its own PCB, like the 7950GX2. The specs for the 8800M are exactly the same as the old G80 GTS: 96 SPs at 500/1350/1600 MHz, but using less power because it's based on G92. Google benches of a stock 8800GTS (G80, revision 1) in SLI to get a good idea of how the dual-GPU card compares against your Ultra. I'll estimate, based on SPs and core clocks, that it would be at least 15% faster than your Ultra at 650.

^ Any opinions on this?
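For what it's worth, lehpron's 15% figure can be sanity-checked from the quoted specs. This is a rough sketch only: the Ultra's 1512 MHz shader clock and the SLI-scaling efficiencies below are my own assumptions, not from the quote, and it ignores memory bandwidth, ROPs, and driver overhead entirely.

```python
# Back-of-envelope shader throughput: dual 8800M GTX (96 SPs @ 1350 MHz each,
# per the quote) vs. a single 8800 Ultra (128 SPs @ 1512 MHz shader clock --
# my assumption).

def shader_throughput(sps, shader_mhz):
    # Arbitrary units: shader processor count times shader clock.
    return sps * shader_mhz

ultra = shader_throughput(128, 1512)        # 193,536
single_8800m = shader_throughput(96, 1350)  # 129,600

# A second GPU never scales 100%, so try a few assumed SLI efficiencies.
for scaling in (0.7, 0.8, 0.9):
    dual = single_8800m * (1 + scaling)
    print(f"SLI scaling {scaling:.0%}: {dual / ultra - 1:+.0%} vs. Ultra")
```

At ~70-80% SLI scaling the raw numbers land around +14% to +21%, so "at least 15% faster" is plausible if the drivers scale reasonably well.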
 

taltamir

Lifer
Mar 21, 2004
PCIe 2.0 is good if the cards have to communicate with each other through the chipset... with two cards sandwiched together, they communicate with each other directly... so PCIe 2.0 should have no benefit for them (except in some extreme cases you will probably not hit in a game).
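For reference, a rough sketch of the bandwidth gap being argued about here, assuming a full x16 slot and the 8b/10b line encoding both generations use (10 bits on the wire per data byte):

```python
# Usable per-direction bandwidth of a x16 PCIe slot.
# PCIe 1.x and 2.0 both use 8b/10b encoding, so divide by 10 bits per byte.

def x16_bandwidth_gb_s(transfer_rate_gt_per_s, lanes=16):
    # GT/s per lane * lanes / 10 = GB/s per direction.
    return transfer_rate_gt_per_s * lanes / 10

gen1 = x16_bandwidth_gb_s(2.5)  # PCIe 1.x: 2.5 GT/s per lane
gen2 = x16_bandwidth_gb_s(5.0)  # PCIe 2.0: 5.0 GT/s per lane
print(f"PCIe 1.x x16: {gen1:.0f} GB/s, PCIe 2.0 x16: {gen2:.0f} GB/s")
```

The doubling from 4 to 8 GB/s per direction only matters if a game actually saturates the 4 GB/s link, which benchmarks of this era rarely showed.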
 

ArchAngel777

Diamond Member
Dec 24, 2000
Well, if the GX2 is anything like SLI, I don't really want it. I don't like the way SLI scales in some titles. Looks like I might be waiting longer than I thought if this GX2 is supposed to be the second coming.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: AuDioFreaK39
Originally posted by: lehpron (evga.com forums)
Since the last 7950GX2 was made from dual high-end laptop GPUs, not desktop G71 cores from the 7900GTX, I feel the dual-G92 video card will be made from two of the recently launched GeForce 8800M GTX GPUs, each slapped on its own PCB, like the 7950GX2. The specs for the 8800M are exactly the same as the old G80 GTS: 96 SPs at 500/1350/1600 MHz, but using less power because it's based on G92. Google benches of a stock 8800GTS (G80, revision 1) in SLI to get a good idea of how the dual-GPU card compares against your Ultra. I'll estimate, based on SPs and core clocks, that it would be at least 15% faster than your Ultra at 650.

^ Any opinions on this?

Yeah, here is mine. What a fricken joke. If I had to choose between GTS 640MB SLI and a single Ultra... gimme the Ultra, hands down. 15% is negligible, IMO, especially when some titles could be 50% slower than the Ultra. Then you have to worry about certain bugs and all that junk... Not for me.
 

taltamir

Lifer
Mar 21, 2004
Don't forget the driver lag... Microsoft promises a refresh to Vista... how many years till SLI works on that?
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: AuDioFreaK39
Originally posted by: lehpron (evga.com forums)
Since the last 7950GX2 was made from dual high-end laptop GPUs, not desktop G71 cores from the 7900GTX, I feel the dual-G92 video card will be made from two of the recently launched GeForce 8800M GTX GPUs, each slapped on its own PCB, like the 7950GX2. The specs for the 8800M are exactly the same as the old G80 GTS: 96 SPs at 500/1350/1600 MHz, but using less power because it's based on G92. Google benches of a stock 8800GTS (G80, revision 1) in SLI to get a good idea of how the dual-GPU card compares against your Ultra. I'll estimate, based on SPs and core clocks, that it would be at least 15% faster than your Ultra at 650.

^ Any opinions on this?

I imagine that the new D8E will be more like dual 8800GTs. It doesn't make any sense to make it anything less, IMO. It will have to beat the "Radeon 3870 X2", which might be a bit of a challenge if they use two 8800M chips.
 

Dkcode

Senior member
May 1, 2005
Hmmm. Doesn't appear to be a great step up from the current crop of GTX/Ultra cards.

How well did the 7950GX2 fare over a single 7900GTX?

I will wait for further information and numbers before I pass further judgement.
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: Dkcode
Hmmm. Doesn't appear to be a great step up from the current crop of GTX/Ultra cards.

How well did the 7950GX2 fare over a single 7900GTX?

I will wait for further information and numbers before I pass further judgement.

At higher resolutions with AA, the 7950GX2 was generally significantly faster than a 7900GTX.

http://www.anandtech.com/showdoc.aspx?i=2769&p=5

That makes sense though because they were both based on G7x cores, whereas the "8800GX2" will be based on the G92 and the 8800GTX is G80.
 

Keysplayr

Elite Member
Jan 16, 2003
The 8800M is a G92-based 96-shader GPU, so depending on the clock speed, an 8800M is probably in line with an 8800GTS. Nobody said that the 8800GX2 will use the 8800M GPU, although that wouldn't be such a bad thing. And I don't think the 8800GX2 will be two PCBs sandwiched together as the 7950GX2 was; I think it will be most similar to the R680 in design. The real question is, will this design bring about improvements to SLI where all of the onboard memory can be utilized, not just half? It sounds like it will be a rather expensive card any way you look at it.
 

terentenet

Senior member
Nov 8, 2005
Originally posted by: nitromullet
Originally posted by: Dkcode
Hmmm. Doesn't appear to be a great step up from the current crop of GTX/Ultra cards.

How well did the 7950GX2 fare over a single 7900GTX?

I will wait for further information and numbers before I pass further judgement.

At higher resolutions with AA, the 7950GX2 was generally significantly faster than a 7900GTX.

http://www.anandtech.com/showdoc.aspx?i=2769&p=5

That makes sense though because they were both based on G7x cores, whereas the "8800GX2" will be based on the G92 and the 8800GTX is G80.

That's BS. I've had 7900GTX SLI and I've had 7950GX2 SLI. Single card versus single card, the GX2 might win. But what about the scenarios where you really need GPU power, where GTX SLI can't handle the situation?
My solution: I went for the GX2. A single GX2 won't do; it's basically two GTs slapped together, forced into Split Frame Rendering mode. Hands down, GTX SLI is way faster than a single GX2.
So I went for two GX2s and ran Quad SLI.
The drivers were so bad that I was getting worse performance from Quad SLI than from a single GX2 card. That bad. I went back to 7900GTX SLI the next day.

Sooooo... if we care about performance, what we really want is a 9800GTX, as a single-card solution, 1.5-2 times faster than an 8800GTX, not this mumbo-jumbo GX2 sh*t all over again.
Sure, they can build it for whoever wants a single card with two GPUs for use in P35 and X38 systems, but I'm one of the ones who want 790i and triple-SLI 9800GTXs. That should demolish Crysis at 2560x1600, all very high with some AA.

EDIT: I must note, the main drawback of the 7950GX2 was that you could not alter the way the two cores rendered the image. They were stuck in Split Frame Rendering, and there was no SLI profile you could alter to change that.
With two separate cards, you can specify how you want the job done: Split Frame Rendering (SFR), Alternate Frame Rendering 1 (AFR1), or Alternate Frame Rendering 2 (AFR2).
Depending on the title and game engine, one mode works better than the other two.
If they build 8950GX2 cards and let us have SLI profiles for a single card, that would be great, and the card might actually be good for Quad SLI. I think it's a driver thing, and by the way the driver departments are moving, it's not going to happen.
 

nullpointerus

Golden Member
Apr 17, 2003
Originally posted by: MarcVenice
Wait, since when does p35 support SLI ?

Actually, one of the cool things about the 7950GX2 is that it had a built-in PCI-E bridge chip so that both GPUs could be used in a motherboard that had no SLi support. I assume the 8950GX2 would have a similar bridge.

EDIT 1: Compatibility between the 7950GX2 and non-SLi motherboards was decent but still not 100%, because some non-SLi motherboards failed to implement the part of the PCI-E spec that those bridge chips require.

EDIT 2: Is "bridge" the correct term? Maybe it was called a "switch."
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: terentenet
Originally posted by: nitromullet
Originally posted by: Dkcode
Hmmm. Doesn't appear to be a great step up from the current crop of GTX/Ultra cards.

How well did the 7950GX2 fair over a single 7900GTX?

I will wait for further information and numbers before i pass further judgement.

At higher resolutions with AA, the 7950GX2 was generally significantly faster than a 7900GTX.

http://www.anandtech.com/showdoc.aspx?i=2769&p=5

That makes sense though because they were both based on G7x cores, whereas the "8800GX2" will be based on the G92 and the 8800GTX is G80.

That's BS. I've had 7900GTX SLI and I've had 7950GX2 SLI. Single card versus single card, the GX2 might win. But what about the scenarios where you really need GPU power, where GTX SLI can't handle the situation?
My solution: I went for the GX2. A single GX2 won't do; it's basically two GTs slapped together, forced into Split Frame Rendering mode. Hands down, GTX SLI is way faster than a single GX2.
So I went for two GX2s and ran Quad SLI.
The drivers were so bad that I was getting worse performance from Quad SLI than from a single GX2 card. That bad. I went back to 7900GTX SLI the next day.

Sooooo... if we care about performance, what we really want is a 9800GTX, as a single-card solution, 1.5-2 times faster than an 8800GTX, not this mumbo-jumbo GX2 sh*t all over again.
Sure, they can build it for whoever wants a single card with two GPUs for use in P35 and X38 systems, but I'm one of the ones who want 790i and triple-SLI 9800GTXs. That should demolish Crysis at 2560x1600, all very high with some AA.

EDIT: I must note, the main drawback of the 7950GX2 was that you could not alter the way the two cores rendered the image. They were stuck in Split Frame Rendering, and there was no SLI profile you could alter to change that.
With two separate cards, you can specify how you want the job done: Split Frame Rendering (SFR), Alternate Frame Rendering 1 (AFR1), or Alternate Frame Rendering 2 (AFR2).
Depending on the title and game engine, one mode works better than the other two.
If they build 8950GX2 cards and let us have SLI profiles for a single card, that would be great, and the card might actually be good for Quad SLI. I think it's a driver thing, and by the way the driver departments are moving, it's not going to happen.

What's BS? I've owned a 7900GTX, 7900GTX SLI, and a 7950GX2 as well... And yes, 7900GTX SLI is faster than the 7950GX2 (it should be), but a single GX2 was faster than a single GTX. I'm in agreement with you that what we really need is a 9800GTX though.
 

AuDioFreaK39

Senior member
Aug 7, 2006
So according to you guys, I wouldn't really have a reason to get the 8 series GX2 in January? My 8800Ultra is now OC'd to 660/2340 by the way. I just want to have the best performance possible in Crysis before the 9800 series is launched.