First glance at 3870X2 Crossfire X results.

TroubleM

Member
Nov 28, 2005
97
0
0
It looks really good for AMD. It's now clear that the drivers will make or break this for them.

And with the next GPU coming in multi-die form (the R7x0), these results are the first indication of future products' performance. Go ATI!
 

djnsmith7

Platinum Member
Apr 13, 2004
2,612
1
0
Looks good...We'll have a better idea once the official drivers are released...
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Good thing they dropped the previous nomenclature, or else we would be seeing X1900XTX Crossfire X ... I mean, how many X do you really need?

Good stuff though, certainly better looking than the 7950GX2 fiasco
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ShadowOfMyself
Good thing they dropped the previous nomenclature, or else we would be seeing X1900XTX Crossfire X ... I mean, how many X do you really need?

Good stuff though, certainly better looking than the 7950GX2 fiasco

The 7950GX2 was not a "fiasco"- I'm typing this on a GX2 equipped computer that has given me no problems since launch.

If you're referring to Quad SLi, it's not NVIDIA's fault that Windows XP had the limitation of 3 frames rendered ahead for DX9 and they had to use AFR/SFR in combination. (OpenGL games could use 4 way AFR)
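(To see why that limit forced the AFR/SFR combination: in pure AFR each GPU renders its own frame, so the number of frames the driver may queue ahead caps how many GPUs can stay busy. A toy model of that arithmetic, not actual driver code:)

```python
def busy_gpus_pure_afr(n_gpus, max_frames_ahead):
    # In pure AFR each GPU owns one in-flight frame, so the
    # render-ahead cap limits how many GPUs do useful work.
    return min(n_gpus, max_frames_ahead)

# Under XP's DX9 cap of 3 frames rendered ahead, a 4-GPU Quad SLi
# setup could keep only 3 GPUs busy with pure AFR, hence the need
# to mix AFR with SFR.
print(busy_gpus_pure_afr(4, 3))  # -> 3
```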

Nonetheless, I was very happy with my Quad SLi until I made the transition to 8800GTX SLi, and I'm not alone in that:

http://www.firingsquad.com/har..._sli_update/page12.asp

Both ForceWare Quad SLI drivers were much more stable, although we've got to give the nod to the GeForce 7950 GX2's ForceWare 91.37 drivers; we didn't encounter a single lockup or crash with 91.37, and that's definitely a good thing.

In closing, we're encouraged by the progress NVIDIA has made with Quad SLI. With GeForce 7950 GX2 cards selling for a little over $100 more than your typical GeForce 7900 GTX, you can actually make an argument that the GeForce 7950 GX2 is a better value.

As DX9 in Vista is not constrained by the old 3 frame limit, I'm told 7950GX2 Quad offers better performance now in many games through 4 way AFR.

The situation under Vista is different, so the two can't be directly compared.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: TroubleM
It looks really good for AMD. It's now clear that the drivers will make or break this for them.

And with the next GPU coming in multi-die form (the R7x0), these results are the first indication of future products' performance. Go ATI!

I think it's a bit early to say it 'looks good'. You have two games benched, FEAR with 77% scaling and CoJ with 50% scaling... Then you have 16% scaling in 3DMark06, which has normally been a strong point for the Radeons. FEAR already runs at 117 fps at 1600x1200 with a single card in their test, so I'd argue that there isn't much need for the second one. I'm more interested in seeing the differences in more games like CoJ that actually improve the end user's experience before I start patting ATI on the back.
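(For anyone checking the math on those figures: a scaling percentage is just the multi-card frame rate relative to the single card. A quick sketch with made-up FPS numbers, not figures from the review:)

```python
def scaling_pct(single_fps, multi_fps):
    """Percent gain of a multi-GPU result over a single card."""
    return (multi_fps / single_fps - 1.0) * 100.0

# Hypothetical example: 100 fps on one card vs 177 fps on two
# would be the 77% scaling quoted for FEAR.
print(round(scaling_pct(100.0, 177.0)))  # -> 77
```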

Originally posted by: nRollo
If you're referring to Quad SLi, it's not NVIDIA's fault that Windows XP had the limitation of 3 frames rendered ahead for DX9 and they had to use AFR/SFR in combination. (OpenGL games could use 4 way AFR)

...but it is their fault for releasing a product for an OS with that limitation. I'm quite sure that by the time the 7950GX2 launched, NVIDIA was quite familiar with how DX9 functioned in WinXP, yet they chose to launch Quad-SLI anyway.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Shouldn't Nvidia have thought about Windows XP's limitation of 3 frames rendered ahead for DX9? Windows is the PC gamers' platform. Of course it's Nvidia's fault.
 

ghost recon88

Diamond Member
Oct 2, 2005
6,196
1
81
I say screw 3DMark06; the game performance increase is what's important to me. When Futuremark comes out with a more updated 3DMark, then I'll start looking at it again. I can't wait to see how these cards' performance improves with later drivers :)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
If you're referring to Quad SLi, it's not NVIDIA's fault that Windows XP had the limitation of 3 frames rendered ahead for DX9 and they had to use AFR/SFR in combination.
Perhaps not, but it is their fault for not releasing Vista drivers for Quad SLI for months. In fact they only released them when Tri-SLI arrived.

Somebody who forked over big money for a pair of GX2 cards would be pissed off.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Azn
Shouldn't Nvidia have thought about Windows XP's limitation of 3 frames rendered ahead for DX9? Windows is the PC gamers' platform. Of course it's Nvidia's fault.

If you read the benchmarks in the review I linked to, you'll notice that at 1920x1200 with 8xAA/16xAF, Quad SLi was the thing to have.

I enjoyed running my games at that setting, or 1920x1200 with 16xAA/16xAF, while using Quad SLi back then, and nothing else could have given me that gaming experience.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
If you're referring to Quad SLi, it's not NVIDIA's fault that Windows XP had the limitation of 3 frames rendered ahead for DX9 and they had to use AFR/SFR in combination.
Perhaps not, but it is their fault for not releasing Vista drivers for Quad SLI for months. In fact they only released them when Tri-SLI arrived.

Somebody who forked over big money for a pair of GX2 cards would be pissed off.

Or they might have just stuck with XP.

I guess we'll never know how many people with GX2 Quad wanted to go with Vista and had to wait, but something tells me it's likely people who sprang for the most expensive video set of Summer 2006 may well have updated to 8800s before Winter 2007.

Oftentimes the people who buy the highest end stuff keep up with the high end rather than skipping generations, and I could imagine someone who wanted what Quad had to offer buying DX10 cards before upgrading to Vista.

Just a hunch. ;)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Or they might have just stuck with XP.
Uh huh, the same people you mention below who want cutting-edge technology?

I guess we'll never know how many people with GX2 Quad wanted to go with Vista and had to wait, but something tells me it's likely people who sprang for the most expensive video set of Summer 2006 may well have updated to 8800s before Winter 2007.
If they got free cards from nVidia they might well have, but somebody who actually paid for them is unlikely to drop them so quickly just because nVidia's too lazy to support their "ultimate gaming experience".
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Rollo loves turning legitimate threads into flamefests like this...

Honestly very upsetting that this is allowed again here...to me if the thread doesn't relate to nV, he needs to stay out.

On topic, i'm not a big fan of the 3870X2, nor Crossfire X2, but i must say they've done a very decent job with the drivers, better than i woulda expected.

Hopefully we can see even more improvement over time...might make dual GPU card/solutions a semi-decent option someday.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: n7
Rollo loves turning legitimate threads into flamefest like this...

Honestly very upsetting that this is allowed again here...to me if the thread doesn't relate to nV, he needs to stay out.

On topic, i'm not a big fan of the 3870X2, nor Crossfire X2, but i must say they've done a very decent job with the drivers, better than i woulda expected.

Hopefully we can see even more improvement over time...might make dual GPU card/solutions a semi-decent option someday.


Yep I agree with everything you say. Nice gains, but I really don't want 4 gpu's on 2 cards in my case. Still this may be the future.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ronnn
Originally posted by: n7
Rollo loves turning legitimate threads into flamefest like this...

Honestly very upsetting that this is allowed again here...to me if the thread doesn't relate to nV, he needs to stay out.

On topic, i'm not a big fan of the 3870X2, nor Crossfire X2, but i must say they've done a very decent job with the drivers, better than i woulda expected.

Hopefully we can see even more improvement over time...might make dual GPU card/solutions a semi-decent option someday.


Yep I agree with everything you say. Nice gains, but I really don't want 4 gpu's on 2 cards in my case. Still this may be the future.


To each their own Ronn, but for people with large monitors gains of 50-76% over a single 3870X2 may well be worth the cost of two. (and the only way that level of performance can be achieved)

I reiterate: this bodes well for upcoming multi-GPU solutions in Vista in general.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
Originally posted by: ronnn
Originally posted by: n7
Rollo loves turning legitimate threads into flamefest like this...

Honestly very upsetting that this is allowed again here...to me if the thread doesn't relate to nV, he needs to stay out.

On topic, i'm not a big fan of the 3870X2, nor Crossfire X2, but i must say they've done a very decent job with the drivers, better than i woulda expected.

Hopefully we can see even more improvement over time...might make dual GPU card/solutions a semi-decent option someday.


Yep I agree with everything you say. Nice gains, but I really don't want 4 gpu's on 2 cards in my case. Still this may be the future.

why not? i mean if i can afford it, it's not a bad price/performance for high end graphics.
that's similar to saying "i really don't want a dual core cpu in my case, as opposed to one"

i must say, dual 3870x2 cards scale pretty well, much better than i expected to see


 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Knowing what comes from NV next is a sandwich'ed G92, I have this dreaded feeling that 8800GTX/Ultra will actually last 2, yes 2 full years as the top performing single-GPU card. Meaning that 9800GX2 will carry NV over to the fall refresh, and we'll see the real successor to G80 in October/November time frame, in G100 with 1GB of frame buffer. Absolutely pathetic state of competition, on which I can only blame AMD..

Considering that NV didn't hasten to release 9800GX2 to spoil the launch of 3870X2, it's possible that NV is working on something similar to what AMD have done with the drivers. I expect that 9800GX2 will also have all the innovation that AMD has employed in 3870X2. (Such as multi-monitor support, audio pass-through, OS recognizes it as a single card, etc.) In other words, what I'm seeing is both companies are trying to make the dual-GPU cards as transparent as possible to the OS, games, users, etc.

Still, the future support of these dual-chip cards will be, at the very least, heavily dependent on the performance of their true next-gen chips as well as the competition. Both companies have shown terribly opportunistic behavior in the past and users should NOT expect any kind of future-proofing, as always.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: lopri
Knowing what comes from NV next is a sandwich'ed G92, I have this dreaded feeling that 8800GTX/Ultra will actually last 2, yes 2 full years as the top performing single-GPU card. Meaning that 9800GX2 will carry NV over to the fall refresh, and we'll see the real successor to G80 in October/November time frame, in G100 with 1GB of frame buffer. Absolutely pathetic state of competition, on which I can only blame AMD..

Considering that NV didn't hasten to release 9800GX2 to spoil the launch of 3870X2, it's possible that NV is working on something similar to what AMD have done with the drivers. I expect that 9800GX2 will also have all the innovation that AMD has employed in 3870X2. (Such as multi-monitor support, audio pass-through, OS recognizes it as a single card, etc.) In other words, what I'm seeing is both companies are trying to make the dual-GPU cards as transparent as possible to the OS, games, users, etc.

Still, the future support of these dual-chip cards will be, at the very least, heavily dependent on the performance of their true next-gen chips as well as the competition. Both companies have shown terribly opportunistic behavior in the past and users should NOT expect any kind of future-proofing, as always.

Who cares about a single-GPU successor? People aren't buying a high end card 'because it has one GPU'; they are buying the package, and the 3870X2 is the highest performing *card* that you stick in a PCI-E slot and it just runs... The fact it has two GPUs onboard is moot considering there is no differentiation by the driver or OS and it's seen as a 'single' card solution. The G80 era has ended.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Sylvanas
Who cares about a single-GPU successor? People aren't buying a high end card 'because it has one GPU'; they are buying the package, and the 3870X2 is the highest performing *card* that you stick in a PCI-E slot and it just runs... The fact it has two GPUs onboard is moot considering there is no differentiation by the driver or OS and it's seen as a 'single' card solution. The G80 era has ended.

The problem with your theory is Crossfire doesn't scale in all games, and only allows for one method of AFR as a fallback if a game is not profiled.

So why is this a problem? If the AFR fallback doesn't work, and a game is not profiled, a 3870X2 performs worse than a 3870. The only way around that is to disable Catalyst AI, which also disables all fixes and optimizations for a game.

So the era of high end single GPU solutions will not be at an end, because there is a big difference between a solution like this and a single GPU. (made bigger by this limitation of Catalyst drivers)



Edit:
We don't know what the GX2 will be of course, but if it follows the same path as other NVIDIA drivers, it will allow users to pick AFR1, AFR2, SFR, or single card as the fallback for non-profiled games, and let the user create or edit profiles. This would limit the problems associated with multi-GPU. Of course ATi may offer similar at some point, which would make your argument closer to correct.
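(A toy sketch of the profile-plus-fallback idea described above; the game names and mode strings here are illustrative, not actual driver internals:)

```python
RENDER_MODES = ("AFR1", "AFR2", "SFR", "SINGLE")

def pick_render_mode(profiles, game, user_fallback="SINGLE"):
    # Use the game's profiled mode if one exists; otherwise use the
    # user-chosen fallback, so an unprofiled game never has to run
    # worse than a single GPU.
    mode = profiles.get(game, user_fallback)
    if mode not in RENDER_MODES:
        raise ValueError("unknown render mode: " + mode)
    return mode

# Hypothetical profile table for illustration only.
profiles = {"FEAR": "AFR2", "Call of Juarez": "AFR1"}
print(pick_render_mode(profiles, "FEAR"))           # -> AFR2
print(pick_render_mode(profiles, "Some New Game"))  # -> SINGLE
```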
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Originally posted by: nRollo
The problem with your theory is Crossfire doesn't scale in all games, and only allows for one method of AFR as a fallback if a game is not profiled.

It doesn't scale well in all games but in most reviews I've seen, Crossfire scales better than SLI. And I'm pretty certain SLI doesn't scale well in ALL games.

And from those Crossfire X results it seems ATI has done a decent job of getting the scaling to work even with 4 GPUs.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
Originally posted by: lopri
Knowing what comes from NV next is a sandwich'ed G92, I have this dreaded feeling that 8800GTX/Ultra will actually last 2, yes 2 full years as the top performing single-GPU card. Meaning that 9800GX2 will carry NV over to the fall refresh, and we'll see the real successor to G80 in October/November time frame, in G100 with 1GB of frame buffer. Absolutely pathetic state of competition, on which I can only blame AMD..

if the 8800 Ultra is going to be praised for being the fastest single-GPU card until further notice, why don't people still praise the FX-57 as the fastest single-core CPU of the past 4 years or whatever it is?

who cares if it's 2 GPUs? it's on one PCB, takes one PCI-E slot, can work on any PCI-E motherboard, etc., just like any single-GPU card.