ATI to copy the GX2?


Cookie Monster

Diamond Member
May 7, 2005
Originally posted by: BFG10K
But rather pointless ATM, unless there's a super price involved..
It's a good way to get Super AA without investing in Crossfire.

Or get an 8800GTS if the price is ~$400.
You get the performance/IQ, plus it's a single card. No Crossfire limitations either.
 

BFG10K

Lifer
Aug 14, 2000
Or get an 8800GTS
That depends on many things. For one, you get full-screen super-sampling with Super AA, but not currently on G80.

I'm not saying I'd go that route, but Super AA on one board is mighty tempting.

No Crossfire limitations either.
Super AA doesn't have any Crossfire limitations (other than multiple monitors, of course).
 

Cookie Monster

Diamond Member
May 7, 2005
Originally posted by: BFG10K
Or get an 8800GTS
That depends on many things. For one, you get full-screen super-sampling with Super AA, but not currently on G80.

I'm not saying I'd go that route, but Super AA on one board is mighty tempting.

No Crossfire limitations either.
Super AA doesn't have any Crossfire limitations (other than multiple monitors, of course).

Vsync and triple buffering, no? How about Crossfire profiles?

I think SSAA will be back (well, xS isn't really SSAA but a mix of SSAA and MSAA) since it's part of the DX10 specs.
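
For the curious, here's a rough Python sketch of how those hybrid xS modes break down. The grids below are the commonly cited breakdowns for NVIDIA's hybrid modes, so treat the exact figures as illustrative rather than official:

# Rough sketch of how the hybrid "xS" modes combine super-sampling and
# multi-sampling. The grid/sample breakdowns below are the commonly cited
# ones for NVIDIA's hybrid modes; treat the exact figures as illustrative.

XS_MODES = {
    # mode: ((supersample grid x, y), multisamples per sub-pixel)
    "8xS":  ((1, 2), 4),   # 1x2 ordered-grid SSAA x 4x MSAA
    "16xS": ((2, 2), 4),   # 2x2 ordered-grid SSAA x 4x MSAA
}

def describe(mode):
    (sx, sy), ms = XS_MODES[mode]
    ss = sx * sy                      # resolution multiplier from SSAA
    total = ss * ms                   # effective samples per pixel
    print(f"{mode}: {sx}x{sy} SSAA ({ss}x the pixels) + {ms}x MSAA "
          f"= {total} samples per pixel")

for m in XS_MODES:
    describe(m)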
 

josh6079

Diamond Member
Mar 17, 2006
Vsync and triple buffering, no? How about Crossfire profiles?
I don't know much about the CrossFire profiles (if there are any "profiles," or if it's setting CAT A.I. to "Advanced" that does it), but from what I hear, CrossFire doesn't have the Vsync problems that SLI sometimes has.

I'd currently go with SLI over CrossFire though.
 

jim1976

Platinum Member
Aug 7, 2003
I think I'd prefer an 8800GTS for obvious reasons, any day of the week. This card is too late to the party.
And I likewise believe that xS modes will soon be added to the Forceware X drivers.
 

Dainas

Senior member
Aug 5, 2005
Originally posted by: tanishalfelven
Originally posted by: Dainas
These solutions are always boring because they never bother to use top-end cores. I mean, for chrissake, at least use XT chips. The savings in power draw don't measure up to the performance lost by using X1900 Pro GPUs.

Can you imagine the cooler they would have to use?


True, but by now they must have some cherry-picked cores, or they could drop the clocks ever so slightly like the GX2 does. Even running 30MHz slower, an XT core would flatline a Pro.
 

josh6079

Diamond Member
Mar 17, 2006
QFT. I had to use an X1900GT for a little while as I waited for my X1900XTX replacement. Even when I had the GT clocked well beyond an XT(X), it still couldn't make up the muscle.
 

BFG10K

Lifer
Aug 14, 2000
Vsync and triple buffering, no? How about Crossfire profiles?
No. When Super AA is enabled, the cards behave as one board: each renders the same scene with a jitter applied to its AA sample positions, and the AA patterns from the two boards are combined.

Like I said, the only limitation would be multiple displays (or lack thereof).
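
To make the jitter-and-combine idea concrete, here's a minimal Python sketch. The test scene (a single edge) and the two sample patterns are stand-ins I made up, not ATI's actual Super AA patterns:

# A minimal sketch of the Super AA idea described above: both GPUs render
# the same frame, but with their AA sample positions jittered relative to
# each other, and the compositor averages the two results. The test scene
# (an edge at x = 0.6) and the sample offsets are stand-ins, not ATI's
# actual patterns.

def resolve(pixel_x, offsets, edge=0.6):
    """One board's AA resolve for a pixel spanning [pixel_x, pixel_x + 1):
    fraction of sub-pixel samples landing on the covered side of the edge."""
    hits = [1.0 if pixel_x + ox < edge else 0.0 for ox, oy in offsets]
    return sum(hits) / len(hits)

# Each board uses a 2-sample pattern; board B's is jittered so that the
# union of the two patterns forms an effective 4-sample pattern.
board_a = [(0.25, 0.25), (0.75, 0.75)]
board_b = [(0.00, 0.75), (0.50, 0.25)]  # jittered offsets (assumed)

a = resolve(0.0, board_a)          # board A alone: 0.5
b = resolve(0.0, board_b)          # board B alone: 1.0
print("board A alone:", a)
print("board B alone:", b)
print("Super AA composite:", (a + b) / 2)  # 0.75, closer to the true 0.6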
 

SunnyD

Belgian Waffler
Jan 2, 2001
Hey folks, let me put this "copy NVIDIA" thing to rest. ATI has been doing this natively far longer than NVIDIA has. As mentioned, the first commercial example was the MAXX, but for the record, ever since (I believe) the Rage I chip, ATI has had multi-GPU support built right in. There has just never been any need for it.

If you do a little reading, the R500 core has always had the ability to go multi-GPU up to 128 ways. It's just a matter of developing the right driver and finding the proper market. Why spend the money when the X1950XTX is still top of the line?

(Note: I am an NVIDIA 8800 user, but I have an X1950XTX sitting here next to me too.)
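
To illustrate what "up to 128 ways" could mean in practice, here's a hypothetical Python sketch of split-frame rendering, where each chip takes a horizontal band of the frame. Nothing here reflects ATI's actual scheduler; the function name and the even-band policy are my own stand-ins:

# A hypothetical sketch of n-way multi-GPU scaling via split-frame
# rendering: each chip takes a horizontal band of the frame. This is not
# ATI's actual scheduler -- it just illustrates how a frame could be
# divided among up to 128 chips.

def split_frame(height, n_gpus):
    """Divide `height` scanlines as evenly as possible among n_gpus."""
    base, extra = divmod(height, n_gpus)
    bands, start = [], 0
    for gpu in range(n_gpus):
        rows = base + (1 if gpu < extra else 0)
        bands.append((gpu, start, start + rows))  # (gpu, first row, one past last)
        start += rows
    return bands

# Example: 1200 scanlines across 4 chips -> four 300-row bands.
for gpu, y0, y1 in split_frame(1200, 4):
    print(f"GPU {gpu}: rows {y0}..{y1 - 1}")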
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: SunnyD
Hey folks, let me put this "copy NVIDIA" thing to rest. ATI has been doing this natively far longer than NVIDIA has. As mentioned, the first commercial example was the MAXX, but for the record, ever since (I believe) the Rage I chip, ATI has had multi-GPU support built right in. There has just never been any need for it.

If you do a little reading, the R500 core has always had the ability to go multi-GPU up to 128 ways. It's just a matter of developing the right driver and finding the proper market. Why spend the money when the X1950XTX is still top of the line?

(Note: I am an NVIDIA 8800 user, but I have an X1950XTX sitting here next to me too.)

I heard even the 9700 PRO had that capability; they used up to 127 GPUs as a render farm to render something I can't remember now. It was quite a while ago.