Crossfire previews


0roo0roo

No Lifer
Sep 21, 2002
64,795
84
91
BTW- my point here is that 16X12 isn't a "middling resolution"

it is. those dell 2405FPW widescreens filtering down to average gamers/users now are already 1920x1200. sli is for high-end gamers/users.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: apoppin
Originally posted by: JackBurton
Originally posted by: apoppin
Originally posted by: BFG10K
that is way beyond speculation to the point of ridiculous. . . clearly it will be sli'd
You sure about that? What if it has dual slot cooling?

I don't know as I don't have the official specs.

as certain as i have ever been without seeing them . . . [why?

because] SLI would be a "failure" if nVidia offered it only for the 6800s . . . . ;)
[ati wins]

g70 does look to be single slot on the smaller die . . . .
[however] EVEN IF their top card is dual slot, you can be sure nVidia will come up with a way to sli it . . . . [expensive and liquid-metal cooled, no doubt :p]

absolutely positively certainly their "GT" will NOT be dual-slot and would be an excellent "budget" candidate for sli.

i really DO expect the g70 to be a smaller, cooler card than the 6800 series . . . nVidia did not design their SLI for the 6800s . . . rather the "future" and we DO know SLI-2 ["AMR Killer"] is in "the works"



edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) Supertile and other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes PCIX for EACH card
5) Multimonitor support with sli and address all the other complaints

of course ATI will follow with AMR-2

You don't have to be psychic to see where things are going - outrageously expensive :(
[and i am going to console gaming]
:brokenheart:

edited again and again :p
I feel another edit coming. ;)

how does it feel?

i posted between that final edit and going to bed last night . . . nothing further has changed.
:roll:

Number 4. PCIX? NVidia isn't making any PCIX cards. PCI-E (or PCX) yes, but not PCIX.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: JackBurton
Originally posted by: apoppin
Originally posted by: JackBurton
Originally posted by: apoppin
Originally posted by: BFG10K
that is way beyond speculation to the point of ridiculous. . . clearly it will be sli'd
You sure about that? What if it has dual slot cooling?

I don't know as I don't have the official specs.

as certain as i have ever been without seeing them . . . [why?

because] SLI would be a "failure" if nVidia offered it only for the 6800s . . . . ;)
[ati wins]

g70 does look to be single slot on the smaller die . . . .
[however] EVEN IF their top card is dual slot, you can be sure nVidia will come up with a way to sli it . . . . [expensive and liquid-metal cooled, no doubt :p]

absolutely positively certainly their "GT" will NOT be dual-slot and would be an excellent "budget" candidate for sli.

i really DO expect the g70 to be a smaller, cooler card than the 6800 series . . . nVidia did not design their SLI for the 6800s . . . rather the "future" and we DO know SLI-2 ["AMR Killer"] is in "the works"



edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) Supertile and other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes PCIX for EACH card
5) Multimonitor support with sli and address all the other complaints

of course ATI will follow with AMR-2

You don't have to be psychic to see where things are going - outrageously expensive :(
[and i am going to console gaming]
:brokenheart:

edited again and again :p
I feel another edit coming. ;)

how does it feel?

i posted between that final edit and going to bed last night . . . nothing further has changed.
:roll:

Number 4. PCIX? NVidia isn't making any PCIX cards. PCI-E (or PCX) yes, but not PCIX.

thanks for pointing out the typo . . . it's nVidia PIXiE i guess.

gonna go head-to-head with RubiE

:D

edited [again]
:roll:



 

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
Originally posted by: Rollo
Where I think the niche for this will be is the X800XL AMR.

Who will pay $570 for an X850 master and $400+ for an X850 when they can get two G70s for a little more?

Then go buy two G70s. You sound like the biggest fanboy in every post, despite your "I bought an X800.." guise.

How can it be late to the market? It can be used for every generation henceforth, just like SLI, one can assume; you assumed as much for Nvidia. Biased much?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Here is the latest preview where two X850 XT PEs actually beat two 6800 Ultras in Doom 3. Pretty hard to believe... Doom 3 XFire Bench, given that they used the old Catalyst 5.4 drivers and not even the 5.6 ones that claim to raise Doom 3 performance even more.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
Here is the latest preview where two X850 XT PEs actually beat two 6800 Ultras in Doom 3. Pretty hard to believe... Doom 3 XFire Bench, given that they used the old Catalyst 5.4 drivers and not even the 5.6 ones that claim to raise Doom 3 performance even more.

within 1 or 2 FPS and the Ultra SLI takes one of them . . .
not bad for ATI's first try at "sli"
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: RussianSensation
Here is the latest preview where two X850 XT PEs actually beat two 6800 Ultras in Doom 3. Pretty hard to believe... Doom 3 XFire Bench, given that they used the old Catalyst 5.4 drivers and not even the 5.6 ones that claim to raise Doom 3 performance even more.
DAMN! Those are some nice numbers for ATi's Crossfire! And Doom 3 is ATi's "weak" spot. I'd hate to see some HL2/Far Cry numbers. Those numbers should punch nVidia's SLI in the mouth. :)
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Yes, but the Ultra SLI rig takes the lead at the highest res with the most AA/AF (not by much); however, this does imply that CrossFire may have less cpu overhead. Why are they comparing SLI and CrossFire at resolutions like 1024x768 and 1280x1024 anyway?
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: nitromullet
Yes, but the Ultra SLI rig takes the lead at the highest res with the most AA/AF (not by much); however, this does imply that CrossFire may have less cpu overhead. Why are they comparing SLI and CrossFire at resolutions like 1024x768 and 1280x1024 anyway?

Yeah, you're talking about 1 FPS behind in a game that's ATi's "weak" spot AND they were using older drivers. I'd say that is a BIG win for ATi!
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
My guess is that even Doom3 is cpu limited with either of these setups at the resolutions tested, so it's pretty inconclusive at this point. I'm not implying anything about the performance of CrossFire here, but more about the poor testing methodology. Or is it intentional to make CrossFire and SLI look "equal" in Doom3? Additionally, I would imagine that both SLI and CrossFire benefit the most (percentage-wise) in games where they are not as strong with a single card. I don't imagine that HL2 will yield as much of an increase with CrossFire, since the X8x0 series already performs well there and they will most likely be cpu bound at all but the highest resolutions. Don't get me wrong, I bet playing HL2 on a CrossFire rig absolutely rocks. :)

edit: riddled with typos...
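
To illustrate the cpu-limited point, here is a rough back-of-the-envelope check you can run on any set of published numbers. The FPS values below are made up for illustration, not taken from the linked preview: if the frame rate barely falls while the pixel count per frame climbs, the cards aren't the bottleneck and the result says little about which dual-GPU setup is faster.

# Minimal Python sketch using hypothetical FPS numbers (not from the preview).
results = {
    (1024, 768): 140.0,   # illustrative FPS at the lowest resolution tested
    (1280, 1024): 136.0,  # illustrative FPS at the middle resolution
    (1600, 1200): 118.0,  # illustrative FPS at the highest resolution
}

base_pixels = 1024 * 768
base_fps = results[(1024, 768)]

for (w, h), fps in sorted(results.items()):
    pixel_scale = (w * h) / base_pixels   # how much more work the GPUs get per frame
    fps_drop = 1.0 - fps / base_fps       # how much the frame rate actually fell
    verdict = ("baseline" if pixel_scale == 1.0
               else "likely cpu-bound" if fps_drop < 0.10
               else "gpus are doing some limiting")
    print(f"{w}x{h}: {fps:.0f} fps, {pixel_scale:.1f}x pixels, {fps_drop:.0%} slower -> {verdict}")

By that rough test, runs where two rigs barely separate at 1024x768 and 1280x1024 tell you more about the CPU than about either pair of cards.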
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Todd33
Originally posted by: Rollo
Where I think the niche for this will be is the X800XL AMR.

Who will pay $570 for an X850 master and $400+ for an X850 when they can get two G70s for a little more?

Then go buy two G70s. You sound like the biggest fanboy in every post, despite your "I bought an X800.." guise.

How can it be late to the market? It can be used for every generation henceforth, just like SLI, one can assume; you assumed as much for Nvidia. Biased much?

Did I say anything about SLI'd R520s? No.

Did I say bringing SLI for SM2 chips to market one year after SM3 has been the MS standard is too late? Yes.

G70s are supposedly being built now, according to some sources. Wouldn't you say SM2 Crossfire is a little late to market if its competitor costs much less, offers similar performance, and has a more advanced feature set?

If this had come out last November, it would have been worth discussing. IMO the X800XL is the only thing of note here, because like the 6800U/X850XT PE conflict, even if it gets slightly better frame rates, are you going to buy it over 6800U SLI today and G70 SLI later this month?

Why? So you can play Splinter Cell and who knows what else in SM1.1?

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: 0roo0roo
BTW- my point here is that 16X12 isn't a "middling resolution"

it is. those dell 2405FPW widescreens filtering down to average gamers/users now are already 1920x1200. sli is for high-end gamers/users.

Hmmm. They'll never make a living selling to the couple hundred people that have 2405s now, will they?

In any case, as a guy who owns two sli sets and used to have another, I think I know "who sli is for". (apparently, me)


 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Yeah, I like how this argument can be swayed from one end to the other...

First 1600x1200 is "middling", then CrossFire "beat" SLI in Doom3 because it leads by 1.6 FPS @ 1024x768 with no AA/AF. Never mind that SLI pulls ahead by 1.2 FPS @ 1280x1024 with 4xAA/8xAF, a lead which I'm sure would only grow @ 1600x1200. But hey, 1600x1200 is "middling", so we won't worry about that.
 

Rock Hydra

Diamond Member
Dec 13, 2004
6,466
1
0
Originally posted by: Greenman
Originally posted by: Rollo
Where I think the niche for this will be is the X800XL AMR.

Who will pay $570 for an X850 master and $400+ for an X850 when they can get two G70s for a little more?

So around a thousand bucks for video. Let me say that again: One Thousand Dollars for video. I can see three reasons for spending that much:
1. You use it to make a living, and more speed means less hours on a project.
2. You make a lot of money without working very hard, so you skip those new dolphin skin boots with the Bald Eagle feathers so you can afford the cards.
3. You're a pale friendless virgin with an empty life and nothing else to do, so you buy new video cards instead of having your teeth fixed.

Yeah, that's almost as ridiculous as paying 800-1000 dollars for an Athlon 64 FX or Pentium (4/X) Extreme Edition.
 

Steelski

Senior member
Feb 16, 2005
700
0
0
edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) Supertile and other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes PCIX for EACH card
5) Multimonitor support with sli and address all the other complaints

of course ATI will follow with AMR-2


I don't know who said this, as it has been copied so many times... but it's really, really funny because basically it's all crappy.

The following numbered points correspond to what is said in the comment.
1) ATI will do that in this-gen CrossFire.
2) Yes, following ATI.
3) ATI does this in this-gen CrossFire, which will also work on any system with a dual graphics card interface and drivers written for it.
4) This is not an SLI2 feature, as it already exists on Tumwater and nForce4 Professional boards, i.e. the dual Opteron boards with 2x16-lane SLI-capable slots.
5) Multimonitor support is expected without any trouble.

In fact this list was so wrong that I think it could have been written as a joke for people like myself to misunderstand.

SLI2 is with us. It goes by the name CrossFire.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Steelski
edit: think logically what SLI - 2 will do . . . give gamers what they want:
1) ability to use two cards with different BIOSes and maybe different generations
2) Supertile and other refinements that ati is using
3) Allow for SLI changes on-the-fly without rebooting
4) OF course, 16 lanes PCIX for EACH card
5) Multimonitor support with sli and address all the other complaints

of course ATI will follow with AMR-2


I don't know who said this, as it has been copied so many times... but it's really, really funny because basically it's all crappy.

The following numbered points correspond to what is said in the comment.
1) ATI will do that in this-gen CrossFire.
2) Yes, following ATI.
3) ATI does this in this-gen CrossFire, which will also work on any system with a dual graphics card interface and drivers written for it.
4) This is not an SLI2 feature, as it already exists on Tumwater and nForce4 Professional boards, i.e. the dual Opteron boards with 2x16-lane SLI-capable slots.
5) Multimonitor support is expected without any trouble.

In fact this list was so wrong that I think it could have been written as a joke for people like myself to misunderstand.

SLI2 is with us. It goes by the name CrossFire.

no joke . . . as if you'd 'get it' anyway. :p
:roll:

:thumbsdown: