Crossfire previews


housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Creig
Originally posted by: housecat
I am on the side of truth, Creig. I dont think R-own-o cares if I'm on his side or not, he is whipping you with one hand behind his back anyway.
I just keep score.


Hey housecat, since you like keeping score so much, how 'bout this one?

AnandTech Moderators - 2
housecat - 0

Now let's try keeping the thread on track. If you want to keep kissing up to Rollo, create a thread in OT about it.

ROFL ok ok.. but there still hasnt been a Creig 1, HC 0. Last time I checked, I owned you bigtime and you ran off and hid.
Funny how you attack the man, and not the content Creig old boy.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
I dunno. I think matched cards will result in on par performance and a nice feature set (I.E. X850 x2 vs. 6800U x2 will be comparable), but I think if you CrossFire with a slower card as one half (say, X850 master card and X800 or some such slower dealy) then you'll get some performance hampering. Is it more cost effective? Maybe slightly, but I think trying to put "cost effective" in the same sentence with "dual GPU solution" is just kind of silly.

There are a few other questions I'd like answered, like whether or not the passthrough cable degrades image quality, and exactly how "buggy" this buggy ATi southbridge is.

Plus, I think major testing needs to occur on ATi mobos. We know the Nforce line is damn solid, but ATi so far seems to have on par performance, but compatibility and features are yet to be determined...
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: housecat
Originally posted by: Creig
Originally posted by: housecat
I am on the side of truth, Creig. I dont think R-own-o cares if I'm on his side or not, he is whipping you with one hand behind his back anyway.
I just keep score.


Hey housecat, since you like keeping score so much, how 'bout this one?

AnandTech Moderators - 2
housecat - 0

Now let's try keeping the thread on track. If you want to keep kissing up to Rollo, create a thread in OT about it.

ROFL ok ok.. but there still hasnt been a Creig 1, HC 0. Last time I checked, I owned you bigtime and you ran off and hid.
Funny how you attack the man, and not the content Creig old boy.

If you have something to add to the topic of discussion, then please do. Otherwise, take it to OT.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
PFFT! Creig playing holier than thou now.
GET OVER YOURSELF. You are not a mod. Just because you wont answer to Rollo, and wont admit you were destroyed by me in the past doesnt mean you can point in the other direction and act like you matter.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
God would you two just shut up?
No, seriously, would you please?


This is the stereotypical situation that gets made fun of in the Simpsons or Family Guy for chrissakes. The two nerds on opposite ends of a connection, pushing their taped glasses back into position and arguing intricacies of forum history that no one, aside from you two, gives a flying cow sh*t about. Nor do these intricacies have any impact in your actual lives, assuming they exist, off this forum. You are literally wasting your lives with this activity. Wasting them, and what's worse, forcing it on the rest of us. Trust me, you do not respawn, so wastefulness of life is rather shortsighted.

The rest of us would like to continue mulling over the small tidbits of bait that the IHVs at Computex have thrown to us. If you want to keep arguing about who owned who and otherwise flogging each other over the cranium with your inflated e-dicks, take it to OT. That's what it's there for. You two look so damn silly it's ridiculous. How old are you again?

Either way, just silence your head holes. No one else here wants this crap.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Thank you Insomniak, all of us who want to see a civil discussion of the pros and cons of Crossfire owe you a debt of gratitude.
 

cbehnken

Golden Member
Aug 23, 2004
1,402
0
0
Originally posted by: Genx87
They tout not having to use a special card to get their dual card config working. But if you want to configure the thing to not use two cards and have all 16 lanes, you do.

Seems like they reversed Nvidia's idea and I am not terribly impressed with their plan. Basically took the lowest common denominator and went with it.

Instead of a connector on the inside of the case we are going back to 1997 with an external VGA or I guess the 2005 version of it an external DVI connector?

This is supposed to be an improvement? /yawn

AFR doesn't seem like a terribly great idea. I can see increased compatibility, but I don't see how the performance increase will be as great as split screen. If there is a particular area of a scene that is a bottleneck, then instead of allowing the scene to scale between the cards and getting the maximum performance from each, you are stuck with both cards hitting the same bottleneck and thus lowered performance.

Split screen looks like the best plan at the moment and it is good they are including it.

The 32x32 checkerboard just seems like it will have high overhead compared to the other two rendering options.

The pro of being able to run the card with a cheaper version seems like a gimmick. Who is going to go get an X300 and then this? Total waste of money because the clock + pixel units are disabled.

Now all of this will of course be forgotten if ATI can get the thing to whip Nvidias solution. But as it stands right now it is late and nothing special from what I can tell.

1. Umm, SLI uses AFR too in some cases.

2. The crossfire setup *should* work in NF4 boards with 2 x16 slots, like my ultra-d. Anand even said it was very likely. If it is true it'll be great for everyone.
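The AFR-vs-split-screen argument above can be sketched as a toy model. This is purely illustrative (invented per-scanline costs, no real driver logic): in AFR each GPU still renders a whole frame, so per-frame latency includes the full cost of the scene's hottest region, while split-screen rendering can move the split line to balance the two halves. (AFR still roughly doubles throughput by pipelining alternate frames, which is why SLI uses it too.)

```python
# Toy sketch (illustrative only) of why split-screen rendering can
# balance an uneven scene while AFR cannot. "scanline_cost" is a
# made-up per-region workload, not anything from a real driver.

def afr_frame_time(scanline_cost):
    # AFR: each GPU renders an entire frame, so every frame pays
    # the full cost, including the most expensive region.
    return sum(scanline_cost)

def sfr_frame_time(scanline_cost):
    # Split-screen: pick the horizontal split that best balances the
    # two halves; the frame is done when the slower half finishes.
    best = float("inf")
    for split in range(1, len(scanline_cost)):
        top = sum(scanline_cost[:split])
        bottom = sum(scanline_cost[split:])
        best = min(best, max(top, bottom))
    return best

# A scene whose lower half is much more expensive:
costs = [1, 1, 1, 1, 4, 4, 4, 4]
print(afr_frame_time(costs))  # 20: full scene cost per frame
print(sfr_frame_time(costs))  # 12: unequal split balances the load
```

Note the balanced split is not the midpoint: the driver has to shift the dividing line toward the cheap half, which is exactly the dynamic load balancing both vendors advertise for split-frame modes.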
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: fierydemise
Thank you Insomniak, all of us who want to see a civil discussion of the pros and cons of Crossfire owe you a debt of gratitude.

believe me, they aren't listening :p
:roll:

just thinking of some more fanboy crap to top the other . . .

all judgements are PREmature . . . ATi's Xfire may just blow nVidia's sli out of the water - or not . . . i tend to think 'not'.

Nor have we seen xfire "bugs" . . . anyone who expects ATi's feature set to be bug-free is mad [or a true fanATIc] . . . imo, the Southbridge is 'no problem' . . . ati's partners know better than to use ati's current bug-ridden bridge - and there is a great alternative - ati is not "forcing" anything on them.

No, it's not too little or too late . . . maybe too expensive, but that will depend on current x800series owners . . .

again - imo - ati is "using" its most devoted fans to work out the AMR bugs [as nVidia did with sli] . . . . THEN - IF successful [and after nVidia brings g70 to the table] we'll see sli'd r520 . . . at THAT point, the majority of us will decide

(to get a damn xbox360)
:shocked:
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Insomniak
God would you two just shut up?
No, seriously, would you please?


This is the stereotypical situation that gets made fun of in the Simpsons or Family Guy for chrissakes. The two nerds on opposite ends of a connection, pushing their taped glasses back into position and arguing intricacies of forum history that no one, aside from you two, give's a flying cow sh*t about. Nor do these intricacies have any impact in your actual lives, assuming they exist, off this forum. You are literally wasting your lives with this activity. Wasting them, and what's worse, forcing it on the rest of us. Trust me, you do not respawn, so wastefulness of life is rather shortsighted.

The rest of us would like to continue mulling over the small tidbits of bait that the IHVs at Computex have thrown to us. If you want to keep arguing about who owned who and otherwise flogging each other over the cranium with your inflated e-dicks, take it to OT. That's what it's there for. You two look so damn silly it's ridiculous. How old are you again?

Either way, just silence your head holes. No one else here wants this crap.


Neither do I. Before you lump me in with housecat, go back up a few posts and see who started the flaming. Me, or the guy who's been on Moderator sponsored vacations for a total of 1 1/2 months out of the 7 months he's been registered!

I posted in this thread to try and give an alternate viewpoint of the ATI Crossfire system. Not to have to defend myself from his infantile verbal attacks. He can take his crap to OT for all I care and I've asked him to do so already.

I would be totally happy to see this thread stay on topic, but he seems to be unable to control himself.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
I'm a single card guy myself. If G70 performs better, I'm on it. If 520 is better I'm on that... rinse and repeat 18 months later.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
If you want to keep kissing up to Rollo, create a thread in OT about it.

No one should "kiss up" to me except Mrs. Rollo, and perhaps Denise Richards should Mrs. Rollo ever wise up and send me packing for spending too much time in our rec room posting on forums and drinking bourbon.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: nitromullet
Besides, nVidia isn't known for sitting on their hands, I would not be surprised at all if nVidia had SSAA options available for SLI in a beta driver before ATi even gets Crossfire out the door... I don't think that this is a matter of it not being possible with SLI, but more that nVidia didn't think of it as a feature that would be desired by gamers.

I notice Ben has started a topic on this subject already also, but I'd encourage you to read my earlier posts in this thread. nVidia has always supported SSAA, even with the GeForce-FX, and it works just fine with SLI. The only thing ATi is doing differently is the 1/2 pixel jitter on each card, but I'm sure nVidia can come up with something similar (maybe only for G70 though).
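The half-pixel jitter idea is easy to see in one dimension. This is a simplified sketch of the general principle, not ATi's or nVidia's actual implementation: each card samples the same scene with its grid offset by half a pixel, and averaging the two outputs doubles the effective sample density, recovering partial coverage at edges that a single grid snaps to all-or-nothing.

```python
# Simplified 1-D sketch (not a real driver algorithm) of dual-card
# supersampling via a half-pixel jitter: two renders of the same
# scene, one sample grid shifted by 0.5 pixel, then averaged.

def render(scene, width, jitter=0.0):
    # Sample a 1-D "scene" (a function of x) at pixel centers,
    # optionally shifted by a sub-pixel jitter.
    return [scene(x + 0.5 + jitter) for x in range(width)]

def super_aa(scene, width):
    card0 = render(scene, width)               # normal grid
    card1 = render(scene, width, jitter=0.5)   # half-pixel offset
    return [(a + b) / 2 for a, b in zip(card0, card1)]

# A hard edge at x = 4.6: one grid sees it as all-or-nothing,
# the jittered average recovers a gray edge pixel.
edge = lambda x: 1.0 if x >= 4.6 else 0.0
print(render(edge, 8))    # [0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(super_aa(edge, 8))  # [0.0, 0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
```

The same trick is why either vendor could, in principle, expose it in drivers alone: it needs no new sampling hardware, only a programmable sub-pixel offset and a blend of the two cards' outputs.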
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Creig
Originally posted by: Insomniak
God would you two just shut up?
No, seriously, would you please?


This is the stereotypical situation that gets made fun of in the Simpsons or Family Guy for chrissakes. The two nerds on opposite ends of a connection, pushing their taped glasses back into position and arguing intricacies of forum history that no one, aside from you two, give's a flying cow sh*t about. Nor do these intricacies have any impact in your actual lives, assuming they exist, off this forum. You are literally wasting your lives with this activity. Wasting them, and what's worse, forcing it on the rest of us. Trust me, you do not respawn, so wastefulness of life is rather shortsighted.

The rest of us would like to continue mulling over the small tidbits of bait that the IHVs at Computex have thrown to us. If you want to keep arguing about who owned who and otherwise flogging each other over the cranium with your inflated e-dicks, take it to OT. That's what it's there for. You two look so damn silly it's ridiculous. How old are you again?

Either way, just silence your head holes. No one else here wants this crap.


Neither do I. Before you lump me in with housecat, go back up a few posts and see who started the flaming. Me, or the guy who's been on Moderator sponsored vacations for a total of 1 1/2 months out of the 7 months he's been registered!

I posted in this thread to try and give an alternate viewpoint of the ATI Crossfire system. Not to have to defend myself from his infantile verbal attacks. He can take his crap to OT for all I care and I've asked him to do so already.

I would be totally happy to see this thread stay on topic, but he seems to be unable to control himself.

You REALLY don't know how to shut your mouth, do you Creig? JUST LISTEN TO INSOMNIAK, DINGLEBERRY. I CAN ACCEPT HIS CRITICISM, HE'S RIGHT! NOW SHUT YOUR DAMN MOUTH, TROLL!
*sigh*

Anyway, back on topic..

It would benefit ATI GREATLY if Crossfire can operate on Nforce SLI boards. Might sell some then. Might get me to pop in Crossfire'd R520s.

But if I have to use an ALI/ULI anything to use it.. forget it. I'll stick with Nforce or Intel thanks.

I wouldnt use ALI/ULI for ANY dual card setup. As I wouldnt use it period in my rigs.
And if ALI/ULI is better than ATI chipsets, that bodes ill for ATI chipsets. Then who would ever use an ATI one? I dont know what planet you would have to be from..
planet fanboy I guess.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: nitromullet
Besides, nVidia isn't known for sitting on their hands, I would not be surprised at all if nVidia had SSAA options available for SLI in a beta driver before ATi even gets Crossfire out the door... I don't think that this is a matter of it not being possible with SLI, but more that nVidia didn't think of it as a feature that would be desired by gamers.

I notice Ben has started a topic on this subject already also, but I'd encourage you to read my earlier posts in this thread. nVidia has always supported SSAA, even with the GeForce-FX, and it works just fine with SLI. The only thing ATi is doing differently is the 1/2 pixel jitter on each card, but I'm sure nVidia can come up with something similar (maybe only for G70 though).


Here you go . . . Nvidia starts working on SLI 2

Nvidia starts working on SLI 2
even if Crossfire defeats SLI, Nvidia has some secret horses for a new race. It is working on something that we know as SLI 2.

As you know, SLI has been around for a few quarters now and Nvidia is working to improve this marchitecture. One of the teams was working to get some better silicon at the same time, and that's what we believe is going to be called SLI2.

We learned that Nvidia is working on a motherboard that will have more PCIe lanes and might actually get close to two times 16 PCIe lanes, the ultimate for PCIe graphic cards in SLI mode.

We are not sure how much two x16 PCIe lanes will change the actual SLI score, but this is something that we need to see before making up our own minds. It could be a good thing, you never know.

Nvidia will also improve a number of things on its new silicon once it polishes it enough, as it learned from its first-born SLI chipset. We don't have any idea about the timing, final specification and features of such SLI 2 chipsets and boards, but we will work on it.

:D
 

housecat

Banned
Oct 20, 2004
1,426
0
0
I suspect SLI2 isnt much more than improved motherboards with dual 16X PCIE slots under SLI mode.

I dont see where else you could improve things.. most would be the cards SLI function/communication themselves and drivers.

They got it pretty much 100% correct first time out. ATI needs a change, theirs is cumbersome and not the ticket.. I think the Inq even stated that ATI plans on changing their AMR later anyway. They had to rush this out to appear competitive.

The dongle, the ATI/ALI motherboards (I don't know which chipset is really worse!), the heavy PCIE bandwidth usage, master cards.. I'm not sold yet. Then tack on all the inherent things with any dual card rig: need a better PSU, decent case cooling, etc.
It's not a horrible solution, as it APPEARS to perform, but it's nothing I'd swap out my Nforce4 for an ALI/ATI board and lose my DX9C 6800GTs for, by any means.
I need a final review to pass judgement for sure.

But I will refute anyone who says it's "definitely better than SLI".. you should be shot.
Or just be forced to use a ATI/ULI motherboard.
But the first step to being better is actually existing in someones rig somewhere.. :thumbsup:
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I know, Apoppin, I mentioned this and provided my interpretation in this thread

That article doesn't mention anything about AA refinements (which nVidia is widely believed to be making with G70), just PCI-e lanes.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
I know, Apoppin, I mentioned this and provided my interpretation in this thread

That article doesn't mention anything about AA refinements (which nVidia is widely believed to be making with G70), just PCI-e lanes.

that thread was about r520 vs. g70 - i didn't expect to find anything about sli there [i didn't read it until now - thanx]

The Inq doesn't know ANYthing about SLI2 . . . but you can be SURE it will have ALL the refinements that AMR has (and what nVidia feels gamers want). ;)

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
1600x1200 is the most sensible resolution for my setup to play at.
Well that's fine but it also probably means that SLI is not really a good option for you.

Current graphics cards are not that comfortable above 1600x1200 unless you want SLI and silly prices.
That's just plain false. I use 1920x1440 with 16xAF and 4xAA on pretty much any game older than 18-24 months.

So says the man whose card can't even run some games at 16X12 4X8X.....
And how many games do you run at 1920x1440 Rollo?

BTW- my point here is that 16X12 isn't a "middling resolution"
Yes it is. The likes of the 9700 Pro made it a standard and that was 2-3 years ago.

Why would anyone pay $1000 for two SM2 ATI cards with a total of 32 pipes that they have to run on a motherboard Anand has called "buggy"*
Of course nVidia's SLI and general driver bugs are a non-factor, right?

with 48 pipes
48 pipes? Show me where the official specs are for the G70. Also show me where it says the G70 can be SLI'd. Until you do, your comments are nothing more than speculation.

If I were you I'd be more concerned about the space-heater the G70 looks like it's going to be. Having two of them in there, well time to break out the 700 W PSU.

Not at all. ATI has sold me the R300 core three times now, 9700P/9800P/X800XTPE, they won't be selling it to me again, that's for sure
Yet you're happy with the getting the NV4x core again with the G70. Interesting.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: BFG10K
Also show me where it says the G70 can be SLI'd. Until you do your comments are nothing more than speculation.
You don't think the g70 will be sli'd?
:shocked:

that is way beyond speculation to the point of ridiculous. . . clearly it will be sli'd
 

imported_Ged

Member
Mar 24, 2005
135
0
0
Originally posted by: otispunkmeyer
hmm after reading that i still prefer the way nvidia does it

i mean come on, nvidia has a dedicated bridge for the two cards to communicate, and dedicated silicon just for that communication. ok it only works with like cards with like bios' but is this any better? you still have to fork out for one specific card of the two just to run it. its not like nvidia, where you could buy a 6800GT and then later add a second. if you have a x800xt you have to go pay a premium for a master card to do it.

and so much for running any card! it'll only be x800 and x850 to start with, and running a x800xt with a x300 is just a no go......(not that you'd want to anyway, but people did pimp this fact) the drivers adjust clock speeds to similar levels then make the number of pipes the same, so basically you'd have an 8 pipe setup there.

not feelin much love from the ghetto external cable connection either lol, but each to their own

but i am intrigued by being able to support up to 6 displays if the mobo maker includes integrated graphics....which is neat

also G70 will be a single slot solution....which actually surprises me. lets hope nvidia arent pushing the envelope on this, i dont want a card thats a few degrees away from becoming an inferno

Well, Here's what I have been thinking:

If you SLI two dual slot cards the space between them is very small, which doesn't let much air flow easily into the first card. People have posted time and again about their first card being a few degrees hotter than the second because of this setup.

If you SLI two single slot cards the space between them will be greater because the motherboard manufacturers have assumed you will at most be using dual slot cards. Greater space between the two cards because of them being single slot solutions might actually let more air flow to the first card.

Single slot solution with a better fan and better heat sink design would probably be the best all around for an SLI setup because of how the two cards are installed.

This all gets ripped to pieces when compared with an aftermarket cooling solution, but I am talking about stock cooling options.

I really wish there was a way to put two cards in SLI back to back rather than front to back (spooning?). You could design some really nice cooling solutions for something like that.
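The mixed-card matching otispunkmeyer describes above (driver drops clocks to the slower card and pipe count to the smaller card) can be sketched as a back-of-the-envelope calculation. The card specs here are illustrative numbers, not official figures, and real performance obviously doesn't reduce to pipes × clock:

```python
# Rough sketch (illustrative specs, not official) of why pairing a
# fast card with a slow one hurts: the driver reportedly matches
# both cards to the lower clock and the smaller pipe count.

def matched_pair_fillrate(card_a, card_b):
    # card = (pixel_pipes, core_clock_mhz); returns a relative
    # fill-rate figure for the pair after matching.
    pipes = min(card_a[0], card_b[0])
    clock = min(card_a[1], card_b[1])
    return 2 * pipes * clock  # both cards run at the common spec

x850 = (16, 540)  # hypothetical 16-pipe card
x800 = (12, 400)  # hypothetical 12-pipe card
print(matched_pair_fillrate(x850, x850))  # 17280: matched pair
print(matched_pair_fillrate(x850, x800))  # 9600: fast card throttled
```

On this crude model, the mismatched pair delivers barely more than a single unthrottled 16-pipe card (16 × 540 = 8640), which is the "total waste of money" point made earlier in the thread.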
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
that is way beyond speculation to the point of ridiculous. . . clearly it will
You sure about that? What if it has dual slot cooling?

I don't know as I don't have the official specs.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: BFG10K
Not at all. ATI has sold me the R300 core three times now, 9700P/9800P/X800XTPE, they won't be selling it to me again, that's for sure
Yet you're happy with the getting the NV4x core again with the G70. Interesting.
From what I've read, RSX/G70 is a completely new core.. needs new drivers basically from scratch, etc.
The NV40 was totally scrapped. Mainly cuz the Sony/NV partnership was the biggest boon any GPU firm could have gotten, financially and for the exceptional fabs at their disposal.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Rollo
Originally posted by: Creig
If you want to keep kissing up to Rollo, create a thread in OT about it.

No one should "kiss up" to me except Mrs. Rollo, and perhaps Denise Richards should Mrs. Rollo ever wise up and send me packing for spending too much time in our rec room posting on forums and drinking bourbon.



This is why I love having you around.


I've actually considered starting a book entitled "Dorks, and the women who endure them"

I think, to be honest, I've got a shot at the NYT Bestseller List.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
well time to break out the 700 W PSU.
<<< Exactly why dual GPU is a non-starter for me: noise, heat and power. Even my GT @ 445/1200 puts serious strain on an Enermax 651 with a 36A 12 volt rail, after frying my True 480 before that (real smoke, even).