
Crossfire previews

Jun 14, 2003
and so much for being selector card free....if u want to use all 16 lanes in one slot.....out comes the terminator card

liking the 14xAA though! wow!
 

ddogg

Golden Member
May 4, 2005
ya the 14XAA seems interesting....want to see how much the image changes
 

bunnyfubbles

Lifer
Sep 3, 2001
yeah a mix of multi and super sampling...sounds like delicious eye candy for your older games. Although who knows, it might end up making those older games look SO smooth and SO sharp that they just don't look right :p

However I wouldn't buy it just for that, doesn't sound like it'll be friendly for newer games. Maybe Crossfired 520's...
 

VIAN

Diamond Member
Aug 22, 2003
It may be too little too late, but let's not forget that the technology will carry on to next gen and that will be good.

This seems like it is a better implementation than Nvidia's, but we'll need a full review to really see.

3. ... so you buy new video cards instead of having your teeth fixed.
That's probably me. I must game!

But I still won't buy into SLI or CF because I earn very little money. Vewwy wittew. Teeni Tiny. Specky.

you still have to fork out for one of two cards just to run it. it's not like with nvidia, where you could buy a 6800GT and then later add a second.
Ah, see, here is the thing my friend. Later 6800GT may have different BIOS. You in deep shitta. You buy two at same time, ya!

i dont get the bit about the crossfire card and normal card sharing system ram? why do they have to consume some actual main memory to work properly? wonder how this will affect things
Compared to Nvidia's SLI, Crossfire will consume more CPU cycles; how much more, we will see.

and so much for being selector card free....if u want to use all 16 lanes in one slot.....out comes the terminator card
I like the terminator. The Nvidia way seems a bit clumsy and annoying to work with; although I haven't actually played with it, that's what it looks like from pictures. And the terminator helps protect the unused slot from dust as a bonus. Cool huh?!

At least ATI is actually pushing the AA envelope. Not that it means much to me, but how many benchies did you see utilize Nv's 8xAA? I still haven't seen one.
 

ronnn

Diamond Member
May 22, 2003
Looks much better than sli, but so what? Let's just see the g70 and r520 - dual gpu is gauche.

:beer:
 

ddogg

Golden Member
May 4, 2005
Originally posted by: ronnn
Looks much better than sli, but so what? Lets just see the g70 and r520 - dual gpu is gauche.

:beer:

:beer:
 

Sylvanas

Diamond Member
Jan 20, 2004
Originally posted by: Rollo
http://www.anandtech.com/video/showdoc.aspx?i=2432

http://www.beyond3d.com/reviews/ati/mvp/

http://www.hardocp.com/article.html?art=Nzc4

http://www.pcper.com/article.php?aid=146

As I see it:

Pros:
Compatibility
Ability to use existing X800 card if you have one
Perhaps performance
Super AA

Cons:
Too late- mid 2005 for a multichip SM2 set?!?!?
Compatibility- the version it will use on old games is tiling, which offers least benefit. On top of that, you don't really need SLI for old games.
G70 SLI will crush this before it gets to market.

To me it seems like "too little, too late". Time will tell.

Lol, says the Nvidia fanboy. Well, a G70 may beat this in benchmarks, but you're comparing two separate generations; I could say the R520 will 'crush' that. We will wait and see.

Fact: It is a better feature set than Nvidia's
 

housecat

Banned
Oct 20, 2004
After reading it, I still prefer Nvidia's method, due to the internal card that offers 10GB/sec of bandwidth and keeps usage of the already-limited 8x PCIe bandwidth to a minimum.
That, and the 1999 3dfx-style SLI connector seems a bit less advanced for this day and age.. but I don't hold that against it too much as long as it performs.

I feel the SLI connector card is a big advantage, and a much better long-term solution than Crossfire's. But ATI can switch it up later; I'll just be glad I have an NF4 SLI board so I can run NV40-G70-G80 SLI (more than likely), while ATI will prob switch Crossfire to something built a little better for the long term.


I can't say I don't like Crossfire yet, though. So far I actually like it. I do not want to buy an ATI motherboard at any cost if possible, though, and the master card thing is a bummer.. but on R520 this will be nice.

It seems to be a makeshift solution (but so far it performs nicely), but I'm worried about the future of this technology.

I think that Nforce SLI boards will be running NV SLI capable cards for a long, long time.
It's also a shame that so few cards are supported - no 9600 love? NV has 6600GT-6800Ultra support. They will have to remedy that.. but prob don't want to spend the money on R&D to support all their cards and do all the driver work.

We'll see.

But anyone saying it "looks much better than SLI" is crazy. We need a final review of it and in depth comparison.

I say: it looks decent, and "seems" to perform well. But long-term I don't like the solution (only 8X PCIe bandwidth, and hungry for it at that, with no high-bandwidth internal connector card), and I'm glad I went NF4 SLI and didn't wait for this.

I wouldn't trade up, based on the information in this review.
 

Sylvanas

Diamond Member
Jan 20, 2004
Only now are cards making use of 8x AGP, and there have been tests comparing 16x PCI-E to 8x; the differences have been small, if any.
 

housecat

Banned
Oct 20, 2004
Right, the point is long-term viability.
Yes, it will work on today's cards (although I don't really consider the X850 a "modern feature-set card") in today's games, but will Crossfire be good for even R520 or R620? There is doubt, due to limited 8X PCIe bandwidth and the fact that it actually uses not only the normal amount but even more, due to their implementation.
As stated, I think this is their makeshift solution; it's probably not going to be any SLI killer. It will be changed.

It's just nice to have that 10GB/sec link in an SLI rig; you might be stuck with two 8X PCIe slots, but at least you are alleviating that stress on the system.
It's one less thing to worry about, for ATI to work with.. and for performance.

On R420 it's fine (so far, in this skimpy preview), but R520? R620? It's makeshift, guys. It can't last. Hence it's prob no SLI killer.
I'm reserving my final judgement, as I only have this small preview to go off of.

Seems the usual crew came out to draw conclusions that it is "definitely better than SLI". My God.. get over yourselves (and ATI). That's fanboy talk. Let's talk about the technology, not create a big flamewar like the Canadian Crew here loves to do.
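For context, the bandwidth gap being argued over in this post can be sketched with back-of-envelope arithmetic. This is a rough illustration only: it assumes first-generation PCIe lane rates, and takes the 10GB/sec figure quoted in the thread at face value.

```python
# Rough bandwidth arithmetic for the SLI-bridge vs. PCIe-slot argument.
# Assumes first-generation PCIe: ~250 MB/s usable per lane, per direction.
PCIE1_MB_PER_LANE = 250

def pcie_bandwidth_gb(lanes: int) -> float:
    """Approximate one-way PCIe 1.x bandwidth in GB/s for a given lane count."""
    return lanes * PCIE1_MB_PER_LANE / 1000

x8 = pcie_bandwidth_gb(8)    # each slot on a dual-card board: ~2.0 GB/s
x16 = pcie_bandwidth_gb(16)  # a full slot: ~4.0 GB/s
sli_bridge = 10.0            # GB/s, the figure quoted in this thread for NV's connector

# The argument: a dedicated 10 GB/s link dwarfs what an 8x slot can move,
# so routing inter-GPU traffic over the bridge spares the ~2 GB/s slot.
print(f"8x slot: {x8:.1f} GB/s, 16x slot: {x16:.1f} GB/s, bridge: {sli_bridge:.1f} GB/s")
```

On these assumptions the bridge offers roughly five times the one-way bandwidth of an 8x slot, which is the core of the "Crossfire is hungrier for scarcer bandwidth" complaint.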
 

ddogg

Golden Member
May 4, 2005
Originally posted by: Sylvanas
Originally posted by: Rollo
http://www.anandtech.com/video/showdoc.aspx?i=2432

http://www.beyond3d.com/reviews/ati/mvp/

http://www.hardocp.com/article.html?art=Nzc4

http://www.pcper.com/article.php?aid=146

As I see it:

Pros:
Compatibility
Ability to use existing X800 card if you have one
Perhaps performance
Super AA

Cons:
Too late- mid 2005 for a multichip SM2 set?!?!?
Compatibility- the version it will use on old games is tiling, which offers least benefit. On top of that, you don't really need SLI for old games.
G70 SLI will crush this before it gets to market.

To me it seems like "too little, too late". Time will tell.

Lol, says the Nvidia fanboy. Well, a G70 may beat this in benchmarks, but you're comparing two separate generations; I'm sure the R520 will crush that.

Fact: It is a better feature set than Nvidia's

well he may be an Nvidia fanboy but based on ur comments there u sure seem to be an "extreme" ATI fanboy....how do u know the R520 will crush that, and from where did u make the conclusion that it has a better feature set than Nvidia's....?????
some pretty extreme fanboys these days :roll:
 

ddogg

Golden Member
May 4, 2005
Originally posted by: Sylvanas
Only now are cards making use of 8x AGP and there have been tests comparing 16x PCI-E to 8x and the differences have been small if any

always better to move forward rather than stay back...technology would never progress if ppl kept complaining about new tech
 

Acanthus

Lifer
Aug 28, 2001
I don't get the excitement about 12xAA... unless you are running an $85 monitor that maxes out at 1024x768 on your $2000+ rig. At 1600x1200 with 4xAA there is NO aliasing at all; I'd shoot for 64xAF before any more AA.

Edit: Or hell, even hack-job-free 16x AF.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: MegaWorks
Originally posted by: n7
I'm not a fan of SLI or Crossfire.

I'll keep upgrading my single card solutions thankyou very much.

Same here.

it isn't that i am not a fan of sli/amr - i am and i find where it is "going" very interesting . . . it just isn't practical for my needs. i still game on a 19" CRT at 11x8 [max] @ 85hz . . . there isn't a single game out that will even strain a top current card at these resolutions.

However . . . ati's solution is very interesting and i want to see thorough performance reviews before i decide if i like it.

As to being "too late" . . . that also depends . . . this solution will extend to r520 . . . . i think ati is just "testing" and waiting to see what nVidia puts on the table . . .

. . . even IF i had the money, i would NOT be an early adopter. IF it is successful, expect r520 AMR cards and these x800s to be discounted rather sharply and rapidly. ;)

"who will pay?". . . fanATIcs with lots of spendable money, of course.
:roll:

edit: i am predicting/guessing that AMR'd XT's will slightly edge out SLi'd Ultras or ATi needn't have bothered . . . then some ati fans will JUMP on them to beat the nVidia fans and then nVidia will release the G70 7800Ultra which will be immediately be sli'd to proclaim nVidia king at which time ati will respond with the AMr'd RxXT-PE and will have crappy OGl support and ati fans will bitch about nVidia's IQ and nv fans will accuse ati of cheating and . . .
:confused:

i am getting me a G-D xbox :p
:disgust:

goodnight!
:thumbsup:

:D
 

Gstanfor

Banned
Oct 19, 1999
Originally posted by: bunnyfubbles
yeah a mix of multi and super sampling...sounds like delicious eye candy for your older games. Although who knows, it might end up making those older games looking TOO smooth and TOO sharp that they just don't look right :p

However I wouldn't buy it just for that, doesn't sound like it'll be friendly for newer games. Maybe Crossfired 520's...

This is old news for nV40 owners. You can use nHancer or rivatuner to force 16x AA (2x2 SS + 4x MS).

It doesn't make things look too smooth or sharp, just beautiful IQ.

It's about time ATi FINALLY gave their users the option to use SSAA (R300 launched in August 2002, and Mac versions of R300 & higher have always had SSAA enabled); shame you have to purchase their SLI to do it though.

 

Sylvanas

Diamond Member
Jan 20, 2004
Originally posted by: Gstanfor
Originally posted by: bunnyfubbles
yeah a mix of multi and super sampling...sounds like delicious eye candy for your older games. Although who knows, it might end up making those older games looking TOO smooth and TOO sharp that they just don't look right :p

However I wouldn't buy it just for that, doesn't sound like it'll be friendly for newer games. Maybe Crossfired 520's...

This is old news for nV40 owners. You can use nHancer or rivatuner to force 16x AA (2x2 SS + 4x MS).

It doesn't make things look too smooth or sharp, just beautiful IQ.

It's about time ATi FINALLY (R300 launched in august 2002 and mac versions or R300 & higher have always had SSAA enabled) gave their users the option to use SSAA, shame you have purchase their SLI to do it though.

They (Ati/Nv) did not implement it earlier because no cards could take the performance hit. The point of this is that you need not download and install a 3rd-party program to force SSAA. If ATI has had SSAA for so long, I'm sure it would be possible for someone to make a program that forces Radeons to use it; and officially, Nvidia does not condone the use of SSAA, otherwise it would be in their drivers by default.
 

Gstanfor

Banned
Oct 19, 1999
SSAA is in nVidia's drivers by default - the 8x anti-aliasing mode uses 2x MSAA plus 2x2 SSAA. And the 16x option has always been in the drivers, just never publicly exposed, just like coolbits isn't publicly exposed; those who know how to activate such things find them quite usable, however. It's fairly obvious you are not running an nVidia graphics card, or you would not be asserting that 16X AA is unusable for single cards...

On the topic of forcing ATi cards to use SSAA, I believe Demirug, who posts at beyond3d, is working on just such a utility. How far he has progressed I don't know, but he occasionally comments on SSAA as it concerns R300 and above in the forums.
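The sample arithmetic behind the hybrid modes described in this post is simple multiplication. A minimal sketch, using the mode compositions named in the thread (the function name is illustrative, not any driver API):

```python
# Effective sample count of a hybrid AA mode = MSAA samples x SSAA grid size.
# Compositions below are the ones named in this thread; treat them as illustrative.
def hybrid_aa_samples(msaa: int, ssaa_x: int, ssaa_y: int) -> int:
    """Total color samples per final pixel for MSAA layered on an SSAA grid."""
    return msaa * ssaa_x * ssaa_y

# The "8x" mode: 2x MSAA on a 2x2 supersampled frame -> 8 samples per pixel.
assert hybrid_aa_samples(2, 2, 2) == 8
# The forced "16x" mode: 4x MSAA on a 2x2 supersampled frame -> 16 samples.
assert hybrid_aa_samples(4, 2, 2) == 16
# Note the SSAA part multiplies fill-rate cost too: a 2x2 grid renders 4x the pixels,
# which is why these modes hit performance so much harder than plain MSAA.
```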
 

BFG10K

Lifer
Aug 14, 2000
It's fairly obvious you are not running a nVidia graphics card or you would not be asserting that 16X AA is unusable for single cards...
8xAA is unusable on a single card except for very old games like GLQuake, Quake 2 and Half-Life.

For all intents and purposes you need an nVidia SLI setup (a pair of 6800GTs as a minimum) to use 8xAA at any reasonable speed.

And if 16xAA is half the speed of 8xAA, then a pair of 6800Us will run it like one 6800U runs 8xAA, which again is limited to very old games.
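The scaling argument in this post can be made explicit with rough throughput arithmetic. This is an idealized sketch: it assumes 16xAA costs exactly twice 8xAA and that SLI doubles throughput, both simplifications, with a hypothetical base frame rate.

```python
# Idealized frame-rate arithmetic for the "two cards at 16x = one card at 8x" claim.
def fps(base_fps: float, cards: int, aa_cost: float) -> float:
    """Estimated FPS given per-card base speed, card count, and relative AA cost.
    Assumes perfect SLI scaling -- an optimistic simplification."""
    return base_fps * cards / aa_cost

base = 60.0  # hypothetical one-card FPS at 8xAA (cost factor 1.0 = the reference)
one_card_8x = fps(base, cards=1, aa_cost=1.0)
two_cards_16x = fps(base, cards=2, aa_cost=2.0)  # 16xAA at twice the cost, two cards

# Doubling both the hardware and the AA cost lands you right back where you started,
# so 16xAA on SLI inherits whatever game-age limits 8xAA has on a single card.
assert one_card_8x == two_cards_16x
```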
 

Gstanfor

Banned
Oct 19, 1999
Just as a quick and trivial example, you can achieve 60 FPS in Mafia @ 1600x1200, 16x AA, 16x AF, HiQ, on the 76.45 XG drivers.

Naturally, older titles yield better results, but ATi's Crossfire is no different here; BFG10K is exaggerating the required age.

Other games I play at similar/identical settings include NOLF1/2, SOF2, Ghost Recon, MOHAA, Gothic 1/2, Vampire Redemption, Hitman 2, GTA3, Wolfenstein and more.
 

BFG10K

Lifer
Aug 14, 2000
Sure, if you use a middling resolution like 1600x1200 you can get away with using it.

But why run at 1600x1200 when it's better to go higher? At 1920x1440 you can forget about using 8xAA unless you're playing very old games.
 

Gstanfor

Banned
Oct 19, 1999
While my monitor is good, very good, and 1920x1440 is a supported resolution (only @60Hz though; at that point I'm hitting the limits of the tube and the available monitor bandwidth), 1600x1200 is the most sensible resolution for my setup to play at. Current graphics cards are not that comfortable above 1600x1200 unless you want SLI and silly prices.

Personally, I'll wait until graphics cards are capable of handling 2048x1536 with the same sort of speed as they currently handle 1600x1200, then get a nice 21" CRT capable of comfortably handling that resolution. At the moment it's too much money for too little gain.
 

TantrumusMaximus

Senior member
Dec 27, 2004
Originally posted by: Rollo
Originally posted by: Greenman
Originally posted by: Rollo
Where I think the niche for this will be is the X800XL AMR.

Who will pay $570 for a X850 master and $400+ for an X850 when they can get two G70s for a little more?

So around a thousand bucks for video. Let me say that again: One Thousand Dollars for video. I can see three reasons for spending that much:
1. You use it to make a living, and more speed means less hours on a project.
2. You make a lot of money without working very hard, so you skip those new dolphin skin boots with the Bald Eagle feathers so you can afford the cards.
3. You're a pale friendless virgin with an empty life and nothing else to do, so you buy new video cards instead of having your teeth fixed.

You're apparently not a boater/skier/traveller/auto hobbyist/golfer/cyclist etc..
There are many ways to spend this (or much more) on your hobby.
<thinks about $1000 just spent on trolling motor/two batteries for fishing boat this month>

Hey Rollo, don't even bother trying to get people to understand why people spend $ on a dual vid card solution....some will never get it. Wasting yer time.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: BFG10K
Sure, if you use a middling resolution like 1600x1200 you can get away with using it.

But why run at 1600x1200 when it's better to go higher? At 1920x1440 you can forget about using 8xAA unless you're playing very old games.

So says the man whose card can't even run some games at 16X12 4X8X......

BTW- my point here is that 16X12 isn't a "middling resolution", because for most CRTs it's as high as you can go with a comfortable refresh rate, and I don't know of any LCDs with a higher default, so there must not be many.

 

nRollo

Banned
Jan 11, 2002
Originally posted by: Sylvanas
Originally posted by: Rollo
http://www.anandtech.com/video/showdoc.aspx?i=2432

http://www.beyond3d.com/reviews/ati/mvp/

http://www.hardocp.com/article.html?art=Nzc4

http://www.pcper.com/article.php?aid=146

As I see it:

Pros:
Compatibility
Ability to use existing X800 card if you have one
Perhaps performance
Super AA

Cons:
Too late- mid 2005 for a multichip SM2 set?!?!?
Compatibility- the version it will use on old games is tiling, which offers least benefit. On top of that, you don't really need SLI for old games.
G70 SLI will crush this before it gets to market.

To me it seems like "too little, too late". Time will tell.

Lol, says the Nvidia fanboy. Well, a G70 may beat this in benchmarks, but you're comparing two separate generations; I could say the R520 will 'crush' that. We will wait and see.

Fact: It is a better feature set than Nvidia's

The point is "too little, too late". As G70s are supposed to be in stores before this, and R520s are MIA, to me it seems anti-climactic.

So it's "nVidia is crushing ATI now, when this comes out nVidia will still be crushing ATI" for top performance.

Why would anyone pay $1000 for two SM2 ATI cards with a total of 32 pipes, which they have to run on a motherboard Anand has called "buggy"*, when they could buy two second-gen SM3 cards with 48 pipes on comparatively mature motherboards for $1100 MSRP? I've got news for you: if you're spending $1000 on video cards, you want it to be the fastest and to last a while.

Going to gamble your $1000 that SM3, EXR HDR, and soft shadows won't matter any more than they already do over the life of this set, Sylvanas?

I personally don't have lots of spare $1000s to gamble, and I would be fairly pissed if I did and more SM3/SM1.1-only games came out like SC:CT.

*http://www.anandtech.com/video/showdoc.aspx?i=2432&p=8
While it's hardly talked about outside of Taiwan, ATI's South Bridge is quite buggy. The chip that is responsible for providing the motherboard's SATA and USB ports, as well as PCI slots, is nowhere near final, and many manufacturers are skeptical of ATI's ability to finish their own South Bridge in time. Note that ATI's own South Bridge does not support SATA-II or NCQ, regardless of actual bugs with the chip.
Not being able to run WS on some games at first, or Windows 2000 may well seem like an impossible paradise for these guys. I like how ATI contracted another firm to do the southbridge for these demos but expects their OEMs to use their inferior ones as well!
However, ATI is pushing most of their partners to use ATI's own South Bridge despite its problems and is convinced that the problems will be sorted out in time. So a number of manufacturers at Computex are showing off CrossFire solutions with ATI's South Bridge, despite their complaints to us about the South Bridge.



 

biostud

Lifer
Feb 27, 2003
Even though SLI and Crossfire sound sweet, and had I unlimited funds I would go for one, most likely I won't.

More GPU power is of course always good, but personally I doubt I will notice a visual improvement beyond 4xAA that could be considered worth the money.
Also, the need for a very powerful PSU (although just a one-time payment) will add to the cost, as will the cost of electricity.

However, if Crossfire actually will let you run an R520 256mb SM3.0 card with an R6xx 512mb SM4.0? card and not be 'limited' by the R520's lack of functions, it sounds a lot better than SLI, IMHO. But it sounds too much like a modern fairytale.