X1800XT Crossfire vs 7800GTX SLI (updated with price comparison)

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: keysplayr2003
Originally posted by: Paratus
You know the other thing that's going to irk some NV folks is when the Sapphire PE Edition hits the market at 700/1600 and brings in 3DM05 scores of ~9900, beating the 512MB GTX. They are going to claim the "fastest" gaming card (even though you can't play 3DM05) :)

Yes, I have noticed that ATI cards usually score higher in 3DMarkXX than equivalent NV cards, yet actual game benches usually tell a different story. ATI tunes for 3DMark more than NV does, I believe. (I know they both do.)


I don't know about that. The 512MB GTX scores higher than a standard-clocked XT, the XT scores higher than a 256MB GTX, and the 7800GT scores higher than an XL, which is pretty much what all games show too.

Not that I care about 3DMark, but this overclock for an X1800XT is pretty good. This is with the PE BIOS flash, and the core on a water block: http://xs60.xs.to/pics/05516/11067.jpg
 

Paratus

Lifer
Jun 4, 2004
17,561
15,676
146
Originally posted by: Rollo
Originally posted by: Paratus
You know the other thing that's going to irk some NV folks is when the Sapphire PE Edition hits the market at 700/1600 and brings in 3DM05 scores of ~9900, beating the 512MB GTX. They are going to claim the "fastest" gaming card (even though you can't play 3DM05) :)


Why would nV folks care if ATI puts out a card that is twice as fast, let alone a little faster?

If people are smart, they'll buy the card that serves their needs the best, and leave "allegiance" to video card companies to those who are VERY immature.
:confused:

Speaking as someone you're probably considering an "NV folk", I'll buy whatever suits my needs best next time I have the "upgrade itch", or whatever looks interesting at the time.

I hope ATI puts out a card that's 10X faster than anything else, and costs $5.00.

Wrapping my wife's gifts. Be back later to answer.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I'm pretty sure. The 7800GTX scores around 7800 in '05. The XT gets about 9200. Pretty big difference, yet the 7800GTX and XT trade blows in games. Then there is the GTX512, which does not score much higher (about 9500) than the XT in '05, yet the in-game performance delta is pretty large.
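Taking the rough scores above at face value, the gaps are easy to put in percentage terms. A quick sketch (these are the approximate numbers quoted in the post, not fresh benchmarks):

```python
# Approximate 3DMark05 scores quoted above
scores = {"7800GTX 256MB": 7800, "X1800XT": 9200, "7800GTX 512MB": 9500}

baseline = scores["7800GTX 256MB"]
for card, score in scores.items():
    print(f"{card}: {score} ({(score / baseline - 1) * 100:+.0f}% vs 256MB GTX)")
# The X1800XT comes out ~18% ahead in '05, while the GTX512 is only ~3%
# ahead of the XT - much smaller than the in-game delta described above.
```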
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Paratus
Wrapping my wife's gifts. Be back later to answer.

:laugh:

Going to dad's house for Christmas. Will be back later to read "OMG Rollo, you know this would anger a fanboy like you who would never buy an ATI card!"

<conveniently forgetting that 2005 was the first year in over a decade I didn't, and it's not my fault they launched the damn thing 10 months into the year after I'd already bought eight nVidia cards>
 

Alexstarfire

Senior member
Jul 25, 2004
385
1
76
I find it kind of funny that Crossfire did so poorly in Black and White 2. I mean, it was ATI that was dissing nVidia about how not every game supports SLI, while their (ATI's) method would work on every game available, new and old. Seems ATI lied again, temporarily at least. Well, since this is well over a year late, with ATI still having availability problems on launch day, there is no excuse for it not to be as good as, or better than, SLI. The fact that Black and White 2 messed up that badly with Crossfire shows that ATI is having problems it can't deal with.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: keysplayr2003
I'm pretty sure. The 7800GTX scores around 7800 in '05. The XT gets about 9200. Pretty big difference, yet the 7800GTX and XT trade blows in games. Then there is the GTX512, which does not score much higher (about 9500) than the XT in '05, yet the in-game performance delta is pretty large.
Not quite - the xt has improved substantially with the latest drivers, and in many cases is far ahead of the 256gtx - just look at the latest xbitlabs benches. Also, while the 512gtx is ahead in most games, the xt is a lot closer to the 512gtx than to the 256gtx in performance when the settings are cranked up. But Ati cards do have an advantage in 3dmark due to its heavy dependence on geometry processing. Both the x1800 and the gtx have 8 vertex pipes, but the xt is clocked much higher, and thus it can process the geometry faster, which gives it an advantage in 3dmark. In general, however, cards that score higher in 3dmark also score higher in games, so the benchmark is still somewhat accurate.
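To put rough numbers on the geometry point (a sketch; the 8 vertex pipes each are per the post above, and the core clocks are the usual reference values - treat them as assumptions):

```python
# Peak vertex throughput scales with vertex pipes x core clock.
cards = {
    "X1800XT":       {"vertex_pipes": 8, "core_mhz": 625},  # assumed reference clock
    "7800GTX 256MB": {"vertex_pipes": 8, "core_mhz": 430},  # assumed reference clock
}

rate = {name: c["vertex_pipes"] * c["core_mhz"] for name, c in cards.items()}
advantage = rate["X1800XT"] / rate["7800GTX 256MB"] - 1
print(f"XT vertex-rate advantage: {advantage * 100:.0f}%")  # ~45%
```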
 

chilled

Senior member
Jun 2, 2002
709
0
0
Originally posted by: munky
In general, however, cards that score higher in 3dmark also score higher in games, so the benchmark is still somewhat accurate.

That has to be one of the most obvious statements I've ever read. If that didn't hold, then 3dmark would be useless for measuring gaming performance. Although it bears no real-world resemblance to actual games, it follows that more powerful graphics cards will score better in both games and 3dmark, by varying amounts.

TBH, I don't use it for any more than benchmarking performance over the months after I've installed XP and 'feel' it's running slow.
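One way to read "somewhat accurate": the two measures tend to agree on ordering more than on magnitude. A sketch with purely made-up, illustrative numbers (none of these are real benchmark results):

```python
from statistics import correlation  # Python 3.10+

# Hypothetical, illustrative numbers only
mark05   = [7200, 7500, 7800, 9200, 9500]   # 3DMark05 scores
game_fps = [55, 53, 62, 63, 74]             # average fps in some game

# High correlation despite individual rank swaps: a useful rough guide,
# not a substitute for actual game benchmarks.
print(f"{correlation(mark05, game_fps):.2f}")  # ~0.87
```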

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: munky
Not quite - the xt has improved substantially with the latest drivers, and in many cases is far ahead of the 256gtx - just look at the latest xbitlabs benches.

Four things to consider when reading XBit benches:

1. XBit only tests at 16X AF. There's nothing wrong with this, but nVidia cards do take a higher hit from AF than ATI cards from what I've seen, and the 8X AF other sites use will bring the two closer. (All I'm saying here is that not everyone plays at 4X/16X in all games.)

2. XBit uses reference-clocked 430/600 7800GTXs, which cost less than X1800XTs, rather than factory-OC'd 490/1300 cards such as the Leadtek, XFX, and Asus, which generally cost the same as X1800XTs and perform about 10% better in some games (see the rough arithmetic after this list).

3. The reference 256MB 7800GTXs being compared came out five months before the X1800XT, and aren't really its competition.

4. The 7800GTXs are a FAR better choice for multi-GPU: more established/proven motherboards, better compatibility, more flexibility of settings, less money, no hideous and hard-to-work-with monster cables in back, less noise/heat/power, no checking every boot to see if it's enabled, etc.

The X1800XTs are very nice cards, but there are things to consider that are not noted in Munky's post.
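On point 2, the clock gap puts a rough ceiling on what a factory OC can add. A sketch (reading the quoted "1300" effective memory clock as 650 real is an assumption about the DDR naming convention):

```python
# Reference vs factory-OC 7800GTX clocks quoted above (MHz)
ref_core, ref_mem = 430, 600
oc_core,  oc_mem  = 490, 650   # "490/1300" with DDR counted once

print(f"core: +{(oc_core / ref_core - 1) * 100:.0f}%")  # ~14%
print(f"mem:  +{(oc_mem / ref_mem - 1) * 100:.0f}%")    # ~8%
# Games rarely scale linearly with clocks, so "about 10% in some games" fits.
```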
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: Rollo
4. The 7800GTXs are a FAR better choice for multi-GPU: more established/proven motherboards, better compatibility, more flexibility of settings, less money, no hideous and hard-to-work-with monster cables in back, less noise/heat/power, no checking every boot to see if it's enabled, etc.

I agree 98%, except for the bit about checking on every boot... Anyways, I would really wait for the R580 + new Crossfire motherboards.
 

mwmorph

Diamond Member
Dec 27, 2004
8,877
1
81
Originally posted by: Rollo
<snipped - munky quote and Rollo's four XBit points, quoted in full above>

1. Well, to be fair, if you are going to spend $1k on graphics, you might as well go 16x AF. I mean, you have the power to take 16x even if it's a slightly bigger hit, so why not anyway.

2. Valid argument

3. But then what is the direct competition? The 7800GTX 512s aren't technically the X1800XT's direct competition either, being in such short supply and with the high prices and all. Does anyone do eVGA 7800GTX KO SLI benchmarks? That seems like a fair comparison.

4. Not sure what you mean by better compatibility, I always thought the settings were similar. I mean, there's half-screen split, every other frame, and ATI has the horrid tiles thing. Then there's 16xAA and 14xAA and transparency/adaptive AA. Not really much of a difference. The cables you will never have to bother with or see after install, so I don't understand why everyone hates them. Heat and power requirements are valid points, and so is maturity. What is this checking every boot? I'm not all that familiar with Xfire, but what's this reboot check? Does it turn itself off every startup?
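For context, the three load-balancing schemes named in point 4 divide a frame roughly like this. A toy sketch, not vendor code; the tile size and frame height are made-up values:

```python
def scissor(frame_h, gpus=2):
    """Split-frame rendering: each GPU renders a horizontal band."""
    band = frame_h // gpus
    return {g: (g * band, (g + 1) * band) for g in range(gpus)}

def afr(frame_idx, gpus=2):
    """Alternate-frame rendering: whole frames alternate between GPUs."""
    return frame_idx % gpus

def supertile(x, y, tile=32, gpus=2):
    """Supertiling (ATI's 'tiles'): a checkerboard of small tiles."""
    return ((x // tile) + (y // tile)) % gpus

print(scissor(1200))        # {0: (0, 600), 1: (600, 1200)}
print(afr(7))               # frame 7 -> GPU 1
print(supertile(100, 40))   # the tile holding pixel (100, 40) -> GPU 0
```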

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: tuteja1986
Originally posted by: Rollo
4. The 7800GTXs are a FAR better choice for multi-GPU: more established/proven motherboards, better compatibility, more flexibility of settings, less money, no hideous and hard-to-work-with monster cables in back, less noise/heat/power, no checking every boot to see if it's enabled, etc.

I agree 98%, except for the bit about checking on every boot... Anyways, I would really wait for the R580 + new Crossfire motherboards.

Man, I could swear I saw something that said ATI recommends checking to see if it's active before playing; I'll look later. Time to take my son ice fishing. :)
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: mwmorph
<snipped - nested Rollo/munky quote, shown in full above>

1. Well, to be fair, if you are going to spend $1k on graphics, you might as well go 16x AF. I mean, you have the power to take 16x even if it's a slightly bigger hit, so why not anyway.

2. Valid argument

3. But then what is the direct competition? The 7800GTX 512s aren't technically the X1800XT's direct competition either, being in such short supply and with the high prices and all. Does anyone do eVGA 7800GTX KO SLI benchmarks? That seems like a fair comparison.

4. Not sure what you mean by better compatibility, I always thought the settings were similar. I mean, there's half-screen split, every other frame, and ATI has the horrid tiles thing. Then there's 16xAA and 14xAA and transparency/adaptive AA. Not really much of a difference. The cables you will never have to bother with or see after install, so I don't understand why everyone hates them. Heat and power requirements are valid points, and so is maturity. What is this checking every boot? I'm not all that familiar with Xfire, but what's this reboot check? Does it turn itself off every startup?


1. I agree, using the highest AF is a good thing. After all, who is going to spend $1000+ and not try to play at the best settings? Some people like to brag about gaming at resolutions above 1600x1200, but then don't like 16xAF results being used?

2. I don't have a problem with a 430/1200 card being used, but I would like a higher-clocked card tested as well. That benefits readers more: they can determine whether the extra cost of an overclocked card is worth the additional fps, or not.

3. I too agree that saying the 256MB GTX is not the XT's competition is silly. They have the same MSRP. The XT was supposed to be out at about the same time as the 256MB GTX. The 512MB GTX costs at least $250 more than the XT. And the 512MB GTX is not available. How anyone can say that the 512MB GTX is the XT's competition, and not the 256MB GTX, is beyond me.

4. I do agree that right now, SLI is the easier/better multi-GPU setup. I'm going by reviews; I have not had a chance to play with Crossfire. However, making such a big deal about the dongle is silly to me. You plug it in, and go. There is no hassle to it, and you don't see it afterwards.

Saying that the GTXs have less heat, cost, and power, right after saying that 256MB GTXs are not the competition for the XT, is silly. The 512MB GTXs keep much more air inside the case, and the fan on the 512MBs does not allow for good cooling between the cards. The XTs get air from the front of the case, or the back of the card, so it doesn't matter how close the two cards are to each other. The bottom GTX is going to be stifled by the top GTX, as it gets its air from the top of the heatsink. Not to mention that 512MB GTXs will be a lot more expensive than XTs, and the XTs use less power. So you see, saying that 256MB GTXs are not the competition for the XTs, but then using the 256MB GTXs' lower heat, power, and cost to argue that they are better, is just silly. Some people like to use the lower prices of the 430/1200 GTXs as an argument that they are cheaper, but then want the higher-clocked ones compared when talking about performance. You can't have your cake and eat it too, so to speak.

Not to mention SuperAA takes much less of a hit than SLI AA. It is actually usable, whereas SLI AA often is not. That is one case where Crossfire is clearly better than SLI.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: mwmorph
1. Well, to be fair, if you are going to spend $1k on graphics, you might as well go 16x AF. I mean, you have the power to take 16x even if it's a slightly bigger hit, so why not anyway.
For the most part I agree with you, but in games like FEAR or COD2 the 16XAF just might not be worth it. Also, in some online games a person might weigh resolution vs AF vs minimum fps.

2. Valid argument
Agreed.

3. But then what is the direct competition? The 7800GTX 512s aren't technically the X1800XT's direct competition either, being in such short supply and with the high prices and all. Does anyone do eVGA 7800GTX KO SLI benchmarks? That seems like a fair comparison.
I have no good answer to that one. I don't believe the RAM that was available late this year for X1800XTs was available early this year for 256 GTXs, which pretty much hit the street in MASS quantity, so you know they had been building them a while. Then there's the whole 512 vs 256 aspect. I don't think the X1800XT has a direct competitor, due to the 512GTX's price difference, so I suppose you're right: 490/1300 256GTXs are as close as we can come.

4. Not sure what you mean by better compatibility, I always thought the settings were similar.
http://www.anandtech.com/video/showdoc.aspx?i=2649&p=6
Black and White 2 is completely unplayable with CrossFire.
http://www.firingsquad.com/hardware/ati_radeon_x1800_xt_crossfire/page10.asp
Due to a bug in the CATALYST drivers, the X1800 cards don't begin to really shine in F.E.A.R. until CATALYST A.I. is disabled....Based on how disappointing our dual-card CrossFire results are though, we have a sneaky suspicion that the driver is holding us back. Hopefully ATI will get all this sorted out...
http://techreport.com/reviews/2005q4/radeon-x1800-crossfire/index.x?pg=6
Unfortunately, we decided not to test with high-dynamic-range lighting on this game because it didn't appear to work correctly on Radeon X1000-series cards. Perhaps a future patch or driver update will resolve the problem.
etc.

I mean, there's half-screen split, every other frame, and ATI has the horrid tiles thing.
Well the "horrid tiles" is a default setting you can't change, AFAIK. With SLI, you use a drop down menu to switch between SLI modes to see which works best, in Crossfire, you can't switch them at all, AFAIK.

The cables you will never have to bother with or see after install, so I don't understand why everyone hates them.
This is just aesthetics for the most part.

Heat and power requirements are valid points, and so is maturity.
Agreed.

What is this checking every boot? I'm not all that familiar with Xfire, but what's this reboot check? Does it turn itself off every startup?
This is the one thing I might be wrong on, but I could swear I read somewhere that ATI recommends verifying it's active before playing. It seems too far-fetched to be true, but I've got it in the back of my mind that Crossfire has had problems holding the setting.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I do agree SLI AA takes too much of a hit, but overall IQ is similar to that of 8xS, which takes a much lesser hit than SLI AA.
So with SLI, most people should be using 8xTR S. I don't know why NV came up with SLI AA, but I hope it gets better.

As of now, SLI is the better option for the consumer.
SLI is mature.
SLI is stable.
SLI gives the consumer a lot MORE options to choose from. There are many, many motherboards out there for SLI, and nearly all NV cards are SLI-capable.

Right now, I think there is a grand total of 2 Crossfire boards. After maybe 4 or so months, I think Crossfire will make sense. As of now, buying Crossfire instead of SLI isn't very logical.

That's my 2 cents for this thread.
 

SparkyJJO

Lifer
May 16, 2002
13,357
7
81
lol, I saw the title of the thread and thought it was another biased Rollo thread ;)

From the looks of the results, they appear about evenly matched. The nVidia is a bit cheaper, but when you get to the 512MB cards, ATI looks better price-wise - besides, you won't be waiting for ages for another 512MB 7800GTX... ;)

I'd say they're both about the same; go for what you like more. If I build a desktop again (got a laptop right now), it'll probably be the 512MB X1800XT due to price and availability; I don't feel like waiting forever for an overpriced 7800GTX (c'mon nVidia, get some more cards out there - people don't like being price-gouged! :roll: ). Though I don't think I'll be doing SLI/Crossfire, for money reasons.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Two reasons Crossfire will never see the inside of my case:

1. External dongle (WTF?)

2. Mastercards

Mastercards have to be the most annoying thing ever. nVidia definitely got it right with mixed-vendor support for SLI. Gives you soooo many upgrade options down the road if you choose to go SLI.
 

Alexstarfire

Senior member
Jul 25, 2004
385
1
76
Originally posted by: mwmorph
Then there's 16xAA and 14xAA and transparency/adaptive AA. Not really much of a difference.

That's a load of crap. I'll have to go find the article, but I know that the AA ATI implements with Crossfire cards is much different from the way nVidia does its SLI AA. It's something like one card is used only for the extra AA and the other card handles normal AA and the program. Might not be exactly it, but they said the performance hit was MUCH, MUCH larger than nVidia's 16x SLI AA. I don't think you'd really use either one that much, though, so it probably doesn't matter as much. I believe nVidia's looked a tad bit better, but that could've been the other way around, same with Adaptive vs Transparency AA, but I also believe that ATI's Adaptive AA performed better, so it's kind of a tradeoff there.

If someone can find the article talking about ATI's Crossfire AA and nVidia's SLI AA before I do, will you please post it? Same goes for the Adaptive vs Transparency AA.
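For what it's worth, the common description of both vendors' dual-GPU AA modes is that each card renders the same frame with a differently offset sample pattern and the two images are blended. A toy sketch of that final blend (simple averaging assumed; the real compositing hardware is more involved):

```python
import numpy as np

def blend_dual_gpu_aa(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Average two renders taken with offset AA sample patterns,
    roughly doubling the effective sample count."""
    mix = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return mix.astype(np.uint8)

# Two hypothetical 4xAA renders with different subpixel jitter -> ~8x result
a = np.random.randint(0, 256, (1200, 1600, 3), dtype=np.uint8)
b = np.random.randint(0, 256, (1200, 1600, 3), dtype=np.uint8)
out = blend_dual_gpu_aa(a, b)
```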
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
Originally posted by: Alexstarfire
Originally posted by: mwmorph
Then there's 16xAA and 14xAA and transparency/adaptive AA. Not really much of a difference.

That's a load of crap. I'll have to go find the article, but I know that the AA ATI implements with Crossfire cards is much different from the way nVidia does its SLI AA. It's something like one card is used only for the extra AA and the other card handles normal AA and the program. Might not be exactly it, but they said the performance hit was MUCH, MUCH larger than nVidia's 16x SLI AA. I don't think you'd really use either one that much, though, so it probably doesn't matter as much. I believe nVidia's looked a tad bit better, but that could've been the other way around, same with Adaptive vs Transparency AA, but I also believe that ATI's Adaptive AA performed better, so it's kind of a tradeoff there.

If someone can find the article talking about ATI's Crossfire AA and nVidia's SLI AA before I do, will you please post it? Same goes for the Adaptive vs Transparency AA.

Now THAT is a load of crap. Super AA is much, much slower than SLI AA?

Uh-huh.
http://www.techreport.com/reviews/2005q4/radeon-x1800-crossfire/index.x?pg=12

Uh-huh.
http://www.firingsquad.com/hardware/ati_radeon_x1800_xt_crossfire/page11.asp

Uh-huh.
http://www.firingsquad.com/hardware/ati_radeon_x1800_xt_crossfire/page12.asp
 

jrphoenix

Golden Member
Feb 29, 2004
1,295
2
81
Originally posted by: Matt2
Two reasons Crossfire will never see the inside of my case:

1. External dongle (WTF?)

2. Mastercards

Mastercards have to be the most annoying thing ever. nVidia definitely got it right with mixed-vendor support for SLI. Gives you soooo many upgrade options down the road if you choose to go SLI.

I don't think the dongle would be much of an annoyance for me. After plugging in my DVI cable on my new build (July 04), I only unplugged it to move from an apartment to a new home :p

I think ATI is still testing the waters, putting products out there to be competitive for review sites and early adopters. I honestly believe that in the near future ATI will switch to no master card in their Xfire setups. I have read a few review sites that have taken X1600s and Xfired them with no master card. It is just a matter of getting the product where it needs to be, which will take time.