
2 7800 GT's in SLI or 1 X1900XT?

Page 5
Originally posted by: beggerking
Originally posted by: Steelski
Originally posted by: munky
Well, I can sort of read French based on what I remember from high school, but I can also use Google Translate 😀. The French site linked a few posts above mentions that they used the maximum available AF in all tests, with or without AA. They also said that when the requested AF mode was not available from within the game, they forced 16x from the driver's control panel. Moreover, they disabled the filtering optimizations on NV cards to reduce the texture shimmering.

what a laugh.

why? care to explain?
or are you just expressing your ATI fanboism?

I laugh at the shimmering issue that could so easily be avoided by Nvidia.
 
Originally posted by: beggerking
Originally posted by: Steelski

please stop.

What do you mean? we have the right to discuss / question / investigate benchmarks, why stop?

Because there was evidence that the GT's were not as fast, yet it was not acknowledged. In all reviews you see with minimum frames, you will see that the X1900 does a particularly good job. End of.
 
Originally posted by: Wreckage
A dual slot cooler is not that much less heat, noise and space.


Are you arguing that 2 GT's in SLI aren't going to produce more heat than a single card solution?

Originally posted by: Wreckage
It's funny how ATI dongle boys were saying how much HDR sucked on the 6800's saying it was slow and on a limited number of games but now that the same is true for ATI, it's hotter than a monkey fart. :roll: :roll: :roll: :roll: :roll: :roll: :roll:

Well, back when that argument was made it was absolutely true; about this time last year, when such things were being said, there was almost no HDR in any games. Also, I bet a lot of the folks who bought 6800's way back then and were touting SM 3.0 and/or HDR support never took advantage of it and have since upgraded to 7800's. So in essence, at that particular point HDR was useless.

 
Originally posted by: Steelski
WWOOOOOOOOOOOOOWWW, you really think outside the box, apart from your pathetic 1 fps wins which will disappear in a month.
"It would have better numbers for a refresh..." It's by far the biggest ever leap for a refresh, especially when you consider it's at the same clocks and it will prove sustainable compared to the GTX 512.
When the GTX came out it was faster than LAST GEN CARDS!!!!!!!!!!!!! Big whoop.
Now that the X1900 cards are out they are faster than the GTX's, and are comparably as fast as 2x GT's even with unoptimized drivers. It does not take a genius to see that the one card is doing very well, all things considering.
You're like a moron that bought 2 6600's (at the same time) when he could have had 1 6800 GT/Ultra and run things as well and more stably.
I would say it is very close. Two GT's are about $570; an X1900XT is $540. I think either choice would be good, with a slight nod to the X1900XT. I really don't think there is a need to get all fanboyish like this, though. The X1900XT was beaten in most of the benchmarks, albeit not by much.
 
Originally posted by: ayabe
Originally posted by: Wreckage
A dual slot cooler is not that much less heat, noise and space.


Are you arguing that 2 GT's in SLI aren't going to produce more heat than a single card solution?

Originally posted by: Wreckage
It's funny how ATI dongle boys were saying how much HDR sucked on the 6800's saying it was slow and on a limited number of games but now that the same is true for ATI, it's hotter than a monkey fart. :roll: :roll: :roll: :roll: :roll: :roll: :roll:

Well, back when that argument was made it was absolutely true; about this time last year, when such things were being said, there was almost no HDR in any games. Also, I bet a lot of the folks who bought 6800's way back then and were touting SM 3.0 and/or HDR support never took advantage of it and have since upgraded to 7800's. So in essence, at that particular point HDR was useless.
It was used in 1 game back then. Now it's used in 4. I really wouldn't consider that a huge difference.
 
Originally posted by: Steelski
Originally posted by: keysplayr2003

No mention of minimum framerates? Because that is much more important (to me) than max framerates. 7800GT SLI vs. X1900XTX minimum/average framerates is what I would like to see.

please stop.

Please stop what? Asking questions?

EDIT: Steelski quote: "Because there was evidence that the GT's were not as fast, yet it was not acknowledged. In all reviews you see with minimum frames, you will see that the X1900 does a particularly good job. End of."

How does "particularly good job" address my question? "Particularly good job" does not tell me whether it was better or worse than SLI'd 7800GT's. Please link to reviews with minimum framerates. Thanks.

 
I've skipped the entire 5th page of the thread; it seemed like a lot of flaming, so forgive me for that ^^

I was just looking at the Anand review. I know it doesn't have a GT SLI in it, but still. I've heard F.E.A.R. is a shader-heavy game, and at 1920x1200 with 4xAA, a single 1900XT seems to do better than a 7800GTX 512 SLI. Imagine what a 7800 GT setup gets. I won't buy a card based on a single benchmark, but if shaders are the future this is a pretty good card, and in other benchmarks it performs well too. Comparing Anand's power numbers, I'd think power-wise it makes no difference; noise-wise, same.

I bought my 1900XT yesterday, but I've been thinking about VGA cards since before the X850XT PE was even out. I've also considered the GT's, and I decided against them because of sheer performance. I don't expect a single card to crush them, but that 1900XTX comes close in just about every benchmark (I can OC my XT 25 MHz, I'd think) and that's enough for me. Also the IQ's better. But most of all, if I want more performance in the future... I'll just buy another 1900XT (by that time, they'll be what? 300 bucks?) and upgrade my mobo. With 2 GT's, you can't do that; you'd have to go back to a single card of about 500-600 bucks to get more performance. Also, I'm still expecting the 1900XT to shape up a bit with better drivers, while the 7800GT's are pretty mature. Looking for performance now, yes, I would go for the 7800GT's in SLI. But my PC needs to last me for nearly 3 years, and in the future the 1900XT will definitely kill the SLI'd 7800GT's. Thus my choice.

Now I gotta wait a week before the thing arrives -_-
 
Originally posted by: obsidian

It was used in 1 game back then. Now it's used in 4. I really wouldn't consider that a huge difference.

I didn't say it was a huge difference, but having HDR support and SM 3.0 is certainly a LOT more relevant now than it was a year ago, which was my point.
 
Originally posted by: Wreckage

Craptastic! You had to go all the way to France to find support? Only a few ATI dongle boys question the Tech Report's review, while everyone else considers their work to be of quality.

When have I ever come off like a fanboy???

Like I said in my post, the French site was the only other site to throw the 7800GT's into the fray so that a direct comparison could be made. That's out of like 10-12 review sites I looked at. Should we just close our little eyes and pretend that there aren't other review sites out there? Are you suggesting that Americans are superior in their ability to conduct a product review? I guess I'm not understanding your animosity towards people posting results from other sites.
 
Originally posted by: Wreckage
...

HDR+AA that barely works on a few games? It's funny how ATI dongle boys were saying how much HDR sucked on the 6800's saying it was slow and on a limited number of games but now that the same is true for ATI, it's hotter than a monkey fart....

You are right. It's slow on 6800's (almost useless). The 7800's and X1xx's are a different story.


 
Originally posted by: Wreckage
Originally posted by: Ackmed
Huh? In the higher settings, the XTX is *faster* than 2x GT's. Who doesn't buy a high-end card and run it at high-end settings? Not many. They are basically on par, but some are calling 1 fps a win. Not to mention (which you ignored) that you can flip the argument around and claim the GT's run faster than the "new" 512MB GTX for almost half the cost.

The X1900's perform substantially faster than the GTX's in a few forward-thinking games, such as F.E.A.R., and are generally faster overall in other games. Why pin the whole argument on which is faster? How about what one card can do that the others cannot? Such as HDR+AA in a few games (more to come this year), better AF, Avivo, less power, less noise, less heat, etc.

How do you figure the XTX runs at a lower detail level than NV cards? It doesn't if AF is selected properly.

A dual slot cooler is not that much less heat, noise and space. Several sites that ran IQ comparisons between ATI and NVIDIA using HL2 showed with screenshots that ATI was missing details like plants, branches, powerlines, etc. It has been proven that for some reason HL2 runs at a lower detail level on ATI cards. The dual GT's can also do 16xAA, something missing from the XTX. AVIVO is no better than PureVideo, which has been out for 2 years. On the Tech Report site the dual GT's were still ahead of the XTX even at higher settings.

HDR+AA that barely works on a few games? It's funny how ATI dongle boys were saying how much HDR sucked on the 6800's saying it was slow and on a limited number of games but now that the same is true for ATI, it's hotter than a monkey fart. :roll: :roll: :roll: :roll: :roll: :roll: :roll:


I know it's not that much less, but it does exhaust the heat outside. I was just pointing out that there are pluses to a single XTX over 2x GT's; it's not all about performance. The problem with that is, the reviewers selected the AF through the CP, not the game. When selected through the game (how it should always be done), it renders it all correctly. Application AF ensures that only the textures that need AF get AF, and fixes this "problem" in HL2. It is true that 2x GT's can do 16x SLI AA. However, it's not going to be playable in any sort of new game. But it is a plus. Avivo has been reviewed a lot, and has been deemed better than PureVideo by all that I have seen.

You did well until the second part of your post; then the name calling and general mocking started for some reason. If you can find where I once said that HDR sucked, feel free to quote me, as I didn't. HDR+AA barely works in a few games? Yeah, it's only a few games for now. It works; not sure why you say barely. I didn't say HDR sucked in the 6800 days, I said it wasn't playable and not worth the loss of AA. Now it is much more playable, and you don't lose AA with ATI's newest cards. You can once again turn that argument around: HDR used to be the hottest item of discussion for NV fans. Now that the roles have reversed, it's not. It's not a big issue to me, as I don't generally like the way HDR is in Far Cry. I like it coming out of the dark into the light, but I don't like how the sand shines. It's too "Hollywood" for me. Which is what I said way back then. It doesn't do much for me in SS2 either, going by the screenshots. I haven't had a chance to use HDR+AA in SS2 yet. There are more games coming out, though, that will follow the same trend. NV just can't do it, and that's going to be a negative for some people. Same as the lower quality AF is.

As I said, both setups have ups and downs. If you're going to keep the setup for a long time (1 or 2 years), the XTX looks to be the better choice to me, easily: it is more forward-thinking, has better AF, can do HDR+AA in games they can't, has more RAM, and gives you the ability to add another down the road if you want to. The GT's are already maxed out. I don't think anyone can tell someone who gets either that they were "wrong" in doing so. It just all comes down to what you prefer. To me at least.
 
If you've got a GT, SLI 'em; if not, get a 1900. Simple. Don't understand how people can be so mental over which vid card they use; it's not football or war.
 
HDR+AA is a plus, but is it going to be playable in these future games?

The HDR+AA benches I have seen so far are not astounding, so who's to say that HDR+AA will be playable in future games on an X1900XT/X?
 
Originally posted by: Steelski
Originally posted by: beggerking
Originally posted by: Steelski

please stop.

What do you mean? we have the right to discuss / question / investigate benchmarks, why stop?

Because there was evidence that the GT's were not as fast, yet it was not acknowledged. In all reviews you see with minimum frames, you will see that the X1900 does a particularly good job. End of.

Min framerate is a part of benchmarking, but it does not by itself constitute being "better". It is the sum of the gaming experience/benchmarks that constitutes a video card's power.
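The point above, that minimum framerate tells a different story than the average, can be sketched with a toy example (the frame-time numbers and the `fps_stats` helper are made up for illustration, not taken from any review):

```python
# Toy sketch: two hypothetical cards with similar-looking averages,
# where only the minimum fps reveals the stutter.
def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) from per-frame render times in ms."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Card A: steady ~60 fps every frame.
card_a = [16.7] * 10
# Card B: mostly fast frames, plus one 100 ms hitch.
card_b = [12.0] * 9 + [100.0]

avg_a, min_a = fps_stats(card_a)
avg_b, min_b = fps_stats(card_b)
print(round(avg_a), round(min_a))   # prints 60 60
print(round(avg_b), round(min_b))   # prints 76 10
```

Card B "wins" on average fps yet drops to 10 fps at its worst frame, which is exactly why posters here keep asking reviewers for minimum framerates alongside averages.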

 
Originally posted by: Ackmed
I know it's not that much less, but it does exhaust the heat outside. I was just pointing out that there are pluses to a single XTX over 2x GT's; it's not all about performance. The problem with that is, the reviewers selected the AF through the CP, not the game. When selected through the game (how it should always be done), it renders it all correctly. Application AF ensures that only the textures that need AF get AF, and fixes this "problem" in HL2. It is true that 2x GT's can do 16x SLI AA. However, it's not going to be playable in any sort of new game. But it is a plus. Avivo has been reviewed a lot, and has been deemed better than PureVideo by all that I have seen.

You did well until the second part of your post; then the name calling and general mocking started for some reason. If you can find where I once said that HDR sucked, feel free to quote me, as I didn't. HDR+AA barely works in a few games? Yeah, it's only a few games for now. It works; not sure why you say barely. I didn't say HDR sucked in the 6800 days, I said it wasn't playable and not worth the loss of AA. Now it is much more playable, and you don't lose AA with ATI's newest cards. You can once again turn that argument around: HDR used to be the hottest item of discussion for NV fans. Now that the roles have reversed, it's not. It's not a big issue to me, as I don't generally like the way HDR is in Far Cry. I like it coming out of the dark into the light, but I don't like how the sand shines. It's too "Hollywood" for me. Which is what I said way back then. It doesn't do much for me in SS2 either, going by the screenshots. I haven't had a chance to use HDR+AA in SS2 yet. There are more games coming out, though, that will follow the same trend. NV just can't do it, and that's going to be a negative for some people. Same as the lower quality AF is.

As I said, both setups have ups and downs. If you're going to keep the setup for a long time (1 or 2 years), the XTX looks to be the better choice to me, easily: it is more forward-thinking, has better AF, can do HDR+AA in games they can't, has more RAM, and gives you the ability to add another down the road if you want to. The GT's are already maxed out. I don't think anyone can tell someone who gets either that they were "wrong" in doing so. It just all comes down to what you prefer. To me at least.

Hmmm. Actually a rather calm and well thought out reply. I agree that both solutions will satisfy just about anybody. My "name calling" was not directed at any individual in particular. However when HDR+AA was announced there were no games that supported it yet many people who had been against HDR suddenly supported it simply because "Red" now supported it. Seemed a bit hypocritical to me.

It's also worth noting that we are talking about FP16 HDR, while there are other kinds of HDR that both teams can support (even with AA).

In the end we are watching the last gasp out of the DirectX 9 series cards. Its death may come as soon as this summer.


 
Originally posted by: Steelski
Originally posted by: beggerking
Originally posted by: Steelski
Originally posted by: munky
Well, I can sort of read French based on what I remember from high school, but I can also use Google Translate 😀. The French site linked a few posts above mentions that they used the maximum available AF in all tests, with or without AA. They also said that when the requested AF mode was not available from within the game, they forced 16x from the driver's control panel. Moreover, they disabled the filtering optimizations on NV cards to reduce the texture shimmering.

what a laugh.

why? care to explain?
or are you just expressing your ATI fanboism?

I laugh at the shimmering issue that could so easily be avoided by Nvidia.


Okay, but still, his post explains the discrepancy in benchmarking on the French site.
 
There are several related HDR formats - FP16, FX16, FX10 - all of them basically the same except that some can represent a wider color range than others (the FX ones are integer-based, the FP ones are floats). The R5xx cards can apply AA to all of these, and the 7 series cannot apply AA to any of them. The only way to get HDR with AA working on the 7 series is if the devs implement custom HDR through the pixel shaders, like in Lost Coast.
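The range difference between those formats can be illustrated numerically. This is my own sketch, not from the thread: FP16 limits follow the IEEE half-float layout, while the fixed-point integer/fraction splits below (4.12 and 2.8) are assumed purely for illustration, since the actual hardware encodings vary:

```python
# Rough maximum representable values for the HDR buffer formats above.

# IEEE half precision (FP16): 1 sign, 5 exponent, 10 mantissa bits.
fp16_max = (2 - 2**-10) * 2**15      # largest finite half = 65504.0

# Hypothetical fixed-point layouts (splits are illustrative assumptions):
fx16_max = (2**16 - 1) / 2**12       # ~16.0 with an assumed 4.12 split
fx10_max = (2**10 - 1) / 2**8        # ~4.0 with an assumed 2.8 split

print(fp16_max, fx16_max, fx10_max)
```

For the same bit budget, the float format covers a vastly wider range of light intensities than the integer ones, but as the post notes, the AA question is orthogonal to range: R5xx can multisample all three, the 7 series none.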
 
Another review up comparing the two video solutions in question. I wish they would have stated whether they're using AA and AF but after comparing the fps with other review sites, it looks like they are using at least 4AA/8AF if not more.
 
Originally posted by: Steelski
Originally posted by: beggerking
Originally posted by: Steelski

WWOOOOOOOOOOOOOWWW, you really think outside the box, apart from your pathetic 1 fps wins which will disappear in a month.
"It would have better numbers for a refresh..." It's by far the biggest ever leap for a refresh, especially when you consider it's at the same clocks and it will prove sustainable compared to the GTX 512.
When the GTX came out it was faster than LAST GEN CARDS!!!!!!!!!!!!! Big whoop.
Now that the X1900 cards are out they are faster than the GTX's, and are comparably as fast as 2x GT's even with unoptimized drivers. It does not take a genius to see that the one card is doing very well, all things considering.
You're like a moron that bought 2 6600's (at the same time) when he could have had 1 6800 GT/Ultra and run things as well and more stably.

I didn't want to bother. But after reading several of your utterly biased, ridiculous posts and wasting my time, I've decided to refute every one of your ATI-fanboy claims.

"WWOOOOOOOOOOOOOWWW, you really think outside the box apart from your pathetic 1fps wins which will dissapear in a month. "

completely pointless. You don't know whether the driver will improve performance.

"When the GTX came out it was faster than LAST GEN CARDS!!!!!!!!!!!!! big whoop. "

why is this a "big whoop" when 1900xt is in the same situation? or are you trying to say "big whoop" 1900xt too?

"Your like a moron..."

YES, YOU ARE.

I must say you are persistent.

"why is this a "big whoop" when 1900xt is in the same situation? or are you trying to say "
Well, where do I start? The one point would be that the X1900 series is just a refresh; next gen is R600. But you wouldn't understand that.
Your "I know you are but what am I" comment is a bit crap as well.

How did you come to the conclusion that it is only a refresh? Did you base that on any solid ground, or is it only its performance that marks it as a refresh to you? I heard it was supposed to be a new generation until the benchmarks came out, so who is to say?

The only thing is, the 7800 isn't a "big whoop", and neither is the 1900XT. If the 7800 is one, the 1900XT might as well be one too.
 
Originally posted by: Steelski
Originally posted by: beggerking
Originally posted by: Steelski

please stop.

What do you mean? we have the right to discuss / question / investigate benchmarks, why stop?

Because there was evidence that the GT's were not as fast, yet it was not acknowledged. In all reviews you see with minimum frames, you will see that the X1900 does a particularly good job. End of.

How was it "not acknowledged"? You are not making sense; you don't have any solid ground for anything you say.

Or do you mean that even though GT SLI is faster now, we can't say it's faster until/if the 1900XT becomes faster?
 
Originally posted by: munky
There are several HDR formats which are related - FP16, FX16, FX10 - all of them are basically the same except that some can represent a wider color range then others (the FX ones are integer-based, the FP are floats). The r5xx cards can apply AA to all of these, and the 7 series can not apply AA to any of the above. The only way to get HDR with AA working on the 7 series is if the devs develop custom HDR through the pixel shaders like in Lost Coast.

Silence him! He is making nVidia look bad! Be gone!

On a serious note, who needs Jerry Springer when you have the video forum flame wars? I get enough fill for the day after reading about 10 posts.

Edit **

Oh yeah almost forgot, "My daddy can beat up your daddy!"

Edit 2 **

Go with the 1900! If I had the money, I would have purchased one. ATI made a fine card in my book. People are trying to take that away from them, but oh well, people always try and discredit where credit is due. Tis human nature.
 
Originally posted by: beggerking
Originally posted by: Steelski
Originally posted by: beggerking
Originally posted by: Steelski

please stop.

What do you mean? we have the right to discuss / question / investigate benchmarks, why stop?

Because there was evidence that the GT's were not as fast, yet it was not acknowledged. In all reviews you see with minimum frames, you will see that the X1900 does a particularly good job. End of.

How was it "not acknowledged"? You are not making sense; you don't have any solid ground for anything you say.

Or do you mean that even though GT SLI is faster now, we can't say it's faster until/if the 1900XT becomes faster?

Last post to you, coz I really seem to get you seeing red. If you can't understand what I say, that's your problem. In all fairness, two GT's is a good deal, but in light of the XT's performance I would not recommend two GT's instead of one XT. Do you understand?! On second thought, don't answer back.
Your trying to put a spin on things is annoying.
By unacknowledged I mean that there was evidence the two solutions are very similar in performance, with the French site showing the XT ahead, and then the argument got immediately switched to minimum frames.
Go to the thread that says 16 reviews of the XT in it and find one with minimum frames; I'm sure you will find a few that show very good numbers for the XT.
If you already have a GT then get another, but if you want that performance and the advantages of a single card then get the XT. Simple. Where you find the nerve to not even admit that this is the best card available for most games is beyond me.
I suggest you don't post back to me, because every argument has been explored for this XT-vs-GT's segment; we covered average frames, noise, heat, minimum frames, power requirements, IQ, French and a load of others. This is not an ATI vs Nvidia argument, because I would be happy with a 7-series card. It's about value for money; from where everyone else is sitting we can see that the XT has slightly more than two GT's at this moment in time. Top performance for low dollar.
 
Originally posted by: Steelski
Originally posted by: beggerking
Originally posted by: Steelski
Originally posted by: beggerking
Originally posted by: Steelski

please stop.

What do you mean? we have the right to discuss / question / investigate benchmarks, why stop?

Because there was evidence that the GT's were not as fast, yet it was not acknowledged. In all reviews you see with minimum frames, you will see that the X1900 does a particularly good job. End of.

How was it "not acknowledged"? You are not making sense; you don't have any solid ground for anything you say.

Or do you mean that even though GT SLI is faster now, we can't say it's faster until/if the 1900XT becomes faster?

Last post to you, coz I really seem to get you seeing red. If you can't understand what I say, that's your problem. In all fairness, two GT's is a good deal, but in light of the XT's performance I would not recommend two GT's instead of one XT. Do you understand?! On second thought, don't answer back.
Your trying to put a spin on things is annoying.
By unacknowledged I mean that there was evidence the two solutions are very similar in performance, with the French site showing the XT ahead, and then the argument got immediately switched to minimum frames.
Go to the thread that says 16 reviews of the XT in it and find one with minimum frames; I'm sure you will find a few that show very good numbers for the XT.
If you already have a GT then get another, but if you want that performance and the advantages of a single card then get the XT. Simple. Where you find the nerve to not even admit that this is the best card available for most games is beyond me.
I suggest you don't post back to me, because every argument has been explored for this XT-vs-GT's segment; we covered average frames, noise, heat, minimum frames, power requirements, IQ, French and a load of others. This is not an ATI vs Nvidia argument, because I would be happy with a 7-series card. It's about value for money; from where everyone else is sitting we can see that the XT has slightly more than two GT's at this moment in time. Top performance for low dollar.

Once again, YOU FAILED TO PROVIDE ANY DATA.
I hope you don't post back to me, because you simply don't know anything other than fanboyism.
BTW, this is my last post to you. Pointless to argue with someone without reasoning.
 