1900XT vs 7900GX2

Page 3

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
This is why I'm holding onto my money for G80/R600 :D

Actually, if I start digging through reviews of 7900GT SLI vs. X1900XTX, they show the GTs are 20~30% faster (~40% in OpenGL titles). It's only in Oblivion (and some other games like HL2) that things seem to favor ATi cards in general.

IQ in 2D goes to ATi. I saw it the other day; I thought the LCD had been changed or something, but it was clearly a lot sharper.

Onto features.
PureVideo right now is better than AVIVO, with HD/Blu-ray support, and on top of that it beats AVIVO in the HQV test (HotHardware). The same goes for SLI: it's much more mature than Xfire.

One thing I've noticed is that the dual DVI is on the first card of the 7950GX2, meaning the second card on the bottom, which lacks space, can blow the hot air OUT of the case. Water cooling these babies won't be too hard either, as you can take the first card off.

It's kind of sad that ATi doesn't support more AA modes like 8xS, 16xAA, etc. (Bit-tech shows the 7950GX2 can play BF2 at max detail, 16x12, 8xTR S, HQ in the drivers). But then ATi's HQ AF looks much better than NV's angle-dependent AF.



 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: nitromullet
Which one did you get, Joker? Aside from losing an eye from the AF, how do you like it?


I have a 7900 Go GS card in my laptop. It has 20pp/7vs, and I've got it clocked at 540/1150 right now, which puts its performance above a stock 7900 GT. Overall it's a decent card; for a laptop card it kicks ass. However, it's got the same old visual problems my 7800 GTX had, namely the subpar AF. I haven't tried the higher AA modes yet, since I don't own any old games and in newer ones it'd crawl. 2D quality is definitely not as good as my X1900 XTX's, and the colors don't have the same vibrancy, even with DV enabled. Aside from that, I'm happy with it, and if a person isn't as picky about AF as I am, they'd be happy with either ATi or nVidia at this point.

Edit: check it out: http://img195.imageshack.us/my.php?image=59011009wr.jpg
This sucker still has a lot of OC headroom left in it, which is pretty damn nice. Best of all, I was able to volt mod it from 1.0v to 1.24v using NiBiTor and flashing its BIOS. I'm pretty sure I can reach 630 core on it, no problem.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
You are deflecting the argument, but whatever, I'll allow that and follow along..
I don't see how mentioning AA/AF features (or lack thereof) is in any way deflecting an argument that concerns a statement made about the unmatched AA/AF of the 7950GX2.

I'm not going to sit here and argue about HDR+AA on ATI though, its very silly to continue discussing this..
Of course it's silly... NV has nothing to match it. Period. So, there isn't anything to really discuss. You can bring in all your perceived negative aspects of ATI's HDR+AA implementation, but that doesn't change the fact that NV can't do it at all. This is similar to the period before ATI had CrossFire working, when ATI fans pointed out flaws in SLI - it didn't change the fact that NV had a leg up.

As far as performance/image quality.. dual 7900GTs have always outmatched any single card from ATI.. and that was never disputed.
I am actually disputing that right now, solely based on the fact that ATI's AF is currently unmatchable by any number or configuration of NV cards.

Ye sound a bit bitter nitromullet. "Losing an eye" from NV's AF? LOL.. I lost an eye from ATI's horrid AA!!! Let's get real around here, my fellow nerds..
This was not meant to be argumentative, and it was directed towards Joker. He had mentioned the "eye gouging" AF on NV cards right after I had just picked up my second 7900GTX, and I responded with a : ) and a lighthearted comment. Not everything is an argument...

If you really think your X1900XTX is better than two 7900GTs, you're on crack. So I dont know why you are attempting to allude to the idea that somehow the X1900XTX is better than a 7950GX2.

Well, "better" is a subjective term, and it's really a matter of needs vs. cost. Let me give you some background. First I owned two 7900GTXs, which were neat, but not a very good value for the money (almost $1200 worth of video cards). I eBayed them after about 2-3 weeks for virtually no loss, and picked up an HIS X1900XTX (for under $500), which consequently died on me and is still in the RMA process after a month (HIS sucks!). So, I picked up two eVGA 7900GT CO SCs, which were complete sh1t and very unstable. For $600+ (and given that the GTXs had run flawlessly), I wasn't about to tech these for too long - they went back to Monarch the very next day. I opted to go with a more reputable brand of XTX, and picked up a BBA X1900XTX (and a warranty from Monarch). I haven't touched the clocks on this card, and it runs completely fine at stock speeds; plus it has outstanding AF IQ, and I enjoyed playing Oblivion with HDR+AA. It has met my needs up to this point.

I don't actually think that the Radeon is better than any of the 7900 series cards per se, nor did I ever say that (you put those words in my mouth). If you've read any of my posts comparing ATI and NV IQ, I've always given the nod to NV with regard to their non-HDR AA. However, I simply contend that saying any NV configuration is unmatched with respect to AA/AF is false, simply because NV has inferior AF by design compared to ATI and doesn't support HDR+AA at all.

All that being said, I still might pick up a 7950GX2, this card is pretty cool and it's not a bad price for what you get. : )
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Crusader
You are deflecting the argument, but whatever, I'll allow that and follow along..

HDR+AA is meaningless at today's performance levels, unless you have X1900 Crossfire, and then you deal with extreme heat, noise, and power consumption. Even then, review sites are showing it to be too slow for high resolutions (such as 19x12), which is what most people with Xfire run.
Regardless of that point, no current games that anyone is still playing support it. Oblivion can do it through a hack from ATI. Until the dev integrates that code into a patch, it will always be an unsupported hack.
I'm not trying to degrade it per se.. but it is what it is.

The biggest issue is that Oblivion is the only attractive game that can use ATI's HDR+AA.
Needing a 3rd-party hack is not a bonus; while it doesn't bother me.. it is a shame. But it's still just 1 game out of the very, very, very many on the market.
Not much of a selling point.
I'm not going to sit here and argue about HDR+AA on ATI though, it's very silly to continue discussing this..
To ATI fans, having hack support in 1 major game is the 2nd coming of Christ.. and to the rest of us.. it just is what it is (which isn't much to speak of, esp. considering the slow HDR+AA performance.. it's a next-gen feature due to ATI's horrid performance while using it).

Now that I've said my piece on HDR+AA, I'm not going to continue discussing ATI's atrocious HDR+AA performance.

To cut to the chase here-
As far as performance/image quality.. dual 7900GTs have always outmatched any single card from ATI.. and that was never disputed.

Why the sudden change of heart on the 7950GX2? It's even better than dual 7900GTs in nearly every aspect.. and will someday have Quad SLI enabled.

Ye sound a bit bitter nitromullet. "Losing an eye" from NV's AF? LOL.. I lost an eye from ATI's horrid AA!!! Let's get real around here, my fellow nerds..

If you really think your X1900XTX is better than two 7900GTs, you're on crack. So I don't know why you are attempting to allude to the idea that somehow the X1900XTX is better than a 7950GX2.

Calling HDR+AA meaningless without ever having tried it is kinda foolish. Oblivion is not the only game that can be run with HDR+AA, but it is more stressful than other games, and even Oblivion runs decently on a single XT at 1280 res with not only HDR+AA but also Quality Adaptive AA, which makes the foliage look much better. Calling it a third-party hack is a knock on the developers of Oblivion, who came up with a lame excuse that it couldn't be done. Nvidia users would appreciate such a "hack" just as much if it were available, but it's not, and never will be, even with Quad SLI'd 7950GX2s, because their hardware just doesn't have that ability.

The same thing goes for HQ AF, and that feature benefits just about every 3D game, not only a select few. The GX2 has its own advantage in speed, and it's the fastest card that works in a single PCI-E slot, but it also costs more, so you get what you pay for. It's not as if the GX2 costs the same as the XTX and all you have to do is decide whether you want more performance or more features.
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
Originally posted by: Crusader
You are deflecting the argument, but whatever I'll allow that and follow along..

HDR+AA is meaningless with todays current performance levels, unless you have X1900 Crossfire...
I'll have to disagree, judging from my gaming experience with Oblivion. I play with pretty much everything maxed at 1680x1050 with HDR+4xAA on a single 1900XT at stock speed, and gameplay is very smooth. And yes, having AA enabled makes a big difference visually to me.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Same here. I am trying the XTX over my GTX, and not only do you get very good performance at 1680x1050 w/ 4xAA and HDR, I've also noticed that in battle scenes in the forest the FPS barely drop (as they do in other games).

With my OC'ed GTX they do drop
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Hmm, more people say they get playable frames with HDR+AA in Oblivion. Imagine that.

I think nitromullet hit the nail on the head, several times.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: nitromullet
You are deflecting the argument, but whatever, I'll allow that and follow along..
I don't see how mentioning AA/AF features (or lack thereof) is in any way deflecting an argument that concerns a statement made about the unmatched AA/AF of the 7950GX2.

Well, because I was referring to speed. But NV AF IQ is not that big of a deal, nor is it a deal breaker (nor is it going to put out anyone's eye any more than ATI's horrible AA ;) )

I'm not going to sit here and argue about HDR+AA on ATI though, its very silly to continue discussing this..
Of course it's silly... NV has nothing to match it. Period. So, there isn't anything to really discuss. You can bring in all your perceived negative aspects of ATI's HDR+AA implementation, but that doesn't change the fact that NV can't do it at all. This is similar to the period before ATI had CrossFire working, when ATI fans pointed out flaws in SLI - it didn't change the fact that NV had a leg up.

A checkbox feature is just that, a checkbox feature. And it only "works" thru an unsupported hack in 1 popular game out of thousands.
It's not really a leg up.. and I'm not trying to go fanboy on you here, that's what I see as the truth.

As far as performance/image quality.. dual 7900GTs have always outmatched any single card from ATI.. and that was never disputed.
I am actually disputing that right now, solely based on the fact that ATI's AF is currently unmatchable by any number or configuration of NV cards.

I am actually disputing that right now, solely based on the fact that NV's AA is currently unmatchable by any number or configuration of ATI cards.


Ye sound a bit bitter nitromullet. "Losing an eye" from NV's AF? LOL.. I lost an eye from ATI's horrid AA!!! Let's get real around here, my fellow nerds..
This was not meant to be argumentative, and it was directed towards Joker. He had mentioned the "eye gouging" AF on NV cards right after I had just picked up my second 7900GTX, and I responded with a : ) and a lighthearted comment. Not everything is an argument...

I didn't mean to come across entirely serious.. but the actual phrase was so ridiculously funny and preposterous that I wondered if the fanboy formula had found its way in unforeseen amounts into your water supply...

If you really think your X1900XTX is better than two 7900GTs, you're on crack. So I don't know why you are attempting to allude to the idea that somehow the X1900XTX is better than a 7950GX2.

Well, "better" is a subjective term, and it's really a matter of needs vs. cost. Let me give you some background. First I owned two 7900GTXs, which were neat, but not a very good value for the money (almost $1200 worth of video cards). I eBayed them after about 2-3 weeks for virtually no loss, and picked up an HIS X1900XTX (for under $500), which consequently died on me and is still in the RMA process after a month (HIS sucks!). So, I picked up two eVGA 7900GT CO SCs, which were complete sh1t and very unstable. For $600+ (and given that the GTXs had run flawlessly), I wasn't about to tech these for too long - they went back to Monarch the very next day. I opted to go with a more reputable brand of XTX, and picked up a BBA X1900XTX (and a warranty from Monarch). I haven't touched the clocks on this card, and it runs completely fine at stock speeds; plus it has outstanding AF IQ, and I enjoyed playing Oblivion with HDR+AA. It has met my needs up to this point.

I don't actually think that the Radeon is better than any of the 7900 series cards per se, nor did I ever say that (you put those words in my mouth). If you've read any of my posts comparing ATI and NV IQ, I've always given the nod to NV with regard to their non-HDR AA. However, I simply contend that saying any NV configuration is unmatched with respect to AA/AF is false, simply because NV has inferior AF by design compared to ATI and doesn't support HDR+AA at all.

All that being said, I still might pick up a 7950GX2, this card is pretty cool and it's not a bad price for what you get. : )

I've also done a lot of card swapping lately. Dev houses are gearing up to work on SourceDX10/Doom4/UnrealEngine4.0, and now is the time to buy, as those engines won't be released for years.
No point in having DX10 till then. Maybe they will have added all the features that Vista was supposed to have by that time as well! ;)

I have no quarrel with the X1900XT. That is the only card from ATI that I find attractive.
I'm going to be brutally honest when I say it's the absolute only card I'd even remotely consider from their lineup though.
The reason I don't include the XTX is that if you go Xfire you are stuck at XT speeds.. and both cards are pretty close anyway.

I DO have a quarrel with using HDR+AA as a selling point, as outside of Crossfire it can't push any decent LCD res (at least according to review sites, which I of course take over the opinion of some random forum-goer).

Is it NICE? Yes.. but it's not a deal-maker. I'd agree with you that the AF is a much better selling point, as it's actually useful and applies to all games.

So I like how it appears your head is on straight in that regard.
I personally do not feel the heat/power/noise of the X1900 series is worth a minor improvement in AF, alongside losing SLI capability, Nvidia's driver support in Windows and Linux, and NV's AA.

Personally, I'm happier with this single GTX than I have been with any single or multi-card setup I've ever had.
I used to like the X1900XT, and still do (sort of), but now if I moved to anything else, it'd definitely be the GX2.
I currently run 4xAA/16xAF, High Quality, Vsync on, triple buffering on in my global profile. Always silky smooth; it's orgasmic on my 16x10 (2005FPW) display.
Games I play include a ton of CSS, large quantities of Civilization 4, still trying to finish up Quake 4 SP, the occasional Warcraft 3 TFT battle.net match, and I've been meaning to finish HL2 and get started on Oblivion.
I want to pick up HOMM5, as I've been a huge fan since the original DOS release (I'm way old school).
Most of my time outside of work is spent unwinding in CSS or Civilization 4.

Gave up on that buggy BF2, and I don't like how they put Europeans in the game.. since when in recent times was the EU a warrior nation-state like the USA?
It's also a bit repulsive to see a fictional "Middle East Coalition", as if they'd ever have (or are capable of) an actual deployable, formidable standing army. :disgust: Please.
China is sorta realistic, but they still haven't proven their mettle in the art of war vs someone who knows how to fight and kill in war (like the USA does).
Yeah, it's just a game (based on a predicted WW3), I know.

Anyway, for those with worthy high-res LCDs (higher than mine).. the GX2 is the only way to go IMO.
If I get one I will also be moving to a 2407FPW. :thumbsup:
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
After using my new 7900 video card and my XTX, I'm more confident than ever in declaring the ATi solution superior when it comes to IQ. ATi's 2D quality is better, AA looks a bit better (due to ATi's superior colors), and of course the AF blows away the 7900. The GX2 is a desperate attempt by nVidia to regain the performance crown, and I still haven't seen any benches comparing it to an XTX with max IQ options at high resolution. Either way, I'd much rather spend the cash on Crossfire than buy a $600 card based on an inferior architecture with lacking features.
 

LW07

Golden Member
Feb 16, 2006
1,537
2
81
It's not worth upgrading from the X1900XT to a 7950GX2. I'd just say wait for Conroe or Windows Vista to come out and get most of the bugs worked out before worrying about upgrading, as the X1900XT will be fast enough until at least then (well, maybe not until Windows Vista has most of the bugs worked out, but who knows for sure?).
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
*snip*

I also run with 4xAA in ALL of my games, including Oblivion. The X1900XTX can take any game I have at 4xAA, 16x HQ AF, 1680x1050, Vsync (I hate screen tearing), and triple buffering (except in Oblivion.. gotta have that extra video memory for my mods!), all at stock speeds. I've put the stock cooler back on with AS5, and it does better than my Zalman VF-900 did with the fan at 100% (the Zalman keeps it cooler, but is not at all as quiet). Temps never go above 73C, even at 700/790 speeds. I remember my 7800s getting about that hot with lower frequencies and fewer features. I'm impressed with ATI, and mad at Zalman.

My point, Crusader, is that you and I both run very similar settings, and while you claim that NV's AA is better, the same argument you make about HDR+AA not being playable can just as easily be applied to NV's 8xAA. Fact is, NV and ATI are even at 2xAA and 4xAA. I see 6xAA as a level that cannot be coupled with any Adaptive AA because of the performance hit, and 8xAA as not even playable. So I don't really know why you've lost an eye to ATI's AA, since at the settings you play at, the level of NV's AA is the same.

Other than that, you've actually given some credit to ATI's development here, which I've never seen you do. The post of yours I quoted is the first one I've seen from you that has actually been less ignorant and one-sided. Thank you for finally giving ATI some overdue credit. You must see that without ATI, Nvidia wouldn't be driven to make better products, so ANY gamer should really give credit to both companies. :beer: Competition is great.