
7900GT or X1900XT

Page 4 - AnandTech Forums
Originally posted by: 5150Joker
Transparency SSAA is equivalent to Adaptive QAA, while Transparency MSAA is inferior to Adaptive PAA. The OGL performance difference in the latest games isn't much to brag about. The only decent game the nVidia cards pull ahead in is Quake 4... yawn. The best part is ATi users don't have to worry about buggy drivers like nVidia users do. The nVidia user dilemma: gee, should I use 84.17 for FEAR today, or 84.25 for Oblivion, or should I swap out my driver yet again to avoid the horrible stuttering and crap performance in Tomb Raider: Legends? I'll take CCC "bloat" (~17 MB memory usage) over nVidia's crappy drivers any day.

Wait, stop the presses. Everyone knows nVidia has better drivers than ATI. All the nVidia fanboys say so, and what's written on the intardnet must be true! I mean, nVidia's drivers are flawless, once you sweep all the bugs under the rug.

Originally posted by: Wreckage
Nope, the XT needs $100 or more to do that; a fool and his money are soon parted.

That must explain why all the 7800GTX 512MBs sold out. Though in all fairness, some of the guys buying the cards were just there to eBay them to fools.

Originally posted by: Wreckage
That's right you spent all your money on a video card, just to play one game. Well sucks to be you. When Enemy Territory and Prey come out and Oblivion gets deleted from most peoples hard drives I guess you can finally go outside and play.

Apparently now that the shoe is on the other foot and ATI has the performance lead, it's bad to spend large amounts of money on a video card. And you also recommend SLI systems to people. IMHO, Crossfire and SLI are huge wastes of money. Seriously: arguing that a more expensive, much better-performing single video card is a waste of money, while at the same time recommending an even more expensive dual-card solution to others.

So ATI's card will only work with one game, it won't work with other games, and that's why when Enemy Territory and Prey come out, anyone who has an ATI card will have to go outside and play.
 
Originally posted by: 5150Joker
Originally posted by: Wreckage
Originally posted by: 5150Joker
Unfortunately my wife is the one stuck with nVidia's subpar drivers and image quality on her laptop using the nVidia 6800 Go Ultra.

I didn't know that "Extelleron" had a laptop.


LOL that's pretty good but you still didn't address the part where you claimed nVidia's TRAA was superior and I proved you wrong (as usual).

http://www.hothardware.com/viewarticle.aspx?page=6&articleid=777&cid=2

The ATI 6XAA vs. NVIDIA 8xS AA shots reveal similar differences, with NVIDIA having a clear edge in detail, especially in the trees where ATI's multi-sample only algorithm has minimal impact.

NVIDIA clearly has an advantage, because fine detail just seems to disappear on the Radeon X1900. Hopefully a future driver update will resolve this issue on the Radeon X1900.
 
Originally posted by: Wreckage
Originally posted by: 5150Joker
Originally posted by: Wreckage
Originally posted by: 5150Joker
Unfortunately my wife is the one stuck with nVidia's subpar drivers and image quality on her laptop using the nVidia 6800 Go Ultra.

I didn't know that "Extelleron" had a laptop.


LOL that's pretty good but you still didn't address the part where you claimed nVidia's TRAA was superior and I proved you wrong (as usual).

http://www.hothardware.com/viewarticle.aspx?page=6&articleid=777&cid=2

The ATI 6XAA vs. NVIDIA 8xS AA shots reveal similar differences, with NVIDIA having a clear edge in detail, especially in the trees where ATI's multi-sample only algorithm has minimal impact.

NVIDIA clearly has an advantage, because fine detail just seems to disappear on the Radeon X1900. Hopefully a future driver update will resolve this issue on the Radeon X1900.


8XSS is a pure super sampling mode with a huge performance impact and negligible IQ gains, and they were comparing it to regular AA in your quote. That article is also outdated compared to the one I linked you to, and the "detail loss" experienced in HL2 has been pointed out to you several times on these forums as being easily corrected. I'll link to it once more, but I'm sure you'll continue with your blissful trolling: http://rage3d.com/board/showpost.php?p=1334251360&postcount=37
 
Originally posted by: 5150Joker
8XSS is a pure super sampling mode with a huge performance impact and negligible IQ gains over using adaptive AA. That article is also outdated compared to the one I linked you to. Nice try Trollage.

You asked for proof and I gave you proof.

I could spend all day proving you wrong, but you will either say the article is outdated or from a biased site or some other BS. How about I just post a link to a story about denial and you can feel right at home.

Originally posted by: Jack Nicholson - A Few Good Men
You can't handle the truth!
 
Originally posted by: Wreckage
Originally posted by: 5150Joker
8XSS is a pure super sampling mode with a huge performance impact and negligible IQ gains over using adaptive AA. That article is also outdated compared to the one I linked you to. Nice try Trollage.

You asked for proof and I gave you proof.

I could spend all day proving you wrong, but you will either say the article is outdated or from a biased site or some other BS. How about I just post a link to a story about denial and you can feel right at home.

Originally posted by: Jack Nicholson - A Few Good Men
You can't handle the truth!



You must be drunk or stupid (or both) because the obvious is still evading you: They were talking about REGULAR AA vs 8xSS, not TRSS vs AAA when you quoted the following:

The ATI 6XAA vs. NVIDIA 8xS AA shots reveal similar differences, with NVIDIA having a clear edge in detail, especially in the trees where ATI's multi-sample only algorithm has minimal impact.

As for this quote about HL 2 detail:
NVIDIA clearly has an advantage, because fine detail just seems to disappear on the Radeon X1900. Hopefully a future driver update will resolve this issue on the Radeon X1900.

See above. So far you have yet to provide any proof TRAA is any better than AAA. Bit-tech's article already proves you wrong - you must be used to that by now though.
 
Comparing NV's 8xAA to ATi's 6xAA is pretty silly. They are not the same type of AA. 8xAA is far from playable in anything new, at least to me. It puts a serious hit on frames. 6xAA is more than playable on most anything.
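The frame hit Ackmed describes follows from how the two modes work: pure supersampling (like nVidia's 8xS component) shades every sub-sample, while multisampling (like ATI's 6xAA) shades once per pixel and only resolves edge coverage per sub-sample. A rough cost sketch in Python — the model and numbers are illustrative assumptions, not measured benchmarks:

```python
# Toy cost model: supersampling (SSAA) runs the pixel shader once per
# sub-sample, multisampling (MSAA) runs it once per pixel regardless of
# sample count (extra cost is mostly memory/bandwidth, ignored here).

def relative_shading_cost(samples: int, supersampled: bool) -> int:
    """Pixel-shader invocations per pixel, relative to no AA."""
    return samples if supersampled else 1

# 8x pure supersampling: roughly 8x the shading work of no AA
print(relative_shading_cost(8, supersampled=True))   # 8

# 6x multisampling: same shading work as no AA
print(relative_shading_cost(6, supersampled=False))  # 1
```

Which is why comparing the two modes head-to-head on image quality alone, without the performance column, tells only half the story.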
 
Originally posted by: Ackmed
Comparing NV's 8xAA to ATi's 6xAA is pretty silly. They are not the same type of AA. 8xAA is far from playable in anything new, at least to me. It puts a serious hit on frames. 6xAA is more than playable on most anything.


Trollage hardly ever has a valid argument and when he tries to present what he deems to be "facts" it ends up working against him.
 
I'm kinda curious why people say get the 7900GT if you want to oc and the X1900XT if you don't. Yes, the 7900GT ocs to some sweet levels, and overall the percentage increase you'll see is probably greater than what the X1900XT can generally achieve, but the X1900XT is by no means a slouch when it comes to ocing.
 
Originally posted by: Elfear
I'm kinda curious why people say get the 7900GT if you want to oc and the X1900XT if you don't. Yes, the 7900GT ocs to some sweet levels, and overall the percentage increase you'll see is probably greater than what the X1900XT can generally achieve, but the X1900XT is by no means a slouch when it comes to ocing.

I think the more valid argument is saving the $100

Overclocking is a hard number to gauge and can vary widely based on the card you buy.

The card he got from Dell is already factory overclocked and will perform better than anything at that price.
 
Originally posted by: Wreckage

I think the more valid argument is saving the $100

Overclocking is a hard number to gauge and can vary widely based on the card you buy.

The card he got from Dell is already factory overclocked and will perform better than anything at that price.

"at that price" being the key phrase here. I guess it really comes down to how much performance you want. IMO, $100 isn't a lot to pay for the performance increase of the X1900XT over the 7900GT, especially if the OP plays at 1680x1050.
 
Originally posted by: Wreckage
Originally posted by: Elfear
I'm kinda curious why people say get the 7900GT if you want to oc and the X1900XT if you don't. Yes, the 7900GT ocs to some sweet levels, and overall the percentage increase you'll see is probably greater than what the X1900XT can generally achieve, but the X1900XT is by no means a slouch when it comes to ocing.

I think the more valid argument is saving the $100

Overclocking is a hard number to gauge and can vary widely based on the card you buy.

The card he got from Dell is already factory overclocked and will perform better than anything at that price.

Anything apart from an X1800XT if he does not overclock.
As Oblivion is an indicator of things to come, the X1900XT would be a far better option if you want to play for longer on the same card.
Not to mention if you are watching movies, this is a better card.
Also, the 7900 is in no way silent.

One thing you have to know is that the XT is a better card than the 7900 almost whichever way you look at it if you ignore the price. If you kept the 7900GT then you will inevitably end up thinking that you could have had a faster card for a little bit more money. And if you're selling a card then you should sell the 7900GT, as it is in short supply, unlike the plentiful XT, which usually has great deals.
 
Originally posted by: Steelski
As Oblivion is an indicator of things to come, the X1900XT would be a far better option if you want to play for longer on the same card.
Not to mention if you are watching movies, this is a better card.

I see, so all games will be based off of Oblivion??? Wow, that's major news. I wonder why no one else reported this :roll:

NVIDIA Purevideo > ATI beta AVIVO.

Originally posted by: Steelski
One thing you have to know is that the XT is a better card than the 7900 almost whichever way you look at it if you ignore the price.
If you ignore price??

Don't forget to ignore heat, power draw, warranty, drivers, Linux support, OpenGL performance, stencil shadows, transparency AA, etc.

Why would you ignore all that......Oh wait I just read your sig nevermind :roll:
 
Originally posted by: Elfear
I'm kinda curious why people say get the 7900GT if you want to oc and the X1900XT if you don't. Yes, the 7900GT ocs to some sweet levels, and overall the percentage increase you'll see is probably greater than what the X1900XT can generally achieve, but the X1900XT is by no means a slouch when it comes to ocing.

overclocking = more value per $$$

7900gt = lower entry fee, same levels of performance generally.

same as why you would get an Opty 165 vs a 170 or 175... if you plan to oc.

btw> did you get my message on the Oblivion HDR + AA (& 8xAA) screenshots and description I requested? Would really like to see firsthand improvements without resorting to buying it myself...

 
Originally posted by: ST

overclocking = more value per $$$

7900gt = lower entry fee, same levels of performance generally.

same as why you would get an Opty 165 vs a 170 or 175... if you plan to oc.

Lol. I know why people oc. I'm a big proponent of overclocking cheap parts to the level of expensive parts or beyond. My point was that people say get the X1900XT only if you don't want to oc. I beg to differ on that point, as the X1900XT ocs very well. Like I mentioned before, the 7900GT will generally get a higher percentage increase over an X1900XT, but the X1900XT will still be faster because you're starting out with a much faster card.


btw> did you get my message on the Oblivion HDR + AA (& 8xAA) screenshots and description I requested? Would really like to see firsthand improvements without resorting to buying it myself...

I just barely bought Oblivion, so if you can wait until after finals (two weeks) then I can get some screenshots for you. It would honestly be better if you have a buddy with an X1*** series card to show you the difference, as screenshots very rarely do a game justice, but I will get some for you if you'd like.

ATI cards don't do 8xAA. 😉 But I'd be happy to post some screenshots with 6xAA enabled.
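Elfear's arithmetic can be made concrete with some made-up numbers (the baseline scores and overclock percentages below are assumptions for illustration, not benchmarks): a larger percentage overclock on the slower card doesn't automatically close the gap to a faster card with a smaller overclock.

```python
# Hypothetical illustration: percentage gains vs. absolute performance.
# Baseline numbers are invented for the example, not measured results.

def overclocked_perf(base_perf: float, oc_percent: float) -> float:
    """Scale a baseline performance score by an overclock percentage."""
    return base_perf * (1 + oc_percent / 100)

slower_card_oc = overclocked_perf(100, 25)  # big 25% OC on the slower card
faster_card_oc = overclocked_perf(130, 10)  # modest 10% OC on the faster card

print(round(slower_card_oc, 1))  # 125.0
print(round(faster_card_oc, 1))  # 143.0 -- faster base card still wins
```

In other words, the overclocked 7900GT's bigger percentage gain starts from a lower baseline, so a well-overclocking X1900XT can stay ahead.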
 
Originally posted by: 5150Joker
Originally posted by: Wreckage
Originally posted by: 5150Joker
8XSS is a pure super sampling mode with a huge performance impact and negligible IQ gains over using adaptive AA. That article is also outdated compared to the one I linked you to. Nice try Trollage.

You asked for proof and I gave you proof.

I could spend all day proving you wrong, but you will either say the article is outdated or from a biased site or some other BS. How about I just post a link to a story about denial and you can feel right at home.

Originally posted by: Jack Nicholson - A Few Good Men
You can't handle the truth!



You must be drunk or stupid (or both) because the obvious is still evading you: They were talking about REGULAR AA vs 8xSS, not TRSS vs AAA when you quoted the following:

The ATI 6XAA vs. NVIDIA 8xS AA shots reveal similar differences, with NVIDIA having a clear edge in detail, especially in the trees where ATI's multi-sample only algorithm has minimal impact.

As for this quote about HL 2 detail:
NVIDIA clearly has an advantage, because fine detail just seems to disappear on the Radeon X1900. Hopefully a future driver update will resolve this issue on the Radeon X1900.

See above. So far you have yet to provide any proof TRAA is any better than AAA. Bit-tech's article already proves you wrong - you must be used to that by now though.

Any answer to this, Mr. W? I think you conveniently skipped 5150's post.
 
Originally posted by: Elfear
Originally posted by: Wreckage

I think the more valid argument is saving the $100

Overclocking is a hard number to gauge and can vary widely based on the card you buy.

The card he got from Dell is already factory overclocked and will perform better than anything at that price.

"at that price" being the key phrase here. I guess it really comes down to how much performance you want. IMO, $100 isn't a lot to pay for the performance increase of the X1900XT over the 7900GT, especially if the OP plays at 1680x1050.


I'd still take an X1800XT. Even cheaper than any 7900GT, and generally faster. Not to mention HDR+AA, HQ AF, etc.
 
Originally posted by: Wreckage
Originally posted by: Steelski
As Oblivion is an indicator of things to come, the X1900XT would be a far better option if you want to play for longer on the same card.
Not to mention if you are watching movies, this is a better card.

I see, so all games will be based off of Oblivion??? Wow, that's major news. I wonder why no one else reported this :roll:

NVIDIA Purevideo > ATI beta AVIVO.

Originally posted by: Steelski
One thing you have to know is that the XT is a better card than the 7900 almost whichever way you look at it if you ignore the price.
If you ignore price??

Don't forget to ignore heat, power draw, warranty, drivers, Linux support, OpenGL performance, stencil shadows, transparency AA, etc.

Why would you ignore all that......Oh wait I just read your sig nevermind :roll:

You are a desperate man...!!!!!! Your FUD is such rubbish.

Heat: could be an issue.
Power draw: not likely.
Warranty: non-issue, as if both cards were kept stock, you know who would win!!!!!!!
Linux: I don't give a sh*t!!!!!!
OpenGL: maybe, but not likely. Doom and Q4 are both maxed out easily, with AF and AA, on both nVidia and ATI.
Stencil shadows: it helps, I suppose.
Transparency AA: same as AAA.

Your arrow is pointing the wrong way in the AVIVO vs. Purescam comparison.

As for Oblivion... you know very well that most reviewers usually say that ATI is likely to have an advantage in games that are like Oblivion (meaning most newer, challenging games).

My sig I thought was funny, and it is meant to provoke you. I actually thought of you when I was writing it.

If nVidia had better cards then I would be behind them. But the fact that they have buggy drivers and, as many people always point out, no driver release seems to have any consistency, plus the fact that their IQ is not on par with ATI, simply does it for me.
In any forum, people could say "ATI has better image quality than nVidia" and people would argue... but if someone says that nVidia has better IQ than ATI, they get laughed at. That says something. nVidia is on the defensive there and never on the offensive...
You do know that's because the nVidia argument in that area is very weak, and not even fanboys believe it.

You are simply an nVidia HO!
 
Originally posted by: Steelski
My sig i thought was funny, and is meant to provoke you.. I actually thought of you when i was writing it.

How sweet that you think of me :heart:

As for the rest of your post, it was pure biased opinion at best and troll poo at its worst. No response will make you happy, and you are set in your ways, so it ain't worth my time.
 
Definitely keep the X1900XT.

Comparing NV's 8xAA to ATi's 6xAA is pretty silly. They are not the same type of AA.
I agree - 8xAA offers much better image quality on alpha textures.

ATi can't even do SSAA in OpenGL unless you run a Crossfire setup; meanwhile, any single nVidia card can do 16xAA, and it's surprisingly playable in many older games. The image quality has to be seen to be believed, and it looks much better than 6x adaptive AA.
 
http://www.tech-hounds.com/review18/ReviewsComplete.html

It boils down to this. ATI's adaptive antialiasing is somewhat a mixed bag. In Performance mode, the image quality it offers is better than NVIDIA under the same setting. However, this is not the preferred mode of use. In Quality mode, which corresponds to supersampling on the GeForce 7 series, the Radeon adaptive antialiasing is not as effective. It only works well on objects that are near to the camera. There's still very noticeable aliasing (more so in motion) on faraway objects with transparent textures. Gamers looking for the best image quality in games with transparent textures, should be more happy with the GeForce 7 than the Radeon X1900
 