
7900GT or X1900 GT?


MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Originally posted by: Ulfhednar
X1800 XT or X1900 GT easily, don't even consider a 7900 GT with the insanely low reliability of those pieces of junk.

The 7900GT is not junk; otherwise, provide a link to back up your claim.

Stop spreading junk. :p
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: keysplayr2003

It just makes sense. No reaching. No reason to. It's staring you and me and everyone here in the face. ATI's superior image quality is a lie. I may even go so far as to say driver cheats (but you didn't hear that from me).

Doing more work, that's why they take the performance hit? No, it's because the ops are turned off, at least some of them.

Wow, you think NV has better overall IQ, and that ATi drivers may be cheating? I would expect that from others, not from you. Where are the reviews to back this up? All reviews I have seen show ATi with better IQ when looking at all driver settings.

What FS said for CoD2:
The best image quality, bar none, is delivered by ATI in its 6xAA 16xAF mode with Adaptive AA turned on. It finally gets rid of the jaggies along the telephone cable, does the best job of all with the fence. The only areas where it lags somewhat behind NVIDIA is in the stone pattern behind to the left of the crates, and the mortar and brickwork to the left of the closer telephone pole.

HL2:
This is a tough one. On the one hand, ATI's fence is remarkably smooth and free of missing pixels or jaggies. On the other, it is even more faded now. Overall though, advantage ATI on the fence. When it comes to the trees, despite the move to 6xAA, the ATI product still cannot render as many of the thin branches as the 7900GTX. What is shown, individually, tends to look better on a branch-by-branch basis, but the extra fullness of the tree in the GeForce pictures is tempting. We're going to mark this test as a wash. It's not that either card is better, but we expect personal preferences regarding the trees and fence are going to be especially important.

So they say that ATi has better CoD2 IQ, and that HL2 is a wash. And out of that, you think NV has better IQ in CoD2, HL2, and overall? Not to mention the fact that they used gamma corrected AA for NV... and didn't use HQ AF for ATi?

And then their conclusion:
The image quality comparison is almost certainly a draw on all but one account. NVIDIA's AA routine, especially with Transparency AA activated, is better at drawing very fine lines, like those in the Half-Life 2 fence screenshot. Whether it's the thin left side of the fence or the small branches on the trees, NVIDIA shows you more, and even more importantly, what they show is in sharper contrast than what ATI's routines deliver. However, ATI generally has better AA smoothing once you bump things up to 6xAA. I'm going to call it a wash here.

NVIDIA's anisotropic filtering looks better in screenshots. You'll remember that you can see the cobblestone pattern in the Call of Duty 2 screens far beyond where the ATI image blurs them into flat ground, but this comes at a steep price. NVIDIA's optimizations create the shimmering effect we saw in the Battlefield 2 video, which can range from unnoticeable to distracting, depending on the game, the scene, and how sensitive you are to it. Also, the optimizations produce a sharper point of delineation where the card switches from low detail to high detail textures, creating clear steps of detail change relative to the smooth transition of ATI's upcoming hardware. ATI's own optimizations aren't without fault either, as some users have reported shimmering with ATI's latest cards as well, but as you can see in the videos, it isn't nearly as pronounced, even in our scenario outlined with Battlefield 2.

http://www.firingsquad.com/hardware/ati...ge_quality_showdown_august06/page6.asp

So, they call AA a wash. Keep in mind they used gamma corrected AA too. And they say that in still shots, NV's AF looks better. Then they say that NV has more pronounced shimmering than ATi, due to their AF. Keep in mind, no HQ AF for ATi. How do you get that NV has better IQ out of that? Especially when they used gamma corrected AA for NV, and no HQ AF for ATi? And when they saw, and the videos clearly show, that shimmering is much worse for NV than for ATi?
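
For anyone wondering what "gamma corrected AA" actually changes in these comparisons: it is about whether edge samples get averaged in gamma (sRGB) space or converted to linear light first. Here is a minimal numeric sketch of the idea in Python, with made-up sample values and a simple power-law gamma rather than either vendor's actual resolve hardware:

```python
# Toy AA resolve: a 4-sample edge pixel where half the samples hit a white
# polygon and half hit a black background, with values stored in gamma space.
GAMMA = 2.2  # simple power-law approximation of sRGB

def to_linear(c):
    return c ** GAMMA

def to_gamma(c):
    return c ** (1.0 / GAMMA)

samples = [1.0, 1.0, 0.0, 0.0]  # framebuffer (gamma-encoded) sample values

# Naive resolve: average the gamma-space values directly.
naive = sum(samples) / len(samples)

# Gamma-correct resolve: linearize, average, re-encode.
correct = to_gamma(sum(to_linear(s) for s in samples) / len(samples))

print(f"naive resolve: {naive:.2f}")    # 0.50 -> displays darker than 50% coverage
print(f"gamma-correct: {correct:.2f}")  # ~0.73 -> displays as true 50% coverage
```

That roughly 0.50-versus-0.73 gap is why gamma-corrected resolves make edge gradients look more even in screenshots, and why enabling it for one card while leaving the other card's quality options at defaults makes for an uneven comparison.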

What do other reviews have to say about this?
We can't say enough about the great image quality produced by the ATI Radeon X1900/X1950 series. Just at its base level filtering quality seems to be better compared to NVIDIA's default driver settings. There is less noticeable texture crawling and moiré. Though in some games the moiré is equally disturbing like Prey, this seems to simply be a game problem. Perhaps we need an anti-moiré setting eh? The ability for ATI video cards to enable high quality anisotropic filtering which helps texture quality at steep angles is another benefit they currently have. It provides real tangible results.

All ATI X1000 series video cards also support HDR + AA. We have seen where this is very usable in Oblivion. The new Radeon X1950 XTX allows 2X AA with HDR and very high in-game settings for a very immersive and enjoyable gaming experience. With faster performance overall the Radeon X1950 XTX allows you to increase some visual quality settings further improving the gameplay experience.

http://enthusiast.hardocp.com/article.html?art=MTE0NCwxMSwsaGVudGh1c2lhc3Q=

We took an in-depth look at image quality in this evaluation. We had two 30" LCDs side-by-side and were able to look at IQ in-game. Culminating everything we learned we can confidently say that the ATI Radeon X1900 XTX CrossFire platform offered the best image quality in games.

We have all the proof to back this up. With the Radeon X1900 XTX and CrossFire platform antialiasing plus HDR is possible in Oblivion. ATI has the ability to do multisampling plus floating point blending HDR. ATI has a "High Quality" anisotropic filtering option which has real tangible image quality benefits in games with little to no performance hit. In large outdoor games this is a huge benefit. It also helps in games that have very high quality and detailed textures like Half Life 2: Episode 1 and Ghost Recon. Having these two displays side-by-side proved that texture crawling and moiré are worse on NVIDIA hardware at default driver settings compared to ATI hardware. We especially noticed this in World of Warcraft, Half Life 2: Episode 1, and Battlefield 2. When you look at all this added up it becomes clear that ATI still holds the image quality crown.

http://enthusiast.hardocp.com/article.html?art=MTA4MywxNiwsaGVudGh1c2lhc3Q=
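
Since "HDR + AA" keeps coming up: HDR titles of that era rendered into FP16 targets whose values go well above 1.0, and getting antialiasing on top of that requires the GPU to multisample and blend those floating point values. Below is a toy numeric sketch, assuming a simple Reinhard-style tone map and invented sample values, of what is lost if the renderer has to fall back to a clamped low-dynamic-range buffer to keep AA:

```python
# Why HDR wants an FP16 render target (and therefore FP16 MSAA support):
# bright samples above 1.0 carry the information exposure/bloom needs,
# and clamping to a conventional 0..1 buffer throws it away before the
# AA resolve and tone mapping can use it.

def resolve(samples):
    """Simple box filter over subpixel samples (an MSAA-style resolve)."""
    return sum(samples) / len(samples)

def tonemap(x):
    """Reinhard-style operator mapping [0, inf) into [0, 1)."""
    return x / (1.0 + x)

# An edge pixel: two samples on a very bright light, two on dark geometry.
hdr_samples = [8.0, 8.0, 0.05, 0.05]

fp16_path = tonemap(resolve(hdr_samples))                        # keep real HDR values
ldr_path  = tonemap(resolve([min(s, 1.0) for s in hdr_samples])) # clamp to 0..1 first

print(f"FP16 resolve, then tonemap: {fp16_path:.3f}")  # ~0.80
print(f"clamped (LDR) resolve:      {ldr_path:.3f}")   # ~0.34 - the highlight is gone
```

The clamped path can no longer tell "very bright" from "barely white", so highlights and the AA gradient around them flatten out; that is the trade-off users faced on hardware that could not multisample FP16 targets.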

Is there any review that says NV has better IQ? Especially with options turned up for both cards?

Originally posted by: Pabster
Ackmed, it's about time you give it up. And post your ATI badge number while you're at it.

First you say IQ is subjective (which it is), then you say "let's not pretend that the IQ is the same..."

I've owned literally tens of both nVidia and ATi cards over the years. Recently I've owned a 7800GT, 7800GTX, 7900GTX, and X1900XTX. There is very little IQ difference. I could point out (subjectively) an area where either card disappoints me.

PS I see you are selling your X1900XTX. Planning an nVidia upgrade? :D :p

It's time for you to get a new line; that one is overused, and pretty sad. IQ is subjective, to a degree. Some parts of IQ are not subjective, such as better clarity with AF.

No, I'm not planning on an NV upgrade. I'm not selling an XTX, I don't own one. I'm selling my X1900XT CF Master card.

 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: Ackmed
It's time for you to get a new line; that one is overused, and pretty sad. IQ is subjective, to a degree. Some parts of IQ are not subjective, such as better clarity with AF.

Ah, right. And which great benchmarking tool provides this measure of "clarity" with AF? The fact is, ALL image quality measurements are subjective. Period. And since everyone has a different set of eyes, this can vary wildly. So stop spreading your FUD.

No, I'm not planning on an NV upgrade. I'm not selling an XTX, I don't own one. I'm selling my X1900XT CF Master card.

As much ATI pimping as you do, I'm surprised.

 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Ackmed, what are your system specs, for the record?

And are you planning on buying an X1950 XTX after you sell your X1900 CF Master card?

Oh! Very nice review finds to back up your claim! :)
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Pabster
Originally posted by: Ackmed
It's time for you to get a new line; that one is overused, and pretty sad. IQ is subjective, to a degree. Some parts of IQ are not subjective, such as better clarity with AF.

Ah, right. And which great benchmarking tool provides this measure of "clarity" with AF? The fact is, ALL image quality measurements are subjective. Period. And since everyone has a different set of eyes, this can vary wildly. So stop spreading your FUD.

No, I'm not planning on an NV upgrade. I'm not selling an XTX, I don't own one. I'm selling my X1900XT CF Master card.

As much ATI pimping as you do, I'm surprised.

Sorry, not every aspect of IQ is subjective. HQ AF is noticeably better. Saying it's not clearer is just wrong. There are plenty of comparison shots; feel free to look. A recent one, in an X1950 XTX review, showed the clear advantage of HQ AF in Oblivion. I've forgotten which review it was; HardOCP shows it in HL2. ATi's HQ AF is better than NV's AF; whether people can tell the difference is another discussion. Are you trying to claim that the videos of shimmering from FS don't show a clear advantage to ATi? I guess a 5MP camera looks the same as a 10MP camera to you too?

I "pimp" what I think is better. There are zero reviews that claim NV has better IQ that I have seen. Yet dozens of them say that ATi's IQ is better.

Originally posted by: MegaWorks
Ackmed, what are your system specs, for the record?

And are you planning on buying an X1950 XTX after you sell your X1900 CF Master card?

Oh! Very nice review finds to back up your claim! :)

Umm... not sure why it matters, but here it is:
Lian Li PC-v2000
PC P&C 510 SLI
Asus A8R32-MVP
Opty 165@2.65/Zalman 9500LED
2x1gig Crucial Ballistix PC4000
X-Fi
150gig Raptor
300gig Maxtor 16meg cache
200gig Maxtor 8meg cache
Sony & BenQ dual layer 16x DVD burners
Dell 2405FPW 24" LCD
Logitech Z-5500 5.1 speakers
Logitech G15 LCD keyboard
Logitech G5 mouse

No, I won't be buying any new card for a while. This is the first time since the 9700 Pro that I haven't gotten the newest version of either an ATi or NV card. I'm going to be selling off my whole PC pretty soon, more than likely. Perhaps not the LCD/speakers/kb/mouse, but just the box itself.

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I like some things about CRTs, and some things about LCDs. Overall, I like LCDs better. And shimmering for ATi is much lower than NV's, so it's not the same issue it was.

Although as I said above, I may sell my LCD. Was thinking around $550...
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
So if HardOCP has "all the proof they need," why then didn't they show what FiringSquad has shown? If they had, I'm sure they could not have come to their final conclusion, now could they? Bottom line: they obviously didn't look deep enough.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ackmed

What FS said for CoD2:

quote:
The best image quality, bar none, is delivered by ATI in its 6xAA 16xAF mode with Adaptive AA turned on. It finally gets rid of the jaggies along the telephone cable, does the best job of all with the fence. The only areas where it lags somewhat behind NVIDIA is in the stone pattern behind to the left of the crates, and the mortar and brickwork to the left of the closer telephone pole.

Would you care to include the remaining comments on what FiringSquad said about CoD2?
Why do you keep ignoring these findings? Oh, duh... n/m.

 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: keysplayr2003
How is that GX2 working out for....... Oh wait.. You never intended to get one.
keyplayer2004, you know absolutely nothing about me, and my purchasing decisions are not the business of a pathetic little troll like yourself.

For your information though, I was considering a 7950GX2, among other cards, as I broke my X1800XT, but I was simply waiting for the release of the X1950XTX before I made my final decision.

The X1950XTX did not live up to the hype at all, so that took me back to two choices: 7950GX2 or X1900XT Crossfire. My decision was made as follows:

As the native resolution of my current monitor is 1440x900, I do not need SLI or Crossfire, so I decided that both of those options are overkill. Thus, rather than buy a 7950GX2 for nearly £400, I bought an X1900 XT for £200 and will add an X1900 Crossfire Edition if I upgrade my monitor.

It's the logical choice with the monitor I have; it saves money and gives me far superior image quality. I hope my purchase has your approval, keysplayr2003, otherwise I will have to cancel my order immediately! :Q God forbid my graphics purchase not be approved by keysplayr2003, 7950GX2 fanboy of the year. I hope you don't lose any sleep because I didn't buy one.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ulfhednar
Originally posted by: keysplayr2003
How is that GX2 working out for....... Oh wait.. You never intended to get one.
keyplayer2004, you know absolutely nothing about me, and my purchasing decisions are not the business of a pathetic little troll like yourself.

For your information though, I was considering a 7950GX2, among other cards, as I broke my X1800XT, but I was simply waiting for the release of the X1950XTX before I made my final decision.

The X1950XTX did not live up to the hype at all, so that took me back to two choices: 7950GX2 or X1900XT Crossfire. My decision was made as follows:

As the native resolution of my current monitor is 1440x900, I do not need SLI or Crossfire, so I decided that both of those options are overkill. Thus, rather than buy a 7950GX2 for nearly £400, I bought an X1900 XT for £200 and will add an X1900 Crossfire Edition if I upgrade my monitor.

It's the logical choice with the monitor I have; it saves money and gives me far superior image quality. I hope my purchase has your approval, keysplayr2003, otherwise I will have to cancel my order immediately! :Q God forbid my graphics purchase not be approved by keysplayr2003, 7950GX2 fanboy of the year. I hope you don't lose any sleep because I didn't buy one.

Oh, I think I know all there needs to be known about "Ulfhednar". Now go over to your fish tank, you know, the one with all the red herrings in it. The fish food is to the right of the tank in a cylindrical container marked "Fallacies". You can't miss it. I heard watching swimming fish is relaxing and very therapeutic.

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: keysplayr2003


Would you care to include the remaining comments on what FiringSquad said about CoD2?
Why do you keep ignoring these findings? Oh, duh... n/m.

I posted the last paragraph of the CoD2 page, or the summary, same as I did for the HL2 page. Then I included the two paragraphs in the article's conclusion, one for AA and the other for AF. What's the problem with that? Seems pretty logical to me.

How does the FS article back you up in this post? How is ATi's IQ a lie? They didn't even enable HQ AF or try HDR+AA, while they did use gamma corrected AA for NV. The article favors ATi, not NV, especially when backed up by other reviews that say ATi has better IQ as well.

By far, NV cards are doing more work than ATI cards. Like I said many times before this FS article, I suspected this was the case. The FS article just reinforces it. I'm not reaching here at all, Ackmed, and it's silly of you to say so when I am merely directing your attention to something you will not acknowledge. I was actually very pleased to see that my suspicions weren't without merit, if only for the fact that I wasn't completely incorrect.

It just makes sense. No reaching. No reason to. It's staring you and me and everyone here in the face. ATI's superior image quality is a lie. I may even go so far as to say driver cheats (but you didn't hear that from me).

I'm sorry to say, that's probably the worst post from you I have ever seen. You suggest ATi cheats with their drivers, and that NV's IQ is as good as ATi's? Find one review that agrees with you.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
certain things are indeed subjective. there is a slight difference in color between ati and nv, and which one a person prefers can easily be a matter of personal taste or how their eyes perceive color.

some things certainly aren't however, such as the clearly superior AF filtering ati x1k cards offer. this extends into "shimmering", which has as much to do with transitions in texture filtering as anything else. this can be lessened on the nv card (at the cost of performance). while both cards display this effect to some degree, and it does vary from one game to the next, nv suffers this condition more than ati. to claim there's no difference is simply stating one has no objectivity in the matter.
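
The "transitions in texture filtering" mentioned above are easiest to picture as mip level versus distance. Below is a minimal Python sketch, with an invented blend window rather than any vendor's real optimization, contrasting full trilinear filtering (a smooth, continuous level of detail) with a shortcut that only blends near mip boundaries. That kind of stepped transition shows up as bands of detail change and, in motion, shimmering:

```python
import math

def trilinear_lod(distance):
    """Full trilinear: the fractional level of detail is always blended."""
    return math.log2(max(distance, 1.0))

def optimized_lod(distance, blend_window=0.25):
    """Blend only within a narrow window around each mip boundary and snap
    to the nearest level elsewhere (a 'brilinear'-style shortcut with
    made-up numbers, purely for illustration)."""
    lod = math.log2(max(distance, 1.0))
    frac = lod - math.floor(lod)
    if frac < 0.5 - blend_window:
        return math.floor(lod)
    if frac > 0.5 + blend_window:
        return math.ceil(lod)
    # Remap the narrow window to a full 0..1 blend between the two levels.
    return math.floor(lod) + (frac - (0.5 - blend_window)) / (2 * blend_window)

for d in [1.0, 1.2, 1.5, 2.0, 2.4, 3.0, 3.4, 4.0]:
    print(f"distance {d:3.1f}: trilinear LOD {trilinear_lod(d):.2f}   "
          f"optimized LOD {optimized_lod(d):.2f}")
```

Where the optimized curve jumps (for example around distance 3.4 above), texture detail changes abruptly instead of fading in, and that boundary crawls across the screen as the camera moves.
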
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Ackmed

I'm sorry to say, that's probably the worst post from you I have ever seen. You suggest ATi cheats with their drivers, and that NV's IQ is as good as ATi's? Find one review that agrees with you.

I think the FiringSquad article backs that statement up.

I'm not sure why this upsets you? You act like ATI is perfect and anyone who says otherwise is personally attacking you.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0

Originally posted by: keysplayr2003
By far, NV cards are doing more work than ATI cards. Like I said many times before this FS article, I suspected this was the case. The FS article just reinforces it. I'm not reaching here at all, Ackmed, and it's silly of you to say so when I am merely directing your attention to something you will not acknowledge. I was actually very pleased to see that my suspicions weren't without merit, if only for the fact that I wasn't completely incorrect.

It just makes sense. No reaching. No reason to. It's staring you and me and everyone here in the face. ATI's superior image quality is a lie. I may even go so far as to say driver cheats (but you didn't hear that from me).

?

how in the hell do you come up with that conclusion? that's by far one of the most ignorant statements i've seen made on this forum. is this the real keys?



 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Wreckage
Originally posted by: Ackmed

I'm sorry to say, that's probably the worst post from you I have ever seen. You suggest ATi cheats with their drivers, and that NV's IQ is as good as ATi's? Find one review that agrees with you.

I think the FiringSquad article backs that statement up.

aside from the fact the FS article does not back that up whatsoever, they failed to even use several of ati's features (brandon later posted he planned on a followup article, tho to me there was no logic in completely omitting it in the first place).

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
aside from the fact the FS article does not back that up whatsoever, they failed to even use several of ati's features
He's right, then, to say that Nvidia may have been doing more work than ATI in those screenshots, but it isn't ATI's fault that the reviewers didn't enable more of their features, as keys's comments suggest.

As for the rendering distance, it isn't the same when you enable all of ATI's image enhancing features. And there are times when the game has a config file that controls the rendering distance as well.

Nvidia's AA does incorporate more detail in their review, and we also have another member here who has done those same kinds of comparisons, only he knew what he was doing. Has everyone forgotten nitromullet's screenshots that accurately compared both AA and AF? We saw the same disappearance in the alpha texture lines that FS showed, yet the AF was maxed for both sides as well. Nvidia's AF had gaps at its distances that showed up in the form of small white horizontal lines. Why is everyone yelling at each other over this review? We've already had more valid ones where the tester is readily available.

Ackmed does tend to see ATI as flawless sometimes, I'll admit, but he is right here. Several other sites conclude that overall (meaning maybe not in everything, but most of the time) ATI's image quality exceeds Nvidia's. With HL2, it could be considered a "wash", yet with Oblivion Nvidia wouldn't stand a chance when all of the game's features and both drivers' settings are utilized fully. It sometimes depends on the game as well.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
they didn't even test 8x AA for nvidia cards, you'd better hope that 6x AA looks better than 4x :confused:
Isn't 8xAA the same as 4xAA with supersampled TrAA? IIRC that's why 8xAA is actually 8xSAA.
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: keysplayr2003
Oh, I think I know all there needs to be known about "Ulfhednar". Now go over to your fish tank, you know, the one with all the red herrings in it. The fish food is to the right of the tank in a cylindrical container marked "Fallacies". You can't miss it. I heard watching swimming fish is relaxing and very therapeutic.
Translation: "The fact you chose an X1900XT and the possibility of X1900 Crossfire burns me down to my very soul because you didn't choose to buy my much-beloved 7950GX2. Since I have no possible reason that you might reconsider your purchase, and since I am really lame at coming up with insults, I will just blag this crap instead."

:confused:
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: josh6079
they didn't even test 8x AA for nvidia cards, you'd better hope that 6x AA looks better than 4x :confused:
Isn't 8xAA the same as 4xAA with supersampled TrAA? IIRC that's why 8xAA is actually 8xSAA.

I'm pretty sure 8xS and 4xMS + TrSS are different, but I might be wrong :eek:
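
They are indeed different modes, and a rough way to see it is to count shader work per covered pixel. The sketch below uses the commonly cited descriptions from that era (8xS as 4x multisampling combined with 2x supersampling, transparency supersampling as per-sample evaluation of alpha-tested surfaces only); treat those mode definitions as assumptions rather than vendor documentation:

```python
# Relative pixel-shader invocations per covered pixel for a few AA modes.
# MSAA shades once per pixel and stores extra coverage/depth samples;
# supersampling (SSAA) shades every sample; transparency supersampling
# re-evaluates only alpha-tested fragments at each sample position.

def shading_cost(msaa_samples, ss_factor, alpha_tested, transparency_ss):
    cost = ss_factor                   # supersampling multiplies all shading work
    if alpha_tested and transparency_ss:
        cost *= msaa_samples           # alpha-tested fragments shaded per MSAA sample
    return cost

modes = {
    "4x MSAA":                dict(msaa_samples=4, ss_factor=1, transparency_ss=False),
    "4x MSAA + TrSSAA":       dict(msaa_samples=4, ss_factor=1, transparency_ss=True),
    "8xS (4x MSAA x 2x SS)":  dict(msaa_samples=4, ss_factor=2, transparency_ss=False),
}

for name, m in modes.items():
    wall  = shading_cost(alpha_tested=False, **m)   # ordinary opaque geometry
    fence = shading_cost(alpha_tested=True,  **m)   # alpha-tested fence texture
    print(f"{name:23s} opaque: {wall}x shading   alpha-tested fence: {fence}x shading")
```

So 4xMS + TrSSAA only pays extra on the fences and foliage that need it, while 8xS raises the cost of everything on screen, and the two also differ in how their sample patterns treat ordinary polygon edges.
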
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Wreckage
Originally posted by: Ackmed

I'm sorry to say, that's probably the worst post from you I have ever seen. You suggest ATi cheats with their drivers, and that NV's IQ is as good as ATi's? Find one review that agrees with you.

I think the FiringSquad article backs that statement up.

I'm not sure why this upsets you? You act like ATI is perfect and anyone who says otherwise is personally attacking you.

Care to show me where the FS article backs up NV having better IQ, or ATi cheating? From the quotes I dropped, they give ATi the nod, all the while not even using settings to further ATi's IQ, such as HQ AF, yet they did use them for NV, such as gamma corrected AA.

Please feel free to explain how the (incomplete) FS article backs up either of his statements. He says that ATi having better IQ is a lie. I've posted links that show HardOCP believes ATi has the clear advantage. I know he has an account over there; perhaps he should go tell Brent and Kyle that they are liars and that ATi doesn't have the IQ advantage? Somehow, I doubt that will happen.

Originally posted by: schneiderguy
Originally posted by: Ackmed

The best image quality, bar none, is delivered by ATI in its 6xAA 16xAF mode with Adaptive AA turned on.

they didn't even test 8x AA for nvidia cards, you'd better hope that 6x AA looks better than 4x :confused:

True, they didn't. Probably because it would have absolutely killed frame rates, to the point where it wouldn't be playable. NV's 8xAA is very nice, no doubt about that, in older games where you can get playable frames. 8xAA also doesn't help with every object either. But I do hope they use it for the second part. Since they didn't show any FPS numbers, though, it would be pretty misleading.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Ulfhednar
and since I am really lame at coming up with insults, I will just blag this crap instead."

:confused:

When you can't win on logic, get personal. Just ignore it. This site has really cleaned up, but I see we still have our moments. Personally, I am happy with my X1900 GT (I won it, so the price was right), and I am sure if it were a 7900GT I would be happy with that also. :music:

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
True, they didn't. Probably because it would have absolutely killed frame rates, to the point where it wouldn't be playable. NV's 8xAA is very nice, no doubt about that, in older games where you can get playable frames. 8xAA also doesn't help with every object either. But I do hope they use it for the second part. Since they didn't show any FPS numbers, though, it would be pretty misleading.
If they're just reviewing image quality, all they need is one frame. FPS doesn't matter if the material being reviewed is a picture. If they were doing a relative analysis of image quality versus performance, then I could see the need for them to show FPS numbers.

They did not set the Nvidia cards to 8xAA, but I don't see how they needed to. Nvidia has more accurate AA in some applications. Period. However, ATI's 6xAA can be just as streamlined, if not more so, than Nvidia's in certain areas. Then, if the question of HDR becomes relevant, ATI's is the only AA that can exist with FP16 HDR (the kind used in all but Source games). With the AA issue, it is a toss-up. Nvidia has the edge in TrAA, ATI in HDR+AA. I can see reasoning for both. While I love HDR+AA, I equally love the ability to saw through fences in BF2 and know exactly what and where my bullets hit.

Once we begin to look at AF, the choice is a little more clear-cut, IMO: angle independent vs. angle dependent. Every game can visually benefit from being rendered with angle-independent AF.
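
A crude way to picture the angle dependence: the hardware decides how many anisotropic samples a pixel gets, and older designs capped that number more aggressively when the surface was tilted around a roughly 45-degree screen angle, which is what the classic "flower" pattern in AF test tools visualizes. The cap curve in the sketch below is invented purely to show the shape of the effect, not measured from any GPU:

```python
import math

MAX_ANISO = 16  # user-requested maximum anisotropy (16x AF)

def angle_independent_aniso(required):
    """Grant whatever the texture footprint needs, up to the user limit."""
    return min(required, MAX_ANISO)

def angle_dependent_aniso(required, angle_deg):
    """Same, but with an invented cap that sags toward 2x near 45 degrees."""
    falloff = abs(math.sin(math.radians(2 * angle_deg)))  # 0 at 0/90 deg, 1 at 45 deg
    cap = MAX_ANISO - falloff * (MAX_ANISO - 2)
    return min(required, cap)

for angle in [0, 15, 30, 45, 60, 90]:
    print(f"surface angle {angle:2d} deg: "
          f"angle-independent {angle_independent_aniso(16):5.1f}x   "
          f"angle-dependent {angle_dependent_aniso(16, angle):5.1f}x")
```

On angle-dependent hardware, floors and walls seen at those in-between angles effectively get less filtering and look blurrier or shimmer, which matches the reviewers' comments above about R5xx's angle-independent HQ AF showing more texture detail.
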
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
of course fps matters.. it makes absolutely no difference how good it looks if you can't use it -- unless of course you purchase high end video cards to display static images.