7950GX2 and X1900XTX screenshot comparison


imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Todd33
I love the black and white responses, "7950gx2 looks like crap compared to the xtx". They look 98% the same in SS and 99.5% the same in action. Pick whichever card performs the best in the games you play.

Originally posted by: Genx87
If you can really see a difference in those shots you have better eyes than me.

Or you want to believe.

/shrug

Originally posted by: Gamingphreek
Excellent review!!! I guess both sides have their de facto strengths.

Nvidia hands down wins in AA (albeit with a much larger performance hit, as it is SS instead of MS).

ATI hands down wins in AF (the angle-independent AF is really, really nice); couple that with the HDR+AA support and it becomes more noticeable.

I don't think this really brought out anything that people didn't know. It was generally accepted that ATI had the edge in IQ. What this did do was prove fanboys wrong on both sides. The differences are not that drastic, which further reinforces the fact that anyone who said they couldn't stand something from one company or the other is BSing. Neither side's downsides are large enough to warrant a change in cards.

-Kevin

Since IQ is basically equal between the cards, as this review proves and as the vast majority of the unbiased members here have said, I don't really care about ATI's disappearing fence trick.

But the exaggerated bloom and HDR on ATI in Oblivion will burn out a cornea.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: RobertR1
Catalyst AI has no effect on IQ, unlike the Quality and HQ settings on nvidia cards. This is not a flame, just the way it is.

Nitro, would you mind retaking those Oblivion screenshots in HDR/4xAA with the XTX? I'm curious whether it's just the bloom effect that is exaggerated on the ATI cards, or HDR as well.

There are screenshots for both bloom and HDR already.

You have to realize that the character has an illumination spell cast in those shots, which is why you see a glow around them. It's supposed to be there. I didn't realize it would look like that on the XTX, as I took the GX2 shots first and didn't even notice the spell was cast. However, once I had the XTX back in I wanted to keep the conditions as identical as possible, even though the effect was a bit overdone.

edit: Maybe this evening I'll retake the XTX shots without the illumination spell cast.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nitromullet
Originally posted by: ArchAngel777
Actually, you were not who I had in mind when I posted that, but I guess you would fall into that category after this post :p

Do the differences really subtract from the gameplay though? I mean, if you didn't compare the two actively, would you have really noticed a difference when running through the game? I have my doubts on this. But if so, then it just proves that IQ isn't a moot point for you and others who feel the same way. In this situation, I guess you need to go with what you like best. I just never spot differences, or if I do, they never really bother me unless they are really bad (like how Far Cry looks with the 1.0 retail release: textures distorted, bright, washed out, etc. It looks like crap). That kind of IQ would really bother me, but that would be broken IQ, IMO.
Well, I do notice things like texture detail, flicker, and aliased edges when I play a game, even when I'm not looking for them. Things like that kind of jump out at me. It doesn't ruin the game for me, provided it's a good game, but I do notice it.


No doubt, as do I. But my question is this: if you played a game through on an nVidia card with 4X AA and 16X AF, and then the next day played the game on an ATI card, provided the colors were set the same, would you be able to stop and say "here are the differences"? Granted, it may be possible in some games, especially with AF filtering, but I think generally you won't notice a difference.

Even with the screenshots, I have to compare them side by side, literally, to find the differences. I suppose my attention to detail may be lacking, but that is probably to my benefit, really :D
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Nitro, either you or HardOCP is doing something wrong with the HDR shots because this is what they had to say in their 7950 review:

We did not notice any differences what-so-ever between the platforms as far as shaders go. We looked very closely at HDR quality between NVIDIA and ATI in GRAW, Half Life 2: Episode 1 and Oblivion. Both platforms had the same colors, the same intensity, the same tone, the same glare and shine. There were no differences to the naked eye.

Anyhow, in your Oblivion shots it looks like the nVidia card doesn't even have HDR enabled. As for AA quality, they had this to say:

When we compared ATI's AA and NVIDIA's AA with their DEFAULT settings we still found ATI to provide better image quality. However, once we enabled "Gamma Correct AA" on the NVIDIA GPU based video cards this changed. Now with our naked eye looking at games like Half Life 2: Episode 1, FEAR, and World of Warcraft we did not notice any differences between the cards at 2X or 4X AA. This was side-by-side, the same scenes (using many different saved points in HL2: EP1) and we saw no difference with our own eyes in-game.

Now, we looked into this further by taking screenshots and zooming into them in Photoshop at 300x their normal size. What we found is still at 2X AA ATI is ever so slightly better, the colors just seem more blended and less "harsh" than NVIDIA has. However, at 4X AA they looked damn near identical.

So to gamers who are not going to be zooming into static images in Photoshop but are just going to be playing the game AA image quality looks the same in-game. Just make sure you enable "Gamma Correct AA" from the driver control panel.

Yes nVidia has higher AA modes for single cards but that doesn't mean they have better AA.
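
Incidentally, if you'd rather not eyeball zoomed screenshots in Photoshop, a quick pixel-diff script does much the same job. Here's a minimal sketch, assuming Python with Pillow installed and two same-resolution captures (the filenames are just made-up examples):

    # Minimal sketch: highlight per-pixel differences between two same-size screenshots.
    # Assumes Pillow is installed; the filenames below are hypothetical.
    from PIL import Image, ImageChops

    ati = Image.open("xtx_4xaa.png").convert("RGB")
    nv = Image.open("gx2_4xaa.png").convert("RGB")

    diff = ImageChops.difference(ati, nv)  # absolute per-channel difference
    print("differing region:", diff.getbbox())  # None means the two captures are identical

    # Scale the differences up so subtle AA/gamma changes become visible,
    # roughly what zooming to 300% in Photoshop and squinting does.
    amplified = diff.point(lambda v: min(255, v * 8))
    amplified.save("diff_amplified.png")

Anything that only shows up in the amplified diff is exactly the sort of difference you'd never catch in-game.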
 

drum

Diamond Member
Feb 1, 2003
6,810
4
81
looks to me like you're going to have a marvelous gaming experience either way you go
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Nitro, either you or HardOCP is doing something wrong with the HDR shots because this is what they had to say in their 7950 review:

We did not notice any differences what-so-ever between the platforms as far as shaders go. We looked very closely at HDR quality between NVIDIA and ATI in GRAW, Half Life 2: Episode 1 and Oblivion. Both platforms had the same colors, the same intensity, the same tone, the same glare and shine. There were no differences to the naked eye.

Edit: NM, just noticed you never took any HDR shots with the GX2. Still, even the bloom shots look like they're lacking bloom.

As for AA quality, they had this to say:

When we compared ATI's AA and NVIDIA's AA with their DEFAULT settings we still found ATI to provide better image quality. However, once we enabled "Gamma Correct AA" on the NVIDIA GPU based video cards this changed. Now with our naked eye looking at games like Half Life 2: Episode 1, FEAR, and World of Warcraft we did not notice any differences between the cards at 2X or 4X AA. This was side-by-side, the same scenes (using many different saved points in HL2: EP1) and we saw no difference with our own eyes in-game.

Now, we looked into this further by taking screenshots and zooming into them in Photoshop at 300x their normal size. What we found is still at 2X AA ATI is ever so slightly better, the colors just seem more blended and less "harsh" than NVIDIA has. However, at 4X AA they looked damn near identical.

So to gamers who are not going to be zooming into static images in Photoshop but are just going to be playing the game AA image quality looks the same in-game. Just make sure you enable "Gamma Correct AA" from the driver control panel.

Yes nVidia has higher AA modes for single cards but that doesn't mean they have better AA (i.e., compared at the same setting).
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
ArchAngel777, where are you going with this?
Originally posted by: ArchAngel777
Do the differences really subtract from the gameplay though?
Originally posted by: nitromullet
It doesn't ruin the game for me, provided it's a good game, but I do notice it.
Question asked and answered.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: 5150Joker
Nitro, either you or HardOCP is doing something wrong with the HDR shots because this is what they had to say in their 7950 review:

We did not notice any differences what-so-ever between the platforms as far as shaders go. We looked very closely at HDR quality between NVIDIA and ATI in GRAW, Half Life 2: Episode 1 and Oblivion. Both platforms had the same colors, the same intensity, the same tone, the same glare and shine. There were no differences to the naked eye.

Edit: NM, just noticed you never took any HDR shots with the GX2. Still, even the bloom shots look like they're lacking bloom.

As for AA quality, they had this to say:

When we compared ATI's AA and NVIDIA's AA with their DEFAULT settings we still found ATI to provide better image quality. However, once we enabled "Gamma Correct AA" on the NVIDIA GPU based video cards this changed. Now with our naked eye looking at games like Half Life 2: Episode 1, FEAR, and World of Warcraft we did not notice any differences between the cards at 2X or 4X AA. This was side-by-side, the same scenes (using many different saved points in HL2: EP1) and we saw no difference with our own eyes in-game.

Now, we looked into this further by taking screenshots and zooming into them in Photoshop at 300x their normal size. What we found is still at 2X AA ATI is ever so slightly better, the colors just seem more blended and less "harsh" than NVIDIA has. However, at 4X AA they looked damn near identical.

So to gamers who are not going to be zooming into static images in Photoshop but are just going to be playing the game AA image quality looks the same in-game. Just make sure you enable "Gamma Correct AA" from the driver control panel.

Yes nVidia has higher AA modes for single cards but that doesn't mean they have better AA (i.e., compared at the same setting).
Better is a subjective term in this case. If I can run 8xAA smoothly, I'd rather be able to do so. In HL2, I'm sure the XTX could handle 8xAA, but you're limited to 6xAA. IMO, that gives NV the edge in AA where 8x is possible, and with a card like the GX2 it is possible in a number of games. Of the games I looked at, only Oblivion required me to reduce the AA.

As far as HardOCP goes:

It's interesting that they would be doing an "apples-to-apples" comparison all of a sudden, since they pretty much never do that. From their testing standpoint, the max possible playable settings have always been their thing.

HardOCP had difficulty getting the screenshots at the same time of day in Oblivion... All you have to do is hit "T" and you can wait in the same spot for as long as you want (it basically moves time forward very quickly), which seems like a useful feature when testing image quality.

HardOCP also took screenshots of the chain-link fence from a different location in HL2, and in their shots I don't see the fence disappearing with the Radeon, but on my PC it does.

I'm not saying I don't trust HardOCP, just that these are things they missed or overlooked in their examination. Also, after having posted just 4 screens from 3 different games, I have much more of an appreciation for what these guys do. This takes a lot more time/effort than I expected.

Originally posted by: beggerking
Hi Nitro:

did you notice any shimmering with your 7950gx2?
Yes I did, mostly in WoW. That being said, the XTX also shimmers in WoW, although not as much.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nitromullet
Where are you going with this?
Originally posted by: ArchAngel777
Do the differences really subtract from the gameplay though?
Originally posted by: nitromullet
It doesn't ruin the game for me, provided it's a good game, but I do notice it.
Question asked and answered.


Yes, that is why in my next post I asked another question, or rather, more or less clarified it. Read up a few posts.

But to expound: you said you notice things like AA and AF, but that would be comparing against games with no AF or AA. Comparing apples to apples, with both systems at similar settings, would you be able to tell the difference without a side-by-side shot comparison? Does my question make sense?
 

Praxis1452

Platinum Member
Jan 31, 2006
2,197
0
0
Originally posted by: Crusader
Originally posted by: Todd33
I love the black and white responses, "7950gx2 looks like crap compared to the xtx". They look 98% the same in SS and 99.5% the same in action. Pick whichever card performs the best in the games you play.

Originally posted by: Genx87
If you can really see a difference in those shots you have better eyes than me.

Or you want to believe.

/shrug

Originally posted by: Gamingphreek
Excellent review!!! I guess both sides have their de facto strengths.

Nvidia hands down wins in AA (albeit with a much larger performance hit, as it is SS instead of MS).

ATI hands down wins in AF (the angle-independent AF is really, really nice); couple that with the HDR+AA support and it becomes more noticeable.

I don't think this really brought out anything that people didn't know. It was generally accepted that ATI had the edge in IQ. What this did do was prove fanboys wrong on both sides. The differences are not that drastic, which further reinforces the fact that anyone who said they couldn't stand something from one company or the other is BSing. Neither side's downsides are large enough to warrant a change in cards.

-Kevin

Since IQ is basically equal between the cards, as this review proves and as the vast majority of the unbiased members here have said, I don't really care about ATI's disappearing fence trick.

But the exaggerated bloom and HDR on ATI in Oblivion will burn out a cornea.

What about the "needle in the eyes" shimmering? OMFG, don't turn it into a flamewar. UGH, I swear you and Joker are the same person with two opposite personalities...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Where are you going with this?
Originally posted by: ArchAngel777
Do the differences really subtract from the gameplay though?
Originally posted by: nitromullet
It doesn't ruin the game for me, provided it's a good game, but I do notice it.
Question asked and answered.


Yes, that is why in my next post I asked another question, or rather, more or less clarified it. Read up a few posts.

But to expound: you said you notice things like AA and AF, but that would be comparing against games with no AF or AA. Comparing apples to apples, with both systems at similar settings, would you be able to tell the difference without a side-by-side shot comparison? Does my question make sense?

That was my response to your next post...

My point was that this thread is about IQ, not whether you, I, or anyone else notices it without paying attention to it. I did notice because I was paying attention to it, and I wanted to take a more careful look.

Everyone else in this thread seems to be interested in the topic, but you for some reason want to question the value of the comparison and give us your review of HL2. Take the thread and its contents for what it is. Either you care or you don't, but you don't need to convince me of anything.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nitromullet
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Where are you going with this?
Originally posted by: ArchAngel777
Do the differences really subtract from the gameplay though?
Originally posted by: nitromullet
It doesn't ruin the game for me, provided it's a good game, but I do notice it.
Question asked and answered.


Yes, that is why in my next post I asked another question, or rather, more or less clarified it. Read up a few posts.

But to expound: you said you notice things like AA and AF, but that would be comparing against games with no AF or AA. Comparing apples to apples, with both systems at similar settings, would you be able to tell the difference without a side-by-side shot comparison? Does my question make sense?

That was my response to your next post...

My point was that this thread is about IQ, not whether you, I, or anyone else notices it without paying attention to it. I did notice because I was paying attention to it, and I wanted to take a more careful look.

Everyone else in this thread seems to be interested in the topic, but you for some reason want to question the value of the comparison and give us your review of HL2. Take the thread and its contents for what it is. Either you care or you don't, but you don't need to convince me of anything.


Hey, take a breather, bud, I am not trying to convince you of anything. I am simply interested in whether you can tell the difference. It would seem to indicate otherwise, since you said the following:

Do you still have your GTX? Honestly, I had agreed with you prior to making these screenshots, and I had somewhat expected the IQ and sharpness to be considerably better with the XTX. However, after running these two cards back-to-back in the exact same locations in the games, the difference really isn't noticeable. The default color saturation on the XTX is generally more pleasing to the eyes though (it's warmer), and I think this actually has a lot to do with what you are experiencing.

So here we have an example of you essentially thinking there was a difference, and then, after comparing, finding that there wasn't much of a difference at all. I hope you can understand why I am asking this in the first place.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
So here we have an example of you essentially thinking there was a difference, and then, after comparing, finding that there wasn't much of a difference at all. I hope you can understand why I am asking this in the first place.
As far as being able to tell the difference, that will be up to you. In some games I'm pretty sure I could tell you which was which on certain maps, while in others I probably couldn't. I guess the whole reason for taking and posting the screenshots was to challenge my preconceived notions of which card should look better.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: RobertR1
Catalyst AI has no effect on IQ, unlike the Quality and HQ settings on nvidia cards. This is not a flame, just the way it is.

Nitro, would you mind retaking those Oblivion screenshots in HDR/4xAA with the XTX? I'm curious whether it's just the bloom effect that is exaggerated on the ATI cards, or HDR as well.

I just don't know what to say here.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: 5150Joker
Nitro, either you or HardOCP is doing something wrong with the HDR shots because this is what they had to say in their 7950 review:

We did not notice any differences what-so-ever between the platforms as far as shaders go. We looked very closely at HDR quality between NVIDIA and ATI in GRAW, Half Life 2: Episode 1 and Oblivion. Both platforms had the same colors, the same intensity, the same tone, the same glare and shine. There were no differences to the naked eye.

Anyhow, in your Oblivion shots it looks like the nVidia card doesn't even have HDR enabled. As for AA quality, they had this to say:

When we compared ATI's AA and NVIDIA's AA with their DEFAULT settings we still found ATI to provide better image quality. However, once we enabled "Gamma Correct AA" on the NVIDIA GPU based video cards this changed. Now with our naked eye looking at games like Half Life 2: Episode 1, FEAR, and World of Warcraft we did not notice any differences between the cards at 2X or 4X AA. This was side-by-side, the same scenes (using many different saved points in HL2: EP1) and we saw no difference with our own eyes in-game.

Now, we looked into this further by taking screenshots and zooming into them in Photoshop at 300x their normal size. What we found is still at 2X AA ATI is ever so slightly better, the colors just seem more blended and less "harsh" than NVIDIA has. However, at 4X AA they looked damn near identical.

So to gamers who are not going to be zooming into static images in Photoshop but are just going to be playing the game AA image quality looks the same in-game. Just make sure you enable "Gamma Correct AA" from the driver control panel.

Yes nVidia has higher AA modes for single cards but that doesn't mean they have better AA.

What about what people like Nitro, myself, and other board members have to say? Does HardOCP override us or something? Nah, this is the way I like it. You have an utterly neutral member like Nitro doing comparisons and sharing with everyone. THIS is what the vid forum should be for. Things like this.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Where are you going with this?
Originally posted by: ArchAngel777
Do the differences really subtract from the gameplay though?
Originally posted by: nitromullet
It doesn't ruin the game for me, provided it's a good game, but I do notice it.
Question asked and answered.


Yes, that is why in my next post I asked another question, or rather, more or less clarified it. Read up a few posts.

But to expound: you said you notice things like AA and AF, but that would be comparing against games with no AF or AA. Comparing apples to apples, with both systems at similar settings, would you be able to tell the difference without a side-by-side shot comparison? Does my question make sense?

That was my response to your next post...

My point was that this thread is about IQ, not whether you, I, or anyone else notices it without paying attention to it. I did notice because I was paying attention to it, and I wanted to take a more careful look.

Everyone else in this thread seems to be interested in the topic, but you for some reason want to question the value of the comparison and give us your review of HL2. Take the thread and its contents for what it is. Either you care or you don't, but you don't need to convince me of anything.


Hey, take a breather, bud, I am not trying to convince you of anything. I am simply interested in whether you can tell the difference. It would seem to indicate otherwise, since you said the following:

Do you still have your GTX? Honestly, I had agreed with you prior to making these screenshots, and I had somewhat expected the IQ and sharpness to be considerably better with the XTX. However, after running these two cards back-to-back in the exact same locations in the games, the difference really isn't noticeable. The default color saturation on the XTX is generally more pleasing to the eyes though (it's warmer), and I think this actually has a lot to do with what you are experiencing.

So here we have an example of you essentially thinking there was a difference, and then, after comparing, finding that there wasn't much of a difference at all. I hope you can understand why I am asking this in the first place.

I can tell the difference. Only on certain maps however. Each card has "visible" advantages over the other, in different features. I believe a large reason for this thread was to either confirm, or put to rest, the alleged cataclysmic differences in IQ between NV and ATI's latest offerings.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: keysplayr2003
Originally posted by: RobertR1
Catalyst AI has no effect on IQ, unlike the Quality and HQ settings on nvidia cards. This is not a flame, just the way it is.

Nitro, would you mind retaking those Oblivion screenshots in HDR/4xAA with the XTX? I'm curious whether it's just the bloom effect that is exaggerated on the ATI cards, or HDR as well.

I just don't know what to say here.

I took two shots from the same place with bloom and HDR with the XTX (both 4xAA/8xAF), without the illumination spell. They look less washed out. Makes me wonder why the GX2 doesn't show the spell illumination as much.

screenies

Keys, the new screenshots are already up on your ftp site. That mouse over rocks btw :)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: keysplayr2003
Originally posted by: RobertR1
Catalyst AI has no effect on IQ, unlike the Quality and HQ settings on nvidia cards. This is not a flame, just the way it is.

Nitro, would you mind retaking those Oblivion screenshots in HDR/4xAA with the XTX? I'm curious whether it's just the bloom effect that is exaggerated on the ATI cards, or HDR as well.

I just don't know what to say here.


That is the theory of Catalyst AI: on low it applies game-specific optimizations that do not have any effect on IQ, except of course to improve it (like HDR with AA). Now, with these vid companies, theory and practice can differ.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Much thanks to ken and others for the pictures, and hosting.

Originally posted by: Praxis1452
What about the "needle in the eyes" shimmering? OMFG, don't turn it into a flamewar. UGH, I swear you and Joker are the same person with two opposite personalities...

Oh, it's there; http://www.newegg.com/Product/ProductLi...ption=&srchInDesc=&minPrice=&maxPrice=

HardOCP gives ATi the image quality crown and talks about various parts of IQ. What I like most about Hard is that they go more in-depth and use settings most others do not. Gone are the canned benchmarks with no mention of IQ.

What Hard said about AA:
When we compared ATI's AA and NVIDIA's AA with their DEFAULT settings we still found ATI to provide better image quality. However, once we enabled "Gamma Correct AA" on the NVIDIA GPU based video cards this changed. Now with our naked eye looking at games like Half Life 2: Episode 1, FEAR, and World of Warcraft we did not notice any differences between the cards at 2X or 4X AA. This was side-by-side, the same scenes (using many different saved points in HL2: EP1) and we saw no difference with our own eyes in-game.

Now, we looked into this further by taking screenshots and zooming into them in Photoshop at 300x their normal size. What we found is still at 2X AA ATI is ever so slightly better, the colors just seem more blended and less "harsh" than NVIDIA has. However, at 4X AA they looked damn near identical.

Basically, it's a draw if you use gamma-corrected AA, unless of course you use 8xAA on NV cards, which will then look better. However, no game was ever playable with it for me, sadly, not even with SLI. So it was a worthless option for me.

What they said about AF:
While regular AF was definitely identical in-game, turning on ATI's special "High Quality" AF turned the tables big-time. ATI gives you a wonderful option of being able to have a less angle dependent form of AF enabled. What this means is that at and around the 45 degree angles on textures ATI can do more filtering than NV can. Any texture that is on a steep angled surface will receive better AF.

This made a huge difference in Oblivion with its outdoor scenes having rolling hills and steep terrain angles. This also helped in Half Life 2: Episode 1. This didn't help too much in FEAR which is a rather dark game and has mostly 90 degree angles. This also didn't help much in Age of Empires III which also doesn't have any steep angles. This did though help immensely in World of Warcraft which has large outdoor scenes.

Any way you slice it ATI has the upper hand in AF with the option to enable "High Quality" AF.

What they said about shimmering:
Ok, let's just get right to it shall we. NVIDIA has worse texture crawling and moiré with the default driver settings than ATI does. Notice I said worse, because ATI is not entirely out of the woods, as they have it as well, just to a lesser extent.

World of Warcraft has to be the worst game for texture crawling that we have come across. It is just downright annoying. You can notice it outdoors on cobblestone or dirt paths mostly. You can also see it on the grass texture but since it is a dark color it is harder to spot.

Texture crawling, as we are calling it, is basically where it looks like there are little marching ants crawling on your texture as you move away and toward it. For example, walking down a path in WoW you see that on the ground there is a wavy or shimmering pattern that moves with you. You can read all about Moiré patterns here. It is really bad on NVIDIA GPUs with default driver settings in WoW. ATI also suffers from this but it isn't as bad as it is on NVIDIA GPUs.

Another game where we saw this was in Half Life 2 and Half Life 2: Episode 1. We didn't notice it in the Citadel maps in Episode 1, but once we got outside in City 17 we started seeing it on the ground. Moiré was also very noticeable in HL2: EP 1 outdoors.

Texture crawling and moiré are not something you can see in a screenshot; it can only be represented with movement. In the above screenshot of Half Life 2: Episode 1 we outlined a portion of road in this map. This is one area where we saw texture crawling and moiré as we moved down the road. It was visible on both ATI and NVIDIA hardware, but worse on NV hardware. In any place with detailed textures like this road you can spot texture crawling.

The great news is that this can be reduced on NVIDIA hardware; the bad news is that it takes a performance hit to do so. You can manually turn off all the filtering options in the advanced driver control panel as well as clamp the LOD bias. This greatly reduces it, but it doesn't entirely do away with it, and it also takes a performance hit to do so. Still, if you want the best texture quality you will have no choice but to take the hit.

Overall ATI has the better texture quality, which by default has less texture crawling, and allows the "High Quality" AF option.

Overall image quality:
We took an in-depth look at image quality in this evaluation. We had two 30" LCDs side-by-side and were able to look at IQ in-game. Culminating everything we learned we can confidently say that the ATI Radeon X1900 XTX CrossFire platform offered the best image quality in games.

We have all the proof to back this up. With the Radeon X1900 XTX and CrossFire platform antialiasing plus HDR is possible in Oblivion. ATI has the ability to do multisampling plus floating point blending HDR. ATI has a "High Quality" anisotropic filtering option which has real tangible image quality benefits in games with little to no performance hit. In large outdoor games this is a huge benefit. It also helps in games that have very high quality and detailed textures like Half Life 2: Episode 1 and Ghost Recon. Having these two displays side-by-side proved that texture crawling and moiré are worse on NVIDIA hardware at default driver settings compared to ATI hardware. We especially noticed this in World of Warcraft, Half Life 2: Episode 1, and Battlefield 2. When you look at all this added up it becomes clear that ATI still holds the image quality crown.

From my personal experience, ATi, to me, has the IQ crown, mainly for two reasons. First, on my setup, NV shimmers much, much worse than ATi; so much that it's downright distracting. Not everyone has hardware that shows it as much as mine, so it's not going to be a problem for most people. Second, HQ AF is for real, and I can easily see the difference. AA quality at 2x and 4x looks the same to me while playing games; it's a wash to me. 8x for NV looks better than 6x for ATi, but since it wasn't playable for me (at 1920x1200), it was a useless option.

Most people here have not seen, or have not had, NV and ATi high-end hardware in their personal PC, and still shots only show so much. It's something you really have to see in person. Everyone has their own opinion, and mine is that ATi has better IQ overall, just as it is HardOCP's when having both LCDs side by side, one with NV and the other with ATi. They said ATi looked better as well.
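
Since shimmering and moiré only show up in motion, screenshots won't capture it. One rough way to put a number on it is to record a short clip with the view held still and average the frame-to-frame change; just a sketch, assuming Python with OpenCV and NumPy installed, and a hypothetical capture file:

    # Rough sketch: score "shimmer" as the mean frame-to-frame pixel change in a clip
    # captured with the camera held still. OpenCV/NumPy assumed; the filename is hypothetical.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("wow_cobblestone_path.avi")
    ok, prev = cap.read()
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores.append(np.mean(cv2.absdiff(gray, prev)))  # mean absolute change vs. previous frame
        prev = gray
    cap.release()

    if scores:
        print("average per-frame change:", sum(scores) / len(scores))

Run the same clip path on both cards and the card with the higher score is the one crawling more; it won't replace seeing it in person, but it is repeatable.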
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: keysplayr2003

What about what people like Nitro, myself, and other board members have to say? Does HardOCP override us or something? Nah, this is the way I like it. You have an utterly neutral member like Nitro doing comparisons and sharing with everyone. THIS is what the vid forum should be for. Things like this.


Well, I always doubt the claims of certain people, but I do believe what Nitro has to say, since I consider him impartial. That's why I asked whether he or HardOCP made a mistake, without outright claiming his shots were wrong. Anyhow, I too have a 7900 GS and an ATi X1900XTX and may do an image quality comparison of my own when I get back home in a few days.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Well, I always doubt the claims of certain people, but I do believe what Nitro has to say, since I consider him impartial. That's why I asked whether he or HardOCP made a mistake, without outright claiming his shots were wrong. Anyhow, I too have a 7900 GS and an ATi X1900XTX and may do an image quality comparison of my own when I get back home in a few days.

I would actually like to see someone else post some Oblivion screen caps with an NV card with bloom+AA. I would re-check my own cap to make sure that bloom was on as well, but the GX2 is on a brown truck somewhere on its way to its new home, so that won't be possible. I'd also like to see some comparisons from different games.