7950GX2 and X1900XTX screenshot comparison


Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Excellent review!!! I guess both sides have their de facto strengths.

Nvidia hands down wins in AA (albeit with a much larger performance hit, since it is SS instead of MS)

ATI hands down wins in AF (the angle-independent AF is really, really nice); couple that with the HDR+AA support and it becomes more noticeable.

I don't think this really brought out anything that people didn't know. It was generally accepted that ATI had the edge in IQ. What this did do was prove fanboys wrong on both sides. The differences are not that drastic, which further reinforces that anyone who said they couldn't stand something from one company or the other is BSing. Neither side's downsides are large enough to warrant a change in cards.

-Kevin
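[Editor's note: a toy model of the shading-cost point above. Supersampling (SS) runs the pixel shader once per sub-sample, while plain multisampling (MS) shades once per pixel and only multiplies the coverage/depth work, which is why SS-based modes carry a much larger performance hit. The function and numbers below are purely illustrative, not any vendor's actual pipeline.]

```python
# Illustrative sketch: why supersampling costs more than multisampling.
# SSAA shades every sub-sample; MSAA shades once per pixel, with the extra
# samples affecting only coverage/depth resolve. Hypothetical model, not a
# real graphics API.

def shader_invocations(width, height, samples, mode):
    """Return how many times the pixel shader runs for one frame."""
    pixels = width * height
    if mode == "supersample":
        return pixels * samples   # every sub-sample is shaded
    elif mode == "multisample":
        return pixels             # shaded once; samples only add coverage/z work
    raise ValueError(mode)

res = (1600, 1200)
print(shader_invocations(*res, 4, "supersample"))  # 7680000 shader runs
print(shader_invocations(*res, 4, "multisample"))  # 1920000 shader runs
```

At 4x, the supersampled frame does four times the shading work of the multisampled one, which matches the "much larger performance hit" noted above.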
 

CKXP

Senior member
Nov 20, 2005
926
0
0
thanks nitromullet & keysplayr2003, great job to both of you... this is something that the video forum needed :thumbsup:


 

bjc112

Lifer
Dec 23, 2000
11,460
0
76
Originally posted by: nitromullet
Originally posted by: AzNPinkTuv
7950gx2 looks like crap compared to the xtx.. just look in WoW at the tree behind Camp Taurajo... i mean it looks all blurred and ****** and looks a lot worse when compared to the xtx
Well, overall the XTX image is more crisp. I have noticed that during gameplay, so I was surprised that it wasn't quite as apparent when I was doing the screenshots. WoW is a tough one for me really because even though the XTX has a more crisp picture, the 8xAA on the GX2 really is better than the XTX's 6xAA in WoW.

I'm actually quite surprised at how differently the two cards render the same games. Honestly, I wish I could have the AA of NV and the AF of ATI in a single GPU.



I almost think the GX2 looks better than the xtx in some of those screens.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Great pics guys. I'm going to have to install HL2 (haven't played it yet with my XTX)
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
Originally posted by: GundamSonicZeroX
OMG! I can't believe I got freaked out over this! The difference is NOT that big.

Look at that first HL2 shot - the textures on the wall to the right are horrendously blurry on the GX2. And the fence: the GX2 seems to leave more aliasing in it, while the XTX just makes it disappear... Honestly I prefer Nvidia's TRAA over ATI's AAA, but ATI's AF is much better than Nvidia's.
 

Fadey

Senior member
Oct 8, 2005
410
6
81
also the gx2 can have higher aa and af than a xtx in most games... and uses less power, unless it's crossfire, then gg with your 2 hot vacuum cleaners.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Fadey
also the gx2 can have higher aa and af than a xtx in most games... and uses less power, unless it's crossfire, then gg with your 2 hot vacuum cleaners.

Arguably, the GX2 can run higher AA than the XTX in any game, since 8xAA can be selected even with a single NVIDIA gpu, whereas the XTX maxes out at 6xAA. IMO, and so far most people agree with me, the GX2 does have better AA, but the XTX has better AF.

As far as heat/noise/power/cost comparisons, we've all heard the rhetoric (from both sides) before, let's keep this thread an IQ discussion.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Rage187
Originally posted by: dug777
ATI's AF is just so much better...

Now if only they could get close to NV's AA.

The only difference I can see in AA is the fence, whereas the AF difference is massive.
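[Editor's note: a sketch of why "angle-independent" AF, mentioned several times above as the X1900's strength, matters. Angle-optimized filtering on older hardware reduced the effective anisotropy on surfaces tilted away from the preferred axes, so floors and walls viewed at odd angles got blurrier mipmaps. The falloff curve below is a made-up illustration of the idea, not either vendor's actual hardware formula.]

```python
# Hypothetical model of angle-optimized anisotropic filtering: the requested
# anisotropy (e.g. 16x) is only delivered near "preferred" angles, and the
# effective tap count falls off in between. Angle-independent AF delivers the
# requested level regardless of surface angle. Purely illustrative numbers.

def effective_aniso(requested, angle_deg, angle_optimized):
    if not angle_optimized:
        return requested                      # full quality at every angle
    # distance (in degrees) from the nearest preferred 45-degree axis
    off_axis = min(angle_deg % 45, 45 - angle_deg % 45)
    # fictional falloff: lose up to 3/4 of the taps at the worst angles
    return max(2, round(requested * (1 - 0.75 * off_axis / 22.5)))

for angle in (0, 10, 22.5, 45):
    print(angle, "->", effective_aniso(16, angle, angle_optimized=True))
```

In this toy model, a 16x request drops to 4x effective anisotropy midway between the preferred angles, which is the kind of angle-dependent blur the screenshots in the thread appear to show.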
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
Wow, nice work Nitro and Keys...

I guess I am going to start looking for an ATI card again, the AF on my nvidia drives me nuts, but I guess I already posted this too :)

Luckily I won't be playing HL2 again any time soon, other than a quick run through EP1, so that shouldn't bother me too much. AA+HDR is more important to me.

Thanks guys!

:thumbsup:
 

Bull Dog

Golden Member
Aug 29, 2005
1,985
1
81
Originally posted by: nitromullet
Originally posted by: AzNPinkTuv
7950gx2 looks like crap compared to the xtx.. just look in WoW at the tree behind Camp Taurajo... i mean it looks all blurred and ****** and looks a lot worse when compared to the xtx
Well, overall the XTX image is more crisp. I have noticed that during gameplay, so I was surprised that it wasn't quite as apparent when I was doing the screenshots. WoW is a tough one for me really because even though the XTX has a more crisp picture, the 8xAA on the GX2 really is better than the XTX's 6xAA in WoW.

I'm actually quite surprised at how differently the two cards render the same games. Honestly, I wish I could have the AA of NV and the AF of ATI in a single GPU.


Not to mention HDR+AA with X1 ATI cards. I play Oblivion at 1440x900 2xAA 16xAF and most settings maxed. I get 25+ FPS outdoors and that's good enough for me. Incidentally, 1440x900 with 2xAA looks better than 1680x1050 with no AA. Performance is about the same too.
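[Editor's note: rough arithmetic behind the "1440x900 + 2xAA performs about the same as 1680x1050 with no AA" observation above. With MSAA the shader still runs once per pixel, so shading work tracks pixel count, while fill/memory work scales roughly with the number of color and depth samples. The functions are illustrative, not a benchmark.]

```python
# Back-of-envelope comparison of the two settings Bull Dog describes.
# Shading cost ~ pixel count; ROP/bandwidth cost ~ pixel count * MSAA level.
# Hypothetical cost model, for illustration only.

def pixels(w, h):
    return w * h

def msaa_samples(w, h, aa):
    # color/z samples the ROPs must fill and resolve
    return w * h * max(aa, 1)

print(pixels(1440, 900), pixels(1680, 1050))                # 1296000 vs 1764000 shaded pixels
print(msaa_samples(1440, 900, 2), msaa_samples(1680, 1050, 1))  # 2592000 vs 1764000 samples
```

The lower resolution with 2xAA shades about 27% fewer pixels but fills about 47% more samples, so depending on whether a game is shader- or bandwidth-bound, the two settings plausibly land close together, consistent with "performance is about the same."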
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I think only the most anal minded people would care about the difference in IQ in these cards. I am pretty picky, but certainly not as anal as some people (I won't mention names!) but even, I, could care less about these extremely minor differences. As far as I am concerned, both ATI and nVidia are on par with each other in IQ and it really is a moot point.

I don't know about the rest of you, but when I am playing, I am not singling a 4 X 4 pixel segment of my monitor looking for issues with it, I simply look at the bigger picture (I play the darn game).

Edit: And for the record, I am not denying that there are differences between both of them, I am just saying that they are very, very, very minor differences in the overwhelming majority of the games.

BTW, great comparison, it just proves that both cards are top notch, IMO.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: ArchAngel777
I think only the most anal minded people would care about the difference in IQ in these cards. I am pretty picky, but certainly not as anal as some people (I won't mention names!) but even, I, could care less about these extremely minor differences. As far as I am concerned, both ATI and nVidia are on par with each other in IQ and it really is a moot point.

I don't know about the rest of you, but when I am playing, I am not singling a 4 X 4 pixel segment of my monitor looking for issues with it, I simply look at the bigger picture (I play the darn game).

Edit: And for the record, I am not denying that there are differences between both of them, I am just saying that they are very, very, very minor differences in the overwhelming majority of the games.

BTW, great comparison, it just proves that both cards are top notch, IMO.

Dammit, it's 'couldn't care less' unless you do care about something :p

Are you calling the massive differences in AF on the wall in that first hl2 pic an extremely minor difference? ;)
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: nitromullet
Originally posted by: Fadey
also the gx2 can have higher aa and af than a xtx in most games... and uses less power , unless its crossfire then gg with your 2 hot vaccum cleaners.

Arguably, the GX2 can run higher AA than the XTX in any game, since 8xAA can be selected even with a single NVIDIA gpu, whereas the XTX maxes out at 6xAA. IMO, and so far most people agree with me, the GX2 does have better AA, but the XTX has better AF.

As far as heat/noise/power/cost comparisons, we've all heard the rhetoric (from both sides) before, let's keep this thread an IQ discussion.


From your experience nitromullet, is the 8xAA useable in modern games at 1680x1050 or higher with max in-game details? I see 8xAA as a boon in older games as you still get playable framerates, but my experience with two 7800GTs@550/1300 was that, in modern games, 8xAA is still too taxing on the graphics system. I couldn't get playable framerates so I had to drop down to 4xAA which negated the superior AA advantage. Maybe the 7950GX2 is faster than my cards were so I'm wondering what your experience has been.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: dug777
Originally posted by: ArchAngel777
I think only the most anal minded people would care about the difference in IQ in these cards. I am pretty picky, but certainly not as anal as some people (I won't mention names!) but even, I, could care less about these extremely minor differences. As far as I am concerned, both ATI and nVidia are on par with each other in IQ and it really is a moot point.

I don't know about the rest of you, but when I am playing, I am not singling a 4 X 4 pixel segment of my monitor looking for issues with it, I simply look at the bigger picture (I play the darn game).

Edit: And for the record, I am not denying that there are differences between both of them, I am just saying that they are very, very, very minor differences in the overwhelming majority of the games.

BTW, great comparison, it just proves that both cards are top notch, IMO.

Dammit, it's 'couldn't care less' unless you do care about something :p

Are you calling the massive differences in AF on the wall in that first hl2 pic an extremely minor difference? ;)


Actually, it was designed to trick people so that I came across as not caring, but really did care. Uhm... Yeah... Ok, I made a mistake and you are right.

Strong Opinion Warning: HL2 is crap, so I could care less about that game! Not sure why Valve got so much attention for it. The graphics engine is sub-par, the "loading" every 1/4 mile in the game was annoying, and the physics that people drooled over were so simplistic that it was laughable how people were captured by them. Another reason is that Valve cannot be respected as a company because of STEAM, and most of all because they claimed their demo wasn't scripted and it was found to be scripted... There, just my opinion, and maybe it gives people a glimpse as to why I don't think the AF in HL2 is a big deal on certain cards, or any card for that matter :p
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Elfear

From your experience nitromullet, is the 8xAA useable in modern games at 1680x1050 or higher with max in-game details? I see 8xAA as a boon in older games as you still get playable framerates, but my experience with two 7800GTs@550/1300 was that, in modern games, 8xAA is still too taxing on the graphics system. I couldn't get playable framerates so I had to drop down to 4xAA which negated the superior AA advantage. Maybe the 7950GX2 is faster than my cards were so I'm wondering what your experience has been.

Well, yes and no. 8xAA is usable in HL2:EP1 (which I consider modern), but it's not in Oblivion, where 4xAA is about the most you can reasonably expect at 1680x1050.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ArchAngel777
I think only the most anal minded people would care about the difference in IQ in these cards. I am pretty picky, but certainly not as anal as some people (I won't mention names!) but even, I, could care less about these extremely minor differences. As far as I am concerned, both ATI and nVidia are on par with each other in IQ and it really is a moot point.

I don't know about the rest of you, but when I am playing, I am not singling a 4 X 4 pixel segment of my monitor looking for issues with it, I simply look at the bigger picture (I play the darn game).

Edit: And for the record, I am not denying that there are differences between both of them, I am just saying that they are very, very, very minor differences in the overwhelming majority of the games.

BTW, great comparison, it just proves that both cards are top notch, IMO.

I guess I must be one of the anal people that you are referring to. I was actually quite surprised at the considerable differences between the two cards.

The thing that I am most pleased about the outcome thus far of this thread is the realization that there is a shortcoming (by comparison) with the XTX's AA. In the past the blanket statement has always been, "ATI has better IQ", which I would take to mean that ATI's output either looks as good or better, and not that there is a trade off.

On a related note, I had not disabled Catalyst AI (low setting in ATT) when I took the screens, so I tried that out as well. There was no visible change in the AA, and the fence still disappeared in HL2.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: nitromullet
Originally posted by: ArchAngel777
I think only the most anal minded people would care about the difference in IQ in these cards. I am pretty picky, but certainly not as anal as some people (I won't mention names!) but even, I, could care less about these extremely minor differences. As far as I am concerned, both ATI and nVidia are on par with each other in IQ and it really is a moot point.

I don't know about the rest of you, but when I am playing, I am not singling a 4 X 4 pixel segment of my monitor looking for issues with it, I simply look at the bigger picture (I play the darn game).

Edit: And for the record, I am not denying that there are differences between both of them, I am just saying that they are very, very, very minor differences in the overwhelming majority of the games.

BTW, great comparison, it just proves that both cards are top notch, IMO.

I guess I must be one of the anal people that you are referring to. I was actually quite surprised at the considerable differences between the two cards.

The thing that I am most pleased about the outcome thus far of this thread is the realization that there is a shortcoming (by comparison) with the XTX's AA. In the past the blanket statement has always been, "ATI has better IQ", which I would take to mean that ATI's output either looks as good or better, and not that there is a trade off.

On a related note, I had not disabled Catalyst AI (low setting in ATT) when I took the screens, so I tried that out as well. There was no visible change in the AA, and the fence still disappeared in HL2.

Any chance you could post a new screenshot of that? I'm not surprised you couldn't see much difference tho; IIRC when ATI introduced it they couldn't see any difference either. High Quality settings, I assume?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
It would have to be later today, as I am not at home. The settings are all High Quality on both cards, and the only optimization that I left enabled was Catalyst AI. I've never noticed a difference with it on or off, and the Chuck patch requires that it be enabled for HDR to work in Oblivion, so I tend to forget it's on.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: nitromullet
Originally posted by: ArchAngel777
I think only the most anal minded people would care about the difference in IQ in these cards. I am pretty picky, but certainly not as anal as some people (I won't mention names!) but even, I, could care less about these extremely minor differences. As far as I am concerned, both ATI and nVidia are on par with each other in IQ and it really is a moot point.

I don't know about the rest of you, but when I am playing, I am not singling a 4 X 4 pixel segment of my monitor looking for issues with it, I simply look at the bigger picture (I play the darn game).

Edit: And for the record, I am not denying that there are differences between both of them, I am just saying that they are very, very, very minor differences in the overwhelming majority of the games.

BTW, great comparison, it just proves that both cards are top notch, IMO.

I guess I must be one of the anal people that you are referring to. I was actually quite surprised at the considerable differences between the two cards.

The thing that I am most pleased about the outcome thus far of this thread is the realization that there is a shortcoming (by comparison) with the XTX's AA. In the past the blanket statement has always been, "ATI has better IQ", which I would take to mean that ATI's output either looks as good or better, and not that there is a trade off.

On a related note, I had not disabled Catalyst AI (low setting in ATT) when I took the screens, so I tried that out as well. There was no visible change in the AA, and the fence still disappeared in HL2.

Actually, you were not who I had in mind when I posted that, but I guess you would fall into that category after this post :p

Do the differences really subtract from the gameplay, though? I mean, if you didn't compare the two actively, would you have really noticed a difference when running through the game? I have my doubts on this. But if so, then it just proves that IQ isn't a moot point for you and others who feel the same way. In that situation, I guess you need to go with what you like best. I just never spot differences, or if I do, they never really bother me unless they are really bad (like how Far Cry looks with the 1.0 retail release: textures distorted, bright, washed out, etc. Looks like crap). That kind of IQ would really bother me, but that would be broken IQ, IMO.

 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Catalyst AI has no effect on IQ, unlike the Quality and HQ settings on nvidia cards. This is not a flame, just the way it is.

Nitro, would you mind retaking those Oblivion screenshots in HDR/4xAA with the XTX? I'm curious whether it's just the Bloom effect that is exaggerated on the ATI cards, or HDR as well.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ArchAngel777
Actually, you were not who I had in mind when I posted that, but I guess you would fall into that category after this post :p

Do the differences really subtract from the gameplay though? I mean, if you didn't compare the two actively, would you have really noticed a difference when running through the game? I have my doubts on this. But if so, then it just proves that IQ isn't a moot point for you and others who feel the same way. In this situation, I guess you need to go with what you like the best. I just never spot differences, or if I do, they never really bother me unless they are really bad (Like Far Cry looks when using 1.0 retail release, textures are distorted, bright, washed out, etc... Looks like crap) that kind of IQ would really bother me, but that would be broken IQ, IMO.
Well, I do notice things like texture detail, flicker, and aliased edges when I play a game even when I'm not looking for it. Things like that kind of jump out at me. It doesn't ruin the game for me, provided it's a good game, but I do notice it.