VR-Zone x2900xt review


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
are we reading the same review?
:confused:

as to the IQ

Results seem almost indistinguishable, even at a blown-up view such as this. If one tries to nitpick, it seems that the filtering is only slightly rougher on the ATi, whereby the transition in color from one pixel to the next is very slightly less smooth than on the NVIDIA.
As seen in Company Of Heroes, the quality is so close it's rather tough to distinguish. Similarly, the filtering seems to be slightly rougher on the ATi; I'm leaning a bit towards preferring NVIDIA's filtering, which comes across slightly smoother. But more scenes and checks must be conducted.
Even this is really hard to pick apart. If you want to nitpick, you may find that, in one small area, the filtering seems a little sharper on the NVIDIA:
No clear conclusion
Vegetation shows a fair bit more of a difference. Here you can clearly see that the 8800GTS renders the closer bunch of leaves sharper, yet rougher. There's a clearer distinction between the colors of the bunches of leaves on the GTS. This may also mean slightly more aliasing, with the resultant effect being some "texture swimming" as your character moves or as the wind blows.

FINALLY ...
At normal screen size you can already tell the difference. Look at the gun closest to your character; you can see it is starkly sharper on the 8800 compared to the 2900. Notice especially the words and buttons on the gun. The words are sharper, and so are the finer details on the buttons, on the card below. Even the dirt track further away from the character looks sharper rendered on the 8800.
you got me :p

OK .. ati fanboys and nvidiots have been arguing over this one for years...

more
In terms of Jaggie-elimination Anti-Aliasing, the maximum capability of the X2900XT is very close to the capability of the 8800GTS. Remember, this is 16x Sampling Wide-Tent AA at 8x Level versus 16x Quality AA.

Ah yes, we can see quite a difference now. The Anti-Aliasing is better on the 2900 card as compared to the 8800 card when both are set to run at their maximum AA capability.

Well, it does seem that there is an Image Quality Improvement on the X2900XT cards as compared to the previous-generation X1950 cards. The X2900 slightly outshines the GeForce 8800 in Anti-Aliasing Quality, while the Anisotropic Filtering on the 8800 still seems to be a little more accurate than on the X2900XT.
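
(For anyone wondering what "wide-tent" means in the quote above: the resolve filter weights samples by their distance from the pixel center and reaches into neighbouring pixels, which is how an 8x MSAA mode ends up drawing on roughly 16 samples. Below is a rough Python sketch of the idea only; it illustrates tent-versus-box resolving and is not ATI's actual filter, and the sample positions, weights, and 1.5-pixel radius are made-up numbers.)

# Illustrative only: contrast a plain "box" MSAA resolve (average the
# sub-samples of one pixel) with a tent-filtered resolve that also pulls in
# samples from adjacent pixels, trading a little sharpness for smoother
# gradients, which is roughly the idea behind the "wide-tent" CFAA modes.

def box_resolve(samples):
    # ordinary MSAA resolve: straight average of this pixel's sub-samples
    return sum(samples) / len(samples)

def tent_resolve(samples_with_distances, radius=1.5):
    # weight falls off linearly with distance from the pixel center (a "tent"),
    # so sub-samples sitting in adjacent pixels still contribute a little
    total_w = total_c = 0.0
    for color, distance in samples_with_distances:
        w = max(0.0, 1.0 - distance / radius)
        total_w += w
        total_c += w * color
    return total_c / total_w if total_w else 0.0

# 8 sub-samples of this pixel plus 8 borrowed from its neighbors (made-up data)
own = [(1.0, 0.2), (0.0, 0.4)] * 4        # (color, distance from pixel center)
borrowed = [(0.5, 1.1), (0.5, 1.3)] * 4
print(box_resolve([c for c, _ in own]))   # plain 8x resolve of this pixel only
print(tent_resolve(own + borrowed))       # 8x samples plus wide tent ("16x")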
does the IQ on your 1900xtx "suck"?

it is IMPROVED in the 2900
:roll:

:D
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Extelleron
If you stare at a still screenshot, you might see a slight difference between ATI and nVidia. If that's what you want out of your GPU, then fine, good for you. I prefer to actually play the games.

Anyway, has anyone tested what ATI's IQ looks like w/ 24xAA + 16xHQAF?

That war cry didn't work with nvidia during the G7x days and it won't work for ATI now.
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: Matt2
Originally posted by: Extelleron
If you stare at a still screenshot, you might see a slight difference between ATI and nVidia. If that's what you want out of your GPU, then fine, good for you. I prefer to actually play the games.

Anyway, has anyone tested what ATI's IQ looks like w/ 24xAA + 16xHQAF?

That war cry didn't work with nvidia during the G7x days and it won't work for ATI now.

6 months late and very little competition, GG! I would have sold my GTS if the 2900 was a better card, guess not.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Matt2
Originally posted by: Extelleron
If you stare at a still screenshot, you might see a slight difference between ATI and nVidia. If that's what you want out of your GPU, then fine, good for you. I prefer to actually play the games.

Anyway, has anyone tested what ATI's IQ looks like w/ 24xAA + 16xHQAF?

That war cry didn't work with nvidia during the G7x days and it won't work for ATI now.

The GeForce 7 series had terrible AF with the shimmering effect, and overall the X1000 series had NOTICEABLY better IQ. The GeForce 8 series MIGHT have SLIGHTLY better IQ that's noticeable in still screenshots under certain scenarios. There's a big difference there.

Anyway, read the Tweaktown HD 2900XT review in the other thread. IDK if they were using newer drivers or just got slightly different results, but the 2900XT and 8800GTX are very, very close at 2560x1600, which means the GTS gets blown away.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: apoppin
Originally posted by: Matt2
Very disappointing.

:(

Nvidia has a faster card and better IQ. When was the last time that happened?

with x1800

with r8500

with Radeon DDR

and with Rage Fury 32

only the nvidia IQ is NOT better ... those pics last night are photo-chopped

and *remember* we have a $400 card competing pretty well with a $650 one
-with *major* driver improvements coming and likely better DX10 perf too

not to mention those gawd-awful nvidia drivers that they STILL can't get right --- after SIX long months

i know which one i will pick

[2950xt] ;)

Did you read the VR-Zone review? Right now the X2900 XT is competing fairly against the 8800 GTS 640: it's faster in some things, even in others, and a LOT slower in some due to driver immaturity. It also consumes about 65W more than the GTS, so it needs at least a 500W PSU and two 6-pin PCI-E connectors to run, since that is typically the minimum level at which you see two 6-pin PCI-E connectors.
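
(As a rough back-of-the-envelope check on that: only the ~65W delta and the ~500W PSU figure come from the post above; the GTS board power and rest-of-system numbers in the sketch below are assumed ballpark values for illustration.)

# Rough PSU-headroom arithmetic. The 65 W delta and the 500 W PSU figure are
# from the post above; the other numbers are assumed ballpark values only.
GTS_BOARD_POWER = 145                    # W, assumed for an 8800 GTS 640
X2900XT_POWER = GTS_BOARD_POWER + 65     # the ~65 W delta quoted above
REST_OF_SYSTEM = 200                     # W, assumed CPU + board + drives + fans

load = X2900XT_POWER + REST_OF_SYSTEM
print(f"Estimated load under gaming: {load} W")
print(f"Headroom on a 500 W supply: {500 - load} W")  # why ~500 W is the floor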

Nvidia has the sharper AF quality this generation, while ATI has the better MSAA quality with its 24xAA versus the 16xQ AA setting. So overall image quality would depend on whether you prefer AA or AF, and on which settings you use as well; I wonder what things are like at 8xQ versus ATi's 8xAA.

I think this card should overall be quicker performance-wise than the 8800 GTS 640 once the drivers mature more, but once again you have the issue of a card that consumes about 50% more juice while not being significantly faster in games.

MSRP of the 8800 GTX is 599USD, not sure where you're getting 650 from, and it can be had for as cheap as 550USD nowadays due to the sheer amount of time it's been on the market.

Given the right price I would say depending on your needs the X2900 XT could be good, but it really depends on what you prefer. I probably can't consider this card until the drivers mature a bit, same idea with Vista. ;)

I am not willing to play guinea pig for Microsoft or ATi/AMD.
 

rise

Diamond Member
Dec 13, 2004
9,116
46
91
Originally posted by: Bateluer
I want to see how the 2900XT performs with more refined drivers. Some of those tests are obviously driver related.
same here, i'll get one depending on the iq issue.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
There is also something fishy about their NWN2 numbers. My 1900XT runs NWN2 at almost double the frame rate of their 1950XTX. Could be driver related as well, but you'd think the 1950XTX would perform better than its predecessor.
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: Bateluer
There is also something fishy about their NWN2 numbers. My 1900XT runs NWN2 at almost double the frame rate of their 1950XTX. Could be driver related as well, but you'd think the 1950XTX would perform better than its predecessor.

If there is no video test, they have to make their own, so it WILL be different from your results.
 

enz660hp

Senior member
Jun 19, 2006
242
0
0
This card was made to compete with the 8800gts. I believe with further driver improvements, it probably will be a bit faster, but again, at the cost of heat & power. I wonder what an hr-03+ could do to that card...

*sigh* time to wait another 6 months for their top models... by that time, nvidia will probably release the 8900, etc etc...
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Extelleron
If you stare at a still screenshot, you might see a slight difference between ATI and nVidia. If that's what you want out of your GPU, then fine, good for you. I prefer to actually play the games.

Anyway, has anyone tested what ATI's IQ looks like w/ 24xAA + 16xHQAF?

That war cry didn't work with nvidia during the G7x days and it won't work for ATI now.

The GeForce 7 series had terrible AF with the shimmering effect, and overall the X1000 series had NOTICEABLY better IQ. The GeForce 8 series MIGHT have SLIGHTLY better IQ that's noticeable in still screenshots under certain scenarios. There's a big difference there.

Anyway, read the Tweaktown HD 2900XT review in the other thread. IDK if they were using newer drivers or just got slightly different results, but the 2900XT and 8800GTX are very, very close at 2560x1600, which means the GTS gets blown away.

Except for the fact that if you set the G7x series to HQ and set the LOD to clamp, the shimmering went away completely and then the IQ difference was pretty negligible.

If R600 pulls "close" to the 8800GTX at 2560x1600, it may be a hollow victory for the HD2900XT at this point. Not a lot of people play at such a high res, and without an 8800GTS pace card in the mix, it's hard to tell just how much better it is.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Nightmare225
Who else is waiting for the true battle: 8900 vs HD 2950?

I think that's where I'm going to get my DX10 card.

I think I'll just throw in an X1900 CF card and wait till 8900 vs 2950
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Pugnate
You mean the HD2950 vs the 8800GTX?

That might be competitive.

That was pretty funny, ;)

But no, I meant:

8900GTX vs HD2950XTX
8900GTS vs HD2950XT
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Originally posted by: swtethan
Originally posted by: Bateluer
There is also something fishy about their NWN2 numbers. My 1900XT runs NWN2 at almost double the frame rate of their 1950XTX. Could be driver related as well, but you'd think the 1950XTX would perform better than its predecessor.

If there is no video test, they have to make their own, so it WILL be different from your results.

My frame rate almost never drops that low in actual gameplay. What did they create, a never-ending battle scene with two dozen characters and spells going off every split second?
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Matt2
Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Extelleron
If you stare at a still screenshot, you might see a slight difference between ATI and nVidia. If that's what you want out of your GPU, then fine, good for you. I prefer to actually play the games.

Anyway, has anyone tested what ATI's IQ looks like w/ 24xAA + 16xHQAF?

That war cry didn't work with nvidia during the G7x days and it won't work for ATI now.

The GeForce 7 series had terrible AF with the shimmering effect, and overall the X1000 series had NOTICEABLY better IQ. The GeForce 8 series MIGHT have SLIGHTLY better IQ that's noticeable in still screenshots under certain scenarios. There's a big difference there.

Anyway, read the Tweaktown HD 2900XT review in the other thread. IDK if they were using newer drivers or just got slightly different results, but the 2900XT and 8800GTX are very, very close at 2560x1600, which means the GTS gets blown away.

Except for the fact that if you set the G7x series to HQ and set the LOD to clamp, the shimmering went away completely and then the IQ difference was pretty negligible.

If R600 pulls "close" to the 8800GTX at 2560x1600, it may be a hollow victory for the HD2900XT at this point. Not a lot of people play at such a high res, and without an 8800GTS pace card in the mix, it's hard to tell just how much better it is.

The GeForce 7 had fine image quality if you turned all the driver settings up, but then let's see the performance. The 7800/7900 could barely compete in performance WITH the default driver settings... turn everything up and let's see how playable it is.

Anyway, before everyone jumps on the "8800 IQ is better" bandwagon... I don't see the difference between any of those shots and apparently neither did the reviewer... he said there MIGHT have been a very small difference in favor of nVidia... because of those words, everyone on here starts talking about how sucky ATI IQ is.

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Extelleron
The GeForce 7 had fine image quality if you turned all the driver settings up, but then let's see the performance. The 7800/7900 could barely compete in performance WITH the default driver settings... turn everything up and let's see how playable it is.

OMG, are you serious right now?

Turning the quality setting to HQ did incur a performance hit, but in no way shape or form made it "unplayable".
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Matt2
Originally posted by: Extelleron
The GeForce 7 had fine image quality if you turned all the driver settings up, but then let's see the performance. The 7800/7900 could barely compete in performance WITH the default driver settings... turn everything up and let's see how playable it is.

OMG, are you serious right now?

Turning the quality setting to HQ did incur a performance hit, but in no way shape or form made it "unplayable".

I can't say I've ever had a 7900GTX to try it out, but those tests that were run at high IQ settings (high quality AA +AF) saw a huge difference between X1900 series cards and the 7900 series, much larger than with normal IQ settings.

BTW I might sound like the biggest ATI fanboy in the world to you, but that's not really true, or at least it wasn't until recently. When I built my PC last March I ordered a 7900GT and would have gone with that had it not been for Monarch taking too long to ship it out. Instead I ordered an X1900XT at newegg. This January I purchased an 8800GTS, so it's not like I've never had an nVidia card and am a total ATI fan.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Matt2
Very disappointing.

:(

Nvidia has a faster card and better IQ. When was the last time that happened?

Yeah, it looks like the GTS beats it on every front. Faster, Cheaper, Better IQ, less heat, less noise, less power used, better drivers.

I was hoping after 7 months that we would see something truly amazing.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Extelleron
The GeForce 7 had fine image quality if you turned all the driver settings up, but then let's see the performance. The 7800/7900 could barely compete in performance WITH the default driver settings... turn everything up and let's see how playable it is.

OMG, are you serious right now?

Turning the quality setting to HQ did incur a performance hit, but in no way shape or form made it "unplayable".

I can't say I've ever had a 7900GTX to try it out, but those tests that were run at high IQ settings (high quality AA +AF) saw a huge difference between X1900 series cards and the 7900 series, much larger than with normal IQ settings.

Depends on the game; there are some cases of 20% differences between the X1950 XTX and the 7900 GTX, talking about 4xAA/16xAF settings. But even with those differences the 7900 GTX was not unplayable.

The below review illustrates the differences between the X1950 XTX and 7900 GTX even with HQ Settings.

http://www.xbitlabs.com/articles/video/display/gf8800-games.html
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Nightmare225
Who else is waiting for the true battle: 8900 vs HD 2950?

I think I may wait. By then, both ATi and NV will hopefully have their driver issues worked out, and DX10 games will just be starting to show up. It's pretty clear that the performance will increase for the 2900XT with newer drivers.

But I may be overseas again... so I may have to wait even longer. ;)
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: Fallen Kell
Basically they say the image quality is good, but Nvidia's is better. The screen shots they have tend to show that as well. That said, it is very close for some things, but when you get down to it, from what they saw and show in the article, Nvidia does have a crisper look.

My how the mighty have fallen... ATi was traditionally the IQ champion. :(
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: coldpower27
Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Extelleron
The GeForce 7 had fine image quality if you turned all the driver settings up, but then let's see the performance. The 7800/7900 could barely compete in performance WITH the default driver settings... turn everything up and let's see how playable it is.

OMG, are you serious right now?

Turning the quality setting to HQ did incur a performance hit, but in no way shape or form made it "unplayable".

I can't say I've ever had a 7900GTX to try it out, but those tests that were run at high IQ settings (high quality AA +AF) saw a huge difference between X1900 series cards and the 7900 series, much larger than with normal IQ settings.

Depends on the game; there are some cases of 20% differences between the X1950 XTX and the 7900 GTX, talking about 4xAA/16xAF settings. But even with those differences the 7900 GTX was not unplayable.

The below review illustrates the differences between the X1950 XTX and 7900 GTX even with HQ Settings.

http://www.xbitlabs.com/articles/video/display/gf8800-games.html

When you go for the high-IQ settings, with 6xAA (ATI) or 8xAA (nVidia), the 7900GTX is less than half as fast as the X1950XTX.