VR-Zone x2900xt review


ronnn

Diamond Member
May 22, 2003
3,918
0
71
The big disappointment for me is no DX10 games yet. Hell, my X1900GT is adequate for my monitor, and it's hard to justify $400 until a game I haven't played appeals to me. :beer:
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Can anyone explain why these geniuses who overclock with ridiculous cooling devices STILL don't bother to put any heatsinks on the RAM chips? How do you expect to get a higher RAM OC if you have nothing taking the heat away from the RAM chips?! This used to be a common sense part of OC'ing a videocard but I've seen a couple reviews now try overclocking without that and then wonder why the RAM seems to be at its max already.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: yacoub
Can anyone explain why these geniuses who overclock with ridiculous cooling devices STILL don't bother to put any heatsinks on the RAM chips? How do you expect to get a higher RAM OC if you have nothing taking the heat away from the RAM chips?! This used to be a common sense part of OC'ing a videocard but I've seen a couple reviews now try overclocking without that and then wonder why the RAM seems to be at its max already.

I saw a thread (I think it was at Hardocp) where someone made ramsinks outta pennies. :laugh:
 

swtethan

Diamond Member
Aug 5, 2005
9,071
0
0
Originally posted by: Wreckage
Originally posted by: yacoub
Can anyone explain why these geniuses who overclock with ridiculous cooling devices STILL don't bother to put any heatsinks on the RAM chips? How do you expect to get a higher RAM OC if you have nothing taking the heat away from the RAM chips?! This used to be a common sense part of OC'ing a videocard but I've seen a couple reviews now try overclocking without that and then wonder why the RAM seems to be at its max already.

I saw a thread (I think it was at Hardocp) where someone made ramsinks outta pennies. :laugh:

cheap cooling!!!
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Extelleron
Originally posted by: coldpower27
Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Extelleron
The GeForce 7 had fine image quality if you turned all the driver settings up, but then let's see the performance. The 7800/7900 could barely compete in performance WITH the default driver settings... turn everything up and let's see how playable it is.

OMG, are you serious right now?

Turning the quality setting to HQ did incur a performance hit, but in no way, shape or form made it "unplayable".

I can't say I've ever had a 7900GTX to try it out, but the tests that were run at high IQ settings (high-quality AA + AF) saw a huge difference between the X1900 series cards and the 7900 series, much larger than with normal IQ settings.

It depends on the game; there are some cases of 20% differences between the X1950 XTX and the 7900 GTX at 4xAA/16xAF settings. But even with those differences the 7900 GTX was not unplayable.

The review below illustrates the differences between the X1950 XTX and the 7900 GTX even with HQ settings.

http://www.xbitlabs.com/articles/video/display/gf8800-games.html

When you go for high IQ, with 6xAA (ATI) or 8xAA (nVidia), the 7900GTX is less than half as fast as the X1950XTX.

I noticed that well in advance, but that differential exists with or without High Quality settings, as 8xS is a much heavier mode on Nvidia hardware. At 4xAA settings, High Quality doesn't make a difference. Most people who own Nvidia hardware don't use 8xS in any modern games; it's simply too performance-demanding with or without HQ.

The point is to explain that HQ alone is not enough to make the 7900 GTX unplayable in games.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
still, despite nvidia's market superiority, the high end x18/19xx cards were clearly superior to nvidia's 7 series in both performance and image quality. even w/o the shimmering nvidia could not match ati's HQAF.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: CaiNaM
still, despite nvidia's market superiority, the high end x18/19xx cards were clearly superior to nvidia's 7 series in both performance and image quality. even w/o the shimmering nvidia could not match ati's HQAF.

This came at a price: they were not only more power-hungry, they were also much more costly for ATI to make. Performance and image quality at whatever cost isn't always a good thing. Nvidia chose to concede the performance and quality war in exchange for winning the financial war. A better long-term decision, it seems, as we now have the 8800 series, which does the opposite but got the advantage of time to market.

Not to mention the X1800s were quite late, too.


 

Falloutboy

Diamond Member
Jan 2, 2003
5,916
0
76
Looks actually pretty decent if the 2900 XT streets at $350, and with a bit more driver work it will be on par with the GTS. I was hoping for the 2600 numbers since I'm in the market for something in the sub-$200 range and was hoping to get a ballpark on those cards before I make a choice this week.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Falloutboy
Looks actually pretty decent if the 2900 XT streets at $350, and with a bit more driver work it will be on par with the GTS. I was hoping for the 2600 numbers since I'm in the market for something in the sub-$200 range and was hoping to get a ballpark on those cards before I make a choice this week.

The 2600-series cards were delayed for several weeks, weren't they?
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: nullpointerus
Originally posted by: Falloutboy
Looks actually pretty decent if the 2900 XT streets at $350, and with a bit more driver work it will be on par with the GTS. I was hoping for the 2600 numbers since I'm in the market for something in the sub-$200 range and was hoping to get a ballpark on those cards before I make a choice this week.

The 2600-series cards were delayed for several weeks, weren't they?

I believe it was into Q3... :(

A huge shame that NVIDIA's had a full laptop and desktop line-up before ATI released even one card...
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Not very impressed with this review.

Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.

Who the eff uses no AA these days...

No Vista results either :frown:
I am sick of seeing reviews assume people are still stuck using a 5-year-old OS.

As for IQ, you guys are f*cking retarded.

AF was comparable between the 2900 & 8800s, maybe slightly better on the 8800s when blown up.
How much of the time playing games are you going to be saving screenshots, & then blowing them up to gaze at the AF features for hours? :roll:

AA was arguably slightly better on the 2900 when blown up.
Again, since when do people spend hours staring @ enlarged screenshots of games? :roll:

Stop being f*cking morons, seriously.

I'll be waiting for more reviews showing the cards @ 2560x1600 (where I want to be able to play) in Vista.



 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: n7


Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.

I'm guessing you didn't read the article; VR-Zone felt too embarrassed to post AA benchmarks because the HD 2900 had some driver bugs keeping it below even the X1950 in performance. :laugh:

 

Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
Like others, the lack of new games will allow me to wait longer for the 8900s and 2950s to come out. Better drivers, better performance, and hopefully better prices.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Originally posted by: n7
Not very impressed with this review.

Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.

Who the eff uses no AA these days...

No Vista results either :frown:
I am sick of seeing reviews assume people are still stuck using a 5-year-old OS.

As for IQ, you guys are f*cking retarded.

AF was comparable between the 2900 & 8800s, maybe slightly better on the 8800s when blown up.
How much of the time playing games are you going to be saving screenshots, & then blowing them up to gaze at the AF features for hours? :roll:

AA was arguably slightly better on the 2900 when blown up.
Again, since when do people spend hours staring @ enlarged screenshots of games? :roll:

Stop being f*cking morons, seriously.

I'll be waiting for more reviews showing the cards @ 2560x1600 (where I want to be able to play) in Vista.

The Battlefield 2142 screenshot isn't blown up, and we can see a clear, noticeable difference in image quality.

Thankfully I'm not wasting my time with that game, but I do find it rather troubling to see that ATi couldn't do better. The IQ between GeForce 7 and X1K wasn't even comparable: the X1K's HQ anisotropic filtering simply destroyed anything the GeForce 7 could do at its own highest settings. Now the GeForce 8's AF at its highest quality is nearly perfect and can hardly get better unless more samples are used (24x or 32x AF, for example). It's different now; blowing pictures up, down, left and right doesn't change the fact that, technically speaking, the GeForce 8's AF quality is not actually beaten by the new competition. The reviewer doesn't blow up screenshots to show how people should play their games, so don't act as if every player would do that just to give themselves a clear conscience about their purchase. It means that, whether the player notices it or not, the X2K series' AF quality is inferior, even if only to a negligible extent.

I am disappointed by that, because I couldn't care less about AA. If you keep playing your games with the highest amount of AA you can get, then that's your choice, but don't pretend that nobody plays without AA anymore, because the person typing this prefers AF over AA any time of the day. Jaggies are a minor concern for me when I'm sprinting to the enemy base to grab their flag, or casting spells left and right to get out of the tomb alive, but blurred textures on the walls, ceilings and ground all the way to the horizon do get annoying. That's how I see it. Anisotropic filtering is as important as, if not more important than, AA for me.

Now, with that said, am I asking you to think the same way I do? No. I am giving my point of view, and as much as I like ATi, and even though I've been buying ATi GPUs for the past four years, all I can do right now is look at the matter objectively and recognize that the IQ situation has been turned upside down in favor of the GeForce 8 series. I do not like that. I wanted ATi to show nVidia how to do AF "right", but nVidia got there first, and even if the distance between them is short, one of them is closer to the finish line than the other.

I am wondering about something though ...

Is the X2K's IQ hardware-limited, or is it a potential driver issue? Will future driver updates bring slightly or significantly better IQ? Is the X2K's IQ subject to improvements in any way, shape or form? I guess time will tell...

I am happy about its price, but I am unhappy and rather annoyed by the power consumption and inferior AF; those two points are my only real complaints about the new ATi generation.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: coldpower27
Originally posted by: apoppin
Originally posted by: Matt2
Very disappointing.

:(

Nvidia has a faster card and better IQ. When was the last time that happened?

with x1800

with r8500

with Radeon DDR

and with Rage Fury 32

only the nvidia IQ is NOT better ... those pics last night are a photo chop

and *remember* we have a $400 card competing pretty well with a $650 one
-with *major* driver improvements coming and likely better DX10 perf too

not to mention those gawd-awful nvidia drivers that they STILL can't get right --- after SIX long months

i know which one i will pick

[2950xt] ;)

Did you read the VR-Zone review? Right now the X2900 XT is competing fairly against the 8800 GTS 640: it's faster in some things, even in others, and a LOT slower in some due to driver immaturity. It also consumes about 65W more than the GTS, so it needs at least a 500W power supply and two 6-pin PCI-E connectors to run, which is the typical minimum for a card with dual 6-pin connectors.

Nvidia has sharper AF quality this generation, while ATI has the better MSAA quality when you compare their 24xAA against 16xQ AA. So overall image quality depends on whether you prefer AA or AF, and on which settings you compare; I wonder what things are like at 8xQ vs ATi's 8xAA.

I think this card, overall performance-wise, should be quicker than the 8800 GTS 640 once the drivers mature more, but once again you have the issue of a card that consumes about 50% more juice while not being significantly faster in games.

The MSRP of the 8800 GTX is 599 USD, not sure where you're getting 650 from, and it can be had for as cheap as 550 USD nowadays due to the sheer amount of time it's been on the market.

Given the right price, and depending on your needs, the X2900 XT could be good, but it really depends on what you prefer. I probably can't consider this card until the drivers mature a bit; same idea with Vista. ;)

I am not willing to play guinea pig for Microsoft or ATi/AMD.

did i read it ... i STUDIED it :p

agreed on all points ...

so what's your point?

i am *recommending* that people who can wait - do so

and *i* am PROBABLY waiting for HD2950xt ;)

come again?
:confused:
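For context on the power figures quoted above, here is a rough back-of-the-envelope sketch in Python (not from the review); it assumes the usual ~75 W budgets for the PCIe slot and for each 6-pin connector, and commonly cited approximate board-power figures for both cards.

# Rough power-budget sanity check (approximate, commonly cited figures).
PCIE_SLOT_W = 75            # a PCIe x16 slot supplies up to ~75 W
SIX_PIN_W = 75              # each 6-pin PCI-E connector is rated for ~75 W

hd2900xt_w = 215            # HD 2900 XT board power, approximate
gts640_w = 150              # 8800 GTS 640 board power, approximate

print(hd2900xt_w - gts640_w)          # ~65 W gap, in line with the claim above
print(PCIE_SLOT_W + SIX_PIN_W)        # 150 W: slot + one 6-pin falls short of ~215 W
print(PCIE_SLOT_W + 2 * SIX_PIN_W)    # 225 W: slot + two 6-pin connectors covers it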
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: Nightmare225
Originally posted by: n7


Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.

I'm guessing you didn't read the article; VR-Zone felt too embarrassed to post AA benchmarks because the HD 2900 had some driver bugs keeping it below even the X1950 in performance. :laugh:

ROFL @ bringing up driver bugs :roll:

I want a card that works well & with all capabilities in Vista.

Newsflash, nVidia has been sucking large nuts when it comes to drivers lately, especially in Vista, & unfortunately, i can't trust them to actually fix issues or release fixes, since improvements just aren't the nVidia way.

ATi has driver issues now with the HD 2900, yes, i am well aware of that.
But at least i can count on a new set of drivers from them every month, & what's likely going to be much better support in Vista.

Pardon my pissiness, but i'm not impressed with either choice i have right now :frown:

An 8800 GTX that's guaranteed to be a bugfest, or a poorer-performing HD 2900 XT that's still slower even after half a year's delay :roll:

And don't start with the Vista bashing; some of us prefer an improved OS.
I am very unimpressed w/ both nV & ATi & review sites for ignoring the fact that XP = soon to be irrelevant, as every new PC sold in the last few months has Vista.

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Extelleron
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.

What games will be able to run smoothly at a decent resolution with that level of AA? :confused:

Originally posted by: n7
Originally posted by: Nightmare225
Originally posted by: n7


Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.

I'm guessing you didn't read the article; VR-Zone felt too embarrassed to post AA benchmarks because the HD 2900 had some driver bugs keeping it below even the X1950 in performance. :laugh:

ROFL @ bringing up driver bugs :roll:

I want a card that works well & with all capabilities in Vista.

Newsflash, nVidia has been sucking large nuts when it comes to drivers lately, especially in Vista, & unfortunately, i can't trust them to actually fix issues or release fixes, since improvements just aren't the nVidia way.

ATi has driver issues now with the HD 2900, yes, i am well aware of that.
But at least i can count on a new set of drivers from them every month, & what's likely going to be much better support in Vista.

Pardon my pissiness, but i'm not impressed with either choice i have right now :frown:

An 8800 GTX that's guaranteed to be a bugfest, or a poorer-performing HD 2900 XT that's still slower even after half a year's delay :roll:

And don't start with the Vista bashing; some of us prefer an improved OS.
I am very unimpressed w/ both nV & ATi & review sites for ignoring the fact that XP = soon to be irrelevant, as every new PC sold in the last few months has Vista.

You're going to have to wait for the refreshes, then. Which everybody pretty much is...
But, whatever, I'll continue gaming smoothly at max settings on my GTX while you grumble about driver support...

:D
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Extelleron
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.

You must be joking. It looks like they used "Vaseline AA". The screenshots so far show major blurring and obscuring of detail. I guess it's "better AA" in the same way that taking your contacts out or rubbing hot peppers into your eyes is.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Nightmare225
Originally posted by: Extelleron
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.

What games will be able to run smoothly at a decent resolution with that level of AA? :confused:

Well, that remains to be seen. We don't know how the HD 2900 XT will perform with high levels of AA because of the current bug that kills performance when AA is enabled. The huge amount of memory bandwidth the 2900 XT has, and the fact that ATI has always done well with AA, make me think 24xAA will be possible at 1680x1050 or 1920x1200, though probably not at 2560x1600. nVidia's 16xQ AA w/ supersampling is unplayable at 1680x1050 on my 8800GTS anyway (except in CS:S).
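To put a rough number on that memory-bandwidth point, here's a quick sketch: peak bandwidth is just bus width (in bytes) times the effective transfer rate. The specs used are the commonly listed launch figures and should be treated as approximate.

# Peak memory bandwidth = (bus width / 8 bytes) * effective data rate (MT/s), in GB/s.
def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    return bus_width_bits / 8 * effective_mt_s / 1000

print(bandwidth_gb_s(512, 1656))   # HD 2900 XT: 512-bit, 828 MHz GDDR3 -> ~106 GB/s
print(bandwidth_gb_s(384, 1800))   # 8800 GTX: 384-bit, 900 MHz GDDR3 -> ~86 GB/s
print(bandwidth_gb_s(320, 1600))   # 8800 GTS 640: 320-bit, 800 MHz GDDR3 -> ~64 GB/s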
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Extelleron

Anyway, before everyone jumps on the "8800 IQ is better" bandwagon... I don't see the difference between any of those shots, and apparently neither did the reviewer... he said there MIGHT have been a very small difference in favor of nVidia... because of those words everyone on here starts talking about how sucky ATI IQ is.

Even the ones that were circled for you? And who circled them? Had to be the reviewer, yes?
So apparently, the reviewer DID see some difference. Which review are you referring to?

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Wreckage
Originally posted by: Extelleron
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.

You must be joking. It looks like they used "Vaseline AA". The screenshots so far show major blurring and obscuring of detail. I guess it's "better AA" in the same way that taking your contacts out or rubbing hot peppers into your eyes is.

Have you looked at the SS? I can tell for myself that the ATI screenshots look better, and VR-Zone agrees, even saying that it is CLEAR that ATI has the better AA. Meanwhile, you say that nVidia has much better AF because VR-Zone says the 2900's AF is a LITTLE rougher.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: keysplayr2003
Originally posted by: Extelleron

Anyway, before everyone jumps on the "8800 IQ is better" bandwagon... I don't see the difference between any of those shots, and apparently neither did the reviewer... he said there MIGHT have been a very small difference in favor of nVidia... because of those words everyone on here starts talking about how sucky ATI IQ is.

Even the ones that were circled for you? And who circled them? Had to be the reviewer, yes?
So apparently, the reviewer DID see some difference. Which review are you referring to?

The only screenshot where I can tell a noticeable difference (other than the ss's showing AA quality) is the one with the gun in BF 2142. nVidia is clearly better in that scenario, but I don't see such a difference, nor did the reviewers, anywhere else. They used a lot of "maybe" and "slightly" when describing nVidia being better than ATI.

Even though I'm not interested in buying it myself, I'm looking forward to the review of the HD 2600 series. It seems to be cut down a lot less than the 8600 is versus the 8800. The 2600XT has 120 shaders, 37.5% of the shaders the high end 2900XT has. The 8600 has 25% of the shaders the 8800 has. The 2600XT also runs at 800MHz. It'll have a 128-bit bus AFAIK but apparently is going to use GDDR4 memory.
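A quick check of those ratios, using the commonly reported stream-processor counts (treat these as reported figures, not verified here):

# Shader (stream-processor) counts as commonly reported for these parts.
hd2900xt_sp = 320   # HD 2900 XT
hd2600xt_sp = 120   # HD 2600 XT
gf8800gtx_sp = 128  # GeForce 8800 GTX
gf8600_sp = 32      # GeForce 8600 GTS/GT

print(hd2600xt_sp / hd2900xt_sp)   # 0.375 -> 37.5%, as stated above
print(gf8600_sp / gf8800gtx_sp)    # 0.25  -> 25%, as stated above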