While I don't agree with the thread title, I do agree with you on this point.
Originally posted by: nemesismk2
I was just getting annoyed by the lack of new games used in the 2900 XT reviews. I am sick of seeing Quake 4, Prey, and Doom 3 used in reviews because they're boring and outdated! :disgust:
Originally posted by: yacoub
GOSH I WAS SO SICK OF SEEING OLD GAMES TESTED LIKE SUPREEM COMANADAR AND STALKER WTF AMD FTMFHWFAA* *(FOR THE MOTHE F'ION HELICOPTER WIN FRIGGIN AIR ASSAULT) WHAT. SO MUCH BETTER NOW I CAN PRETEND THESE STATS FROM RAINBROW SICK VEAGAS AFFECT ALL GAMES I PLAY LIKE COUNTARTSRIEK AND CONAN THE LIBRARIAN. RAWRRRRR.
-------------------------
KNOWN AEGATIAMDNVIDIA VITAL MARKETEERS: UR MOM, MY MOM, UR FACE
-------------------------
sorry couldn't resist. =)
Originally posted by: ayabe
This quote pretty clearly outlines exactly what all the fuss is about, brought to you by some great Italian to English translation:
"We conclude with a reflection: 4-6 months of retard are a lot, who was waiting for a graphic card comparable with the 8800 GTX and Ultra, now knows that has to wait more. Ok, we're talking of different price bands, but now, retard for retard, it was probably better to wait some more and have a top product, in line with the concurrent?"
So basically, retard for retard, we got hosed.
Originally posted by: Extelleron
Originally posted by: Matt2
LOL.
You're saying that the HD 2900 XT stomps the 8800 GTX in a bad port? Good one.
This is the same game in which the 8800 GTS 320MB scores 1 fps less than the GTX at 1600x1200. Looks like pretty reliable results there. :roll:
How about STALKER?
1920x1200, everything maxed:
8800 GTX - 51 fps average
8800 GTS 640MB - 42 fps average
HD 2900 XT - 35 fps average
Is STALKER a new enough game for you? Perhaps one not based on a crappy port?
Let me guess. "Wait for the magic driver".
:frown:
The Inquirer review, which tested STALKER at 2560x1600 with 16x/16x, showed the 2900 XT ahead of the 8800 GTS. Overclocked, it was not far behind an 8800 GTX at 625/2000.
Rainbow Six may be a "crappy port", but it's based on Unreal Engine 3. If the HD 2900 XT shines in it, it's a good bet that performance in other UE3 titles (some of which are among the most anticipated games of this year) will be good as well.
Originally posted by: Fallen Kell
Originally posted by: Extelleron
Originally posted by: Matt2
LOL.
You're saying that the HD 2900 XT stomps the 8800 GTX in a bad port? Good one.
This is the same game in which the 8800 GTS 320MB scores 1 fps less than the GTX at 1600x1200. Looks like pretty reliable results there. :roll:
How about STALKER?
1920x1200, everything maxed:
8800 GTX - 51 fps average
8800 GTS 640MB - 42 fps average
HD 2900 XT - 35 fps average
Is STALKER a new enough game for you? Perhaps one not based on a crappy port?
Let me guess. "Wait for the magic driver".
:frown:
The Inquirer review, which tested STALKER at 2560x1600 with 16x/16x, showed the 2900 XT ahead of the 8800 GTS. Overclocked, it was not far behind an 8800 GTX at 625/2000.
Rainbow Six may be a "crappy port", but it's based on Unreal Engine 3. If the HD 2900 XT shines in it, it's a good bet that performance in other UE3 titles (some of which are among the most anticipated games of this year) will be good as well.
Too bad that when you overclock the GTS or GTX, the overclocked 2900 XT goes right back to its original place...
Originally posted by: ronnn
Originally posted by: keysplayr2003
Six months or more of pent-up ATI fans gone all postal. I guess we were bound to see a few threads like this. Just didn't expect it from nemesis. They need to grab onto any sliver or shred they can lay their mouse pointers on. I can't blame them; I'd be irritable too, having to wait so long for something like this to happen.
Take it easy, nem, you'll blow something...
In all fairness though, the 2900 XT was meant to compete with the GTS. The 65nm XTX should compete with the GTX. I know, I know. Too late, and it has become moot.
Thank gawd you are here to pimp nvidia by accusing ati fans of going postal. Some things never change. :thumbsup:
Right now the 2900 XT looks better in most ways than the 640MB GTS, but it is loud and uses more power. Driver improvements will likely make the 2900 XT look even better, but that won't help power usage. So, as most reviewers say, things are a mixed bag - but not as grim as keys and the NV team would suggest. Still, it is nice to hear they have finally realized that subtle IQ differences do count.
Let's get some DX10 gaming goodness to make these cards work.
Originally posted by: Extelleron
Originally posted by: Wreckage
Originally posted by: Extelleron
Originally posted by: Wreckage
"legitreviews" LOL. Illegit
Rainbow Six with no AA. Clearly this thread is meant as a joke or something.
I can find reviews that show the GTS 320, a $260 card, beating the 2900; maybe there should be a thread about that?
You can't run RB6:V with AA, genius. It doesn't work on any card.
Exactly the point. The 2900 takes a beating when AA is used in other games, so the lack of it here actually helps it.
However, Rainbow Six is a crappy console port. That is no indication of how a UE3-engine game written for the PC will work (except in your rapidly shrinking world).
That happens in CERTAIN games, due to a driver bug when enabling AA that has been widely documented in several reviews. It will be fixed.
That's right, R600 doesn't have hardware dedicated to resolving MSAA in the render back end - the only MSAA-related tasks handled in the render back end are compression and evaluation of the subpixels. All antialiasing resolve is performed on the shader hardware. Certainly, AMD would prefer we start by telling you about the neat custom resolve filters that can be implemented on their shader hardware, but we would rather speculate about this for a moment first.
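To put the quoted passage in concrete terms: a "resolve" simply collapses the N sub-samples stored for each pixel into one final color. Below is a minimal CPU-side sketch in C++ (purely illustrative; the buffer layout and names are my own assumptions, not AMD's hardware or driver code) of the simplest possible resolve, a box filter that averages the sub-samples. This is the kind of work R600 performs on its shader ALUs rather than in fixed-function render-back-end hardware, and a "custom resolve filter" would just replace the equal weights with something fancier.

```cpp
// Illustrative software sketch of a "shader-style" MSAA resolve.
// Buffer layout and names are hypothetical, not AMD's implementation.
#include <vector>

struct RGBA {
    float r, g, b, a;
};

// msaaBuffer stores `samples` sub-samples per pixel, contiguously:
// index = (y * width + x) * samples + s.
std::vector<RGBA> resolveBoxFilter(const std::vector<RGBA>& msaaBuffer,
                                   int width, int height, int samples)
{
    std::vector<RGBA> resolved(static_cast<size_t>(width) * height);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            RGBA sum{0.0f, 0.0f, 0.0f, 0.0f};
            for (int s = 0; s < samples; ++s) {
                const RGBA& sub =
                    msaaBuffer[(static_cast<size_t>(y) * width + x) * samples + s];
                sum.r += sub.r; sum.g += sub.g; sum.b += sub.b; sum.a += sub.a;
            }
            // Box filter: every sub-sample gets equal weight. A custom
            // resolve filter would use non-uniform weights or pull in
            // sub-samples from neighboring pixels instead.
            const float inv = 1.0f / static_cast<float>(samples);
            resolved[static_cast<size_t>(y) * width + x] =
                {sum.r * inv, sum.g * inv, sum.b * inv, sum.a * inv};
        }
    }
    return resolved;
}
```

Even this trivial averaging consumes shader cycles on R600 instead of running in dedicated hardware, which is one plausible contributor to the AA performance hit being argued about above.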