imported_spank
Originally posted by: Crusader
First of all, you guys are the ultimate spin doctors. SNIP
I posted some numbers above.
Originally posted by: Tangerines
Wow, that looks awesome. I'll have to spring for one of those cards soon to replace my aging 6800NU. Which should I get though? The X1800 series or the X1900 series? Advice would be appreciated.
Originally posted by: Crusader
It's clear from ATI's very, very oversaturated HDR in Oblivion that either NV or ATI is not doing HDR completely properly. I'm guessing it's ATI, as it's far, far too washed out and bright to even be considered realistic...
Originally posted by: gersson
The performance drop from AA and HDR in Oblivion is probably because they used AAA. I get a 5 fps drop from using 4xAA in outdoor scenes. No one should be using AAA for this game until next-gen SLI arrives.
Originally posted by: thilan29
I never said there are only NVidia-biased sites, but every time there is something good posted about ATI some fanboy comes in claiming bias.

When a less biased site posts it, then it's good news. If I hear it from Tech Report, Hardocp, Xbit or even Rage3D I will find it more credible.

Originally posted by: thilan29
Did you see me claiming that all sites are biased towards NVidia because the 7950 performed well?? Didn't think so.

Because ALL sites are reporting it performing well, not just one.

Originally posted by: thilan29
You talking about hypocrisy is hilarious when an obviously extremely biased person such as yourself claims the site is biased.

Someone has to counter all this FUD and BS that gets posted.

Originally posted by: thilan29
I think most people here know how you lean.

And how you lean. :thumbsdown:
Originally posted by: spank
Originally posted by: Wreckage
When a less biased site posts it, then it's good news. If I hear it from Tech Report, Hardocp, Xbit or even Rage3D I will find it more credible.
hardocp also thinks that hdr+aa is playable, link
Originally posted by: Wreckage
2XAA plus crossfire. :roll:
I would hope at the very least that should work.
Originally posted by: Crusader
I'll give credit where it's due if ATI could push 19x12 in the intensive areas of Oblivion like foliage, but they cannot, even with Crossfire.
Let alone with a reasonable purchase like a single XT or XTX.
Originally posted by: spank
Originally posted by: Wreckage
2XAA plus crossfire. :roll:
I would hope at the very least that should work.

Crusader and Wreckage are in the same category : ) both of them want the roll for Rollo.
Sorry, mixed you and Crusader up
Originally posted by: Crusader
I personally prefer games the way the developer intended, without ATI's invasive measures that they take in an attempt to one-up Nvidia.
That's really a non-issue to me though, as maybe someone likes burning out their corneas?
Bottom line is that I'll let Bethesda decide how Oblivion should look and work.
Originally posted by: Wreckage
Because ALL sites are reporting it performing well, not just one.

And if more sites tested HDR+AA, I'm willing to bet they would come to the same conclusion as FS, much like several members of this forum have. Testing of HDR+AA will of course be limited, for reasons I think you can imagine.
Originally posted by: Wreckage
And how you lean. :thumbsdown:
I think most people here know how you lean.
Originally posted by: akugami
Originally posted by: Tangerines
Wow, that looks awesome. I'll have to spring for one of those cards soon to replace my aging 6800NU. Which should I get though? The X1800 series or the X1900 series? Advice would be appreciated.
Just make sure your PSU can support it. One of the negatives of the X1k series from ATI is that they are power hungry like a mofo. It's a very good bang-for-the-buck card. If you want to go lower than this, try to get an X1800XT 512MB or, barring that, a 7900GT.
Originally posted by: Tangerines
Originally posted by: akugami
Originally posted by: Tangerines
Wow, that looks awesome. I'll have to spring for one of those cards soon to replace my aging 6800NU. Which should I get though? The X1800 series or the X1900 series? Advice would be appreciated.
Just make sure your PSU can support it. One of the negatives of the X1k series from ATI is that they are power hungry like a mofo. It's a very good bang-for-the-buck card. If you want to go lower than this, try to get an X1800XT 512MB or, barring that, a 7900GT.
Alright, I have a 500W PSU, so that should be adequate. I'll probably go for the X1800XT 512. Thanks!
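For anyone wondering whether a 500W unit really has the headroom for an X1800XT build, here is a rough back-of-the-envelope sketch. All of the wattage figures below are assumed ballpark estimates for a typical 2006-era system, not measured draws for any specific rig:

```python
# Rough PSU headroom check for an X1800XT 512MB build.
# Every figure below is an assumed estimate, not a measurement.
estimated_load_watts = {
    "cpu": 110,              # assumed mid-range CPU under load
    "gpu_x1800xt": 115,      # assumed load draw for an X1800XT 512MB
    "motherboard_ram": 50,   # assumed board, chipset and memory
    "drives_fans_misc": 50,  # assumed drives, fans, peripherals
}

psu_rating = 500            # advertised PSU wattage
usable = psu_rating * 0.8   # assume only ~80% of the label is safely usable

total = sum(estimated_load_watts.values())
print(f"Estimated system load: {total} W")
print(f"Usable PSU capacity:   {usable:.0f} W")
print("Headroom OK" if total < usable else "Cutting it close")
```

Even with those conservative estimates the total lands around 325W, comfortably under the usable capacity of a decent 500W unit, which is why the advice above amounts to "check your PSU" rather than "replace it."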
Originally posted by: Crusader
I personally prefer games the way the developer intended, without ATI's invasive measures that they take in an attempt to one-up Nvidia.

Do you also play Oblivion with the ugly-ass distance textures? If the devs intended me to pay $50 for a game and have it look like N64 gfx on a $500 video card, then I've got a place to shove their intentions.

Originally posted by: Crusader
This is the sort of business that NV also did during the FX days. To be fair, it's nothing short of Quack 3 either, for those who remember that. Or ATI trylinear.

No, the Chuck patch increases IQ. The FX cheats decreased IQ; that's not even remotely similar.

Originally posted by: Crusader
It's clear from ATI's very, very oversaturated HDR in Oblivion that either NV or ATI is not doing HDR completely properly. I'm guessing it's ATI, as it's far, far too washed out and bright to even be considered realistic.

Then you'd be guessing wrong. Find me one official review that says ATI HDR is oversaturated compared to NV HDR.

Originally posted by: Crusader
That's really a non-issue to me though, as maybe someone likes burning out their corneas? I don't like ATI's HDR implementation though, with or without AA, from the screenshots that Keys hosted. ATI HDR, with or without AA/AF, is truly not "HDR done right".

That's because you've never seen HDR+AA in action.

Originally posted by: Crusader
Bottom line is that I'll let Bethesda decide how Oblivion should look and work.

And it's not like you have a choice. Other people don't like to settle for less.
Originally posted by: redbox
And maybe some people like poking out their eyes with bad AF? Come on, quit being melodramatic.
Originally posted by: Ackmed
The same people trolling and trying to spread misinformation, I see. Typically, it's the same few people. Jealousy is a bad thing. Calling FS biased with zero proof is pure ignorance.
FS always shows 1280x1024 as the minimum, not 1600x1200. 1280x1024 is still by far the most used res in gaming. While I have not used it in a long time, many still do. They also have never tested 1920x1200 as far as I can remember; in fact, most reviews do not show anything higher than 1600x1200. I get very playable frames at 1920x1200 with HDR+AA, no matter what the trollers say.