Elfear
Diamond Member
Originally posted by: Crusader
If you had minimum framerate numbers in foliage to back that up, Ackmed, I'd believe you and drop it. No harm will be done to anyone's ego. Unless yours lies in the balance? Mine does not.
Is that too much to ask of you? Proof and evidence DO sway people, Ackmed. But you don't have it. Just this wild-eyed assertion that it "works for you" magically, while all credible sources not wholly devoted to ATI say otherwise.
I'm not trying to say HDR+AA is a bad thing, I'm glad for any tech that advances the gaming industry.
I'm only trying to say that given the lack of games and performance at high res it has little use at this point in time.
Someday HDR+AA may be a more important issue, but by then X1900s will likely be fairly slow cards.
I'll take you up on that challenge Crusader. Not because I think Ackmed can't do it on his own, but because I'm actually curious what kind of performance I can get at 1920x1200 with HDR+AA. I should have some numbers ready to post by the end of the week.
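For anyone who wants to sanity-check the numbers I post, here's roughly how I plan to boil a frametime log down to average and minimum FPS. This is just a quick sketch assuming a FRAPS-style log with one cumulative per-frame timestamp in milliseconds; the filename and column layout are placeholders, so adjust it for whatever capture tool you use.

import csv

def fps_stats(path):
    # Read one cumulative timestamp (in ms) per frame; assumes it is the last column of each row.
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times_ms = [float(row[-1]) for row in reader if row]

    # Average FPS over the whole run (N frames = N-1 frame intervals).
    total_seconds = (times_ms[-1] - times_ms[0]) / 1000.0
    avg_fps = (len(times_ms) - 1) / total_seconds

    # Count frames in each one-second window; the smallest count is a
    # reasonable stand-in for the minimum framerate during the run.
    buckets = {}
    for t in times_ms:
        sec = int((t - times_ms[0]) // 1000)
        buckets[sec] = buckets.get(sec, 0) + 1
    if len(buckets) > 1:
        buckets.pop(max(buckets))  # drop the trailing partial second
    min_fps = min(buckets.values())

    return avg_fps, min_fps

if __name__ == "__main__":
    # "frametimes.csv" is a placeholder name, not a file any particular tool is guaranteed to write.
    avg, minimum = fps_stats("frametimes.csv")
    print("Average: %.1f fps, Minimum (1s window): %d fps" % (avg, minimum))

The one-second bucket is just there so a single slow frame doesn't get reported as a framerate dip; tweak the window however you like.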
As to your assertions:
"most power users that have rigs capable of HDR+AA, and also are even AWARE of the fact that ATI has hacked the game frequently use 24inch widescreen LCDs"
"High resolution, where the power users tend to game at on their 24inch LCDs (1920x1200) are where the numbers truley either make or break this HDR+AA Oblivion hack-job."
I think we should do a poll and see what resolution people with X1*** series cards use. I'd be willing to bet that 1280x1024 beats out 1920x1200 10:1. Certainly the ability to do HDR+AA at 1600x1200 and higher is impressive, but it's not exactly mandatory when most people have a native res of 1280x1024. Your argument doesn't hold much water when Nvidia cards can't even do HDR at 1600x1200 and above and stay playable (according to your standards).