imported_Crusader
Senior member
- Feb 12, 2006
Originally posted by: Crusader
No thanks RobertR1, but it's interesting you call men "sweetie" and want them to "sit on your lap".
It's also interesting you played 75 hours of Oblivion at a setting HardOCP deemed "unplayable".
http://enthusiast.hardocp.com/article.html?art=MTA4Myw1LCxoZW50aHVzaWFzdA
Originally posted by: ST
In my testing of an X1900XT, I found a discernible difference when enabling HDR + AA. These numbers were captured with Fraps directly from a set course I used in Oblivion outdoors (10 min run) at 1920x1080 (my LCD TV's native resolution) w/ all image settings maxed:
HDR w/ No AA 8x AF :
ATI Stock 621MHz Core - 720MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
13968, 543034, 2, 57, 25.722
ATI OC 655MHz Core - 792MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
14778, 540610, 16, 58, 27.336
HDR-4X AA 8x HQAF :
ATI Stock 621MHz Core - 720MHz Mem
Frames, Time (ms), Min, Max, Avg
10539, 562581, 0, 36, 18.733
ATI OC 655MHz Core - 792MHz Mem
Frames, Time (ms), Min, Max, Avg
11492, 549567, 12, 45, 20.911
At higher resolutions, you will see quite a detrimental loss in framerate. Note that I'm not knocking the X1900s, as I prefer the superior HDR+AA features, but some folks are really exaggerating the performance delta. I would assume the next gen of vid cards (X1950XT?) w/ the higher clock rates will rectify this though, as evidenced by the OC fps numbers.
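The Avg column in ST's Fraps output is just total frames divided by elapsed time in seconds. A minimal sketch checking that against the numbers posted above (the `avg_fps` helper and run labels are my own, not part of Fraps):

```python
# Verify the Avg FPS column from the Fraps runs quoted above.
# Fraps reports: Frames, Time (ms), Min, Max, Avg.
# Avg FPS = frames / (time_ms / 1000).

def avg_fps(frames: int, time_ms: int) -> float:
    """Average framerate over the whole run, in frames per second."""
    return frames / (time_ms / 1000.0)

# (frames, time_ms) exactly as posted.
runs = {
    "Stock 621/720, HDR no AA": (13968, 543034),
    "OC 655/792,  HDR no AA":   (14778, 540610),
    "Stock 621/720, HDR 4x AA": (10539, 562581),
    "OC 655/792,  HDR 4x AA":   (11492, 549567),
}

for name, (frames, time_ms) in runs.items():
    print(f"{name}: {avg_fps(frames, time_ms):.3f} fps")
```

Running this reproduces the posted averages (25.722, 27.336, 18.733, 20.911), so the Avg figures are internally consistent with the frame and time counts.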
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.
Don't they warn kids with epilepsy about flashing strobe lights like that?