- Mar 25, 2010
- 6,604
- 561
- 126
I know this subject has been discussed in other threads, and I know about the recent changes to the Catalyst AI settings, but I'm a tad confused and just want some clarification.
For starters, are there any reviews comparing the Cat 10.10 drivers against older drivers?
I recently upgraded from 10.3 (possibly 10.4) to 10.10e and noticed a performance drop in Batman: AA. I haven't checked other games yet, but seeing a drop in that one was enough to raise a red flag.
I just played through Batman: AA with all settings maxed, 8xAA, at 1920x1080. I ran the benchmark and scored a minimum of 59 and a maximum of 61, with an overall 60 (using the D3D override for V-sync and triple buffering).
Last night I installed the new 10.10, set Cat AI to High Quality, and ran the benchmark. I didn't change anything except the resolution (since I was running my monitor and not my big screen), dropping it to 1680x1050, and to my surprise my minimum fell to 38. I was like "WTF?" considering I was using a lower resolution.
So I changed Cat AI to Quality and my minimum was now 48. Changed Cat AI to Performance and my minimum was 49.
Before, my minimum was 59. Is there a known performance decrease with Cat 10.10? And from my own experience, there is a noticeable IQ difference between Quality and High Quality.
EDIT:
If it matters:
PC
ASROCK X58 Extreme
Intel Core i7 930 stock
GA Radeon HD 5870 2GB Eye6
3x2 GB G.Skill Black Pi series DDR3 1600 MHz
Corsair 750W