From what I've seen in Anandtech's tests, my 7800GT SLI setup was always getting similar performance to an X1900XTX (and according to many benchmarks, even better depending on the game). I was getting great performance and image quality in many games, and I figured I'd wait for DX10 before buying another card. But after seeing some 3DMark scores from X1900XTX users, I wondered what it would be like to play games with ATI. Curiosity got the better of me and I picked up a Sapphire X1900XTX. The following is what I've experienced so far in the transition. (Note: the switch was done properly, e.g. uninstalling the previous drivers, rebooting, Driver Cleaner Pro, rebooting, installing the new drivers, rebooting, then running Driver Cleaner Pro again to nuke any old driver remnants, rebooting.)
Okay, I just got the X1900XTX today, and so far I've played FEAR, BF2, and Far Cry, and am now attempting Oblivion. A couple of things have me confused. Through the ATI Catalyst control panel I set AA to 4x, just plain 4x, no Adaptive AA or anything like that. Then I loaded up BF2 and it looked horrible; AA wasn't even turned on. I went back and set it through BF2's video settings, closed the game, reopened it, loaded again, and it still wasn't on. So I moved both the Catalyst setting and the BF2 setting up to 6x, figuring that even if Catalyst wasn't controlling the application, the application's own setting would kick in. Loaded it and it still looked bad; AA still wasn't on. Finally I set AA to application-controlled in Catalyst and loaded the game, and that time it worked. SWEET gameplay and an amazing image-quality boost (especially with the grass), with vsync on and no problems, unlike my SLI setup. I haven't even enabled triple buffering and I don't think I'll need to; the FPS stays very close to the 60 FPS cap. SLI caused some problems with vsync enabled since it has to synchronize two GPUs with the monitor, but I couldn't get DXTweaker to enable triple buffering on my SLI setup, so I'm not going to compare the vsync behavior between the two setups too closely. (Still, it's much nicer not having to worry about enabling it.) One card is sort of comforting too: fewer things that can go wrong. Still, the fact that Catalyst couldn't force AA on its own has me bothered.
Also, in Far Cry it was only slightly noticeable (maybe because I was on a night map), but it proved true when I tried to set it in Oblivion: the options screen tells me I can't enable HDR and AA at the same time, before I even load the game. I thought ATI's X1900 series could do HDR+AA and Nvidia couldn't, except in Source engine games.
FEAR is f'n' sweet. Great performance and quality; I was even skeptical about how it would play given Anandtech's benchmarks comparing the cards on this game, but it's just as awesome, if not better.
So far that is it, but I am completely new to the ATI scene and am learning as I go. ANY help/hints, etc. would be appreciated (especially with the HDR+AA issue).