SLI 7800GT to X1900XT Transition

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
From what I've seen of Anandtech's tests, my 7800GT SLI setup always got similar performance to an X1900XTX (and, according to many benchmarks, even better depending on the game). I was getting great performance and image quality in most games, and I figured I'd wait for DX10 before getting another card. But after seeing some 3DMark scores from X1900XTX users, I wondered what it would be like to play games with ATI. My curiosity got the better of me and I picked up a Sapphire X1900XTX. The following is what I've experienced so far in the transition. (Note: the switch was done properly, i.e. uninstalling the previous drivers, rebooting, running Driver Cleaner Pro, rebooting, installing the new drivers, rebooting, then running Driver Cleaner Pro again to nuke any old driver leftovers, and rebooting once more.)

Okay, I just got the X1900XTX today, and so far I've played FEAR, BF2, and Far Cry, and I'm now attempting Oblivion. There are a couple of things I'm confused about. Through the ATI Catalyst panel I set AA to 4x -- just plain 4x, no Adaptive AA or anything like that. Then I load up BF2 and it looks horrible; the AA wasn't even turned on. I go back and set it through the BF2 video settings, close the game, reopen it, load again, and it still wasn't on. So I move both the Catalyst setting and the BF2 setting up to 6x, figuring that if Catalyst isn't controlling the application, the application setting would kick in instead. Loaded it and it still looked bad -- AA wasn't on. So I set AA to application controlled in Catalyst and loaded it again. That time it worked. SWEET gameplay and an amazing image boost (especially with the grass), with vsync on and no problems, unlike my SLI setup. I haven't even enabled triple buffering and I don't think I'll need to; FPS stays very close to the 60 cap. SLI caused some problems with vsync enabled since it's synchronizing two GPUs with the monitor, but I couldn't get DXTweaker to enable triple buffering on my SLI setup, so I'm not going to closely compare the vsync behavior between the two. (Still, it's much nicer not having to worry about enabling it.) One card is sort of comforting -- fewer things that can go wrong. The fact that Catalyst couldn't force AA on its own still bothers me, though.

Also, in Far Cry it was only slightly noticeable (maybe because I was on a night map), but it proved to be true when I tried to set it in Oblivion: the game says I can't enable HDR and AA together, right in Oblivion's options before I even load the game. I thought ATI's X1900 series could do HDR+AA and Nvidia couldn't, except in Source engine games.

FEAR is f'n' sweet. Great performance and quality, and I was even skeptical about how it would play given Anandtech's benchmarks between the cards in this game. It's still awesome, if not better.

That's it so far, but I'm completely new to the ATI scene and learning as I go. ANY help, hints, etc. would be appreciated (especially with the HDR+AA issue).
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: josh6079
Also, in Far Cry it was only slightly noticeable (maybe because I was on a night map), but it proved to be true when I tried to set it in Oblivion: the game says I can't enable HDR and AA together, right in Oblivion's options before I even load the game. I thought ATI's X1900 series could do HDR+AA and Nvidia couldn't, except in Source engine games.

There's some issue with the way Oblivion does its HDR that prevents AA from working alongside it. I have yet to see a good technical explanation of why it doesn't work, but according to many reports (and the developer), it definitely doesn't.

It should work in Far Cry, though, although I think you need the latest patch.
 

Elfear

Diamond Member
May 30, 2004
7,164
821
126
Originally posted by: josh6079
From what I've seen of Anandtech's tests, my 7800GT SLI setup always got similar performance to an X1900XTX (and, according to many benchmarks, even better depending on the game). I was getting great performance and image quality in most games, and I figured I'd wait for DX10 before getting another card. But after seeing some 3DMark scores from X1900XTX users, I wondered what it would be like to play games with ATI. My curiosity got the better of me and I picked up a Sapphire X1900XTX. The following is what I've experienced so far in the transition. (Note: the switch was done properly, i.e. uninstalling the previous drivers, rebooting, running Driver Cleaner Pro, rebooting, installing the new drivers, rebooting, then running Driver Cleaner Pro again to nuke any old driver leftovers, and rebooting once more.)

Okay, I just got the X1900XTX today, and so far I've played FEAR, BF2, and Far Cry, and I'm now attempting Oblivion. There are a couple of things I'm confused about. Through the ATI Catalyst panel I set AA to 4x -- just plain 4x, no Adaptive AA or anything like that. Then I load up BF2 and it looks horrible; the AA wasn't even turned on. I go back and set it through the BF2 video settings, close the game, reopen it, load again, and it still wasn't on. So I move both the Catalyst setting and the BF2 setting up to 6x, figuring that if Catalyst isn't controlling the application, the application setting would kick in instead. Loaded it and it still looked bad -- AA wasn't on. So I set AA to application controlled in Catalyst and loaded it again. That time it worked. SWEET gameplay and an amazing image boost (especially with the grass), with vsync on and no problems, unlike my SLI setup. I haven't even enabled triple buffering and I don't think I'll need to; FPS stays very close to the 60 cap. SLI caused some problems with vsync enabled since it's synchronizing two GPUs with the monitor, but I couldn't get DXTweaker to enable triple buffering on my SLI setup, so I'm not going to closely compare the vsync behavior between the two. (Still, it's much nicer not having to worry about enabling it.) One card is sort of comforting -- fewer things that can go wrong. The fact that Catalyst couldn't force AA on its own still bothers me, though.

Also, in Far Cry it was only slightly noticeable (maybe because I was on a night map), but it proved to be true when I tried to set it in Oblivion: the game says I can't enable HDR and AA together, right in Oblivion's options before I even load the game. I thought ATI's X1900 series could do HDR+AA and Nvidia couldn't, except in Source engine games.

FEAR is f'n' sweet. Great performance and quality, and I was even skeptical about how it would play given Anandtech's benchmarks between the cards in this game. It's still awesome, if not better.

That's it so far, but I'm completely new to the ATI scene and learning as I go. ANY help, hints, etc. would be appreciated (especially with the HDR+AA issue).


Sounds like you figured out the BF2 issue. I always set AA to application preference in the ATI control panel unless the game doesn't support it natively.

Unfortunately, even though the X1*** series supports HDR + AA, Bethesda (the makers of Oblivion) decided not to include that option in the game. Hopefully a patch will enable that feature in the future.

If you want to enable HDR + AA in Far Cry you have to download the special patch and then use a couple of console commands. Just do a search for "Far Cry HDR patch" if you haven't already installed it.
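
From memory, once the patch is installed you just open the console and turn HDR on with something like the line below, with AA already forced or set in the game's video options. I might be off on the exact cvar name or level, so double-check the patch readme:

\r_HDRRendering 7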

I personally don't like using CCC, so I recommend ATI Tray Tools. Just download the latest version here and remember to check the "Disable Overclocking" option during the install process. I also recommend using ATITool Beta 14, found here. You can set fan speeds and overclock with it if you so desire.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Thanks for the info so far guys.

As for the Far Cry patch, I have it already. I'm going to try a day map and see if it still does it. I just noticed that when I turned on HDR through the console with AA already running, the edge of a mountain looked choppier. Like I said, I'll try it on a different map and see.

As for Catalyst not overriding the application like it's supposed to, does that happen to everyone or what? AF isn't like that, is it? Is it just BF2?

Thanks again for the utility info, Elfear. Any other tips like that or overclocking suggestions would be greatly appreciated.

 

Golgatha

Lifer
Jul 18, 2003
12,396
1,068
126
I recently went from a 7800GT to an X1800XT 512MB myself. The difference in performance between the two is night and day when AA and AF get maxed out. I can now go up a resolution and enable at least 2x more AA than I was using, with smoother gameplay to boot. HDR seems to run much better in Source-based games and IQ seems a bit better, although that perception could also come from the higher resolutions and AA.

However, I tried out a 6800GT AGP before moving to a PCIe system, and I went back to an X850XT when I did the changeover. The reason is that I was playing Riddick: EFBB at the time, and even though the FPS seemed better, the game just looked more jagged/rigid IMO at the same resolution, AA, and AF. I felt the same way about BF2.

That said, I used and loved my 7800GT, because it was far and away faster in Doom 3-based games and I could actually run all my games at high enough resolution, AA, and AF at playable FPS to get rid of any kind of jagged or rigid look. The resolutions, AF, and AA I can run with the X1800XT make games much more immersive, and I think the IQ is top notch.

The only things that annoyed me about switching to ATI are the now-mandatory CCC (nVidia drivers are sooo much more straightforward and easy to use) and the fact that the ATI drivers treat my CRT like a TV, defaulting to 60Hz until you manually set the resolution and refresh rate higher in the driver panel. Overclocking ATI boards also seems like alchemy (disable service(s) xyz, raise the voltage a little more than you should because the third-party beta tool undervolts, don't open CCC, stand on one foot, and add eye of newt for good measure) compared to the ease of overclocking nVidia boards (Coolbits, move sliders, test with the ATITool fuzzy cube [ironically], done), which is really annoying. If I had more money and a Crossfire motherboard already installed, I would probably be miffed about X1800XT Crossfire board prices and availability too.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Yeah, I've noticed that HDR without AA on ATI still looks better than HDR without AA on Nvidia. Any image, for that matter, looks better on ATI than on Nvidia imho. Plus, the performance hit between the two companies' cards is very different; I can enable more things and still play at the same (if not better) fps. So far I'm very impressed with ATI and this X1900XTX. But like you said, the driver issues are a turn-off; it's going to take me a while to see if they're really issues, since I'm new to ATI. Everything else is awesome though, especially for only being one GPU. I still need to figure out the best way to overclock. I'm so used to Nvidia that it will take a little bit to learn.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Force HQAF 16x in the CCC and leave the AA to application controlled. This has worked flawlessly for me in every game.

If you're going to replay Far Cry with the HDR+AA, I highly recommend you add the 64-bit texture pack. The game will look great.

Oblivion looks much better with HDR than with bloom + AA. At 1920x1200 I haven't noticed any need for AA, which is quite the opposite of BF2, which requires a fair amount of AA regardless of the resolution. There is probably a 0% chance of an HDR+AA patch for Oblivion. I hope to be proven wrong though!

As for the IQ, my X1900XTX seems to display a crisper picture with deeper colors than my friend's 7800GTX on the exact same monitor.

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Two things...

1) IMO, the CCC is fine (and easy) for casual OC'ers. Just use the sliders... Now, if you are a bit more into it and are using something other than air, you might need some third-party apps. For me, hitting 690/800 with my X1800XT was as easy as setting the sliders all the way up and forgetting about it. The .NET requirement sucks, but it isn't the end of the world.

2) I think the reason everyone says that everything looks better on ATI cards is simply that ATI's colors are warmer by default than NVIDIA's. Aside from the angle-independent AF, there isn't any reason why ATI's IQ should look better than NV's... The warmer colors just seem to be more pleasing to most people.

Force HQAF 16x in the CCC and leave the AA to application controlled. This has worked flawlessly for me in every game.

Can you use AAA this way or are you stuck with application controlled AA only?
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
What does AAA even do? On my Nvidia setup I had a gamma correction setting that corrects the gamma on AA edges, blending them closer together for a smoother image. Is that pretty much what AAA is supposed to do?

If you're going to replay Far Cry with the HDR+AA, I highly recommend you add the 64-bit texture pack. The game will look great.

Is that the XP Pro x64 edition patch? Will it work on regular 32-bit XP?

Also, is CCC the regular Catalyst driver panel for the ATI card? I suspect it is, but I'm unfamiliar with the terms.

Can you apply Temporal AA and Adaptive AA at the same time? I haven't messed with that yet, and if you can, should you use both at once?
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: josh6079
What does AAA even do? On my Nvidia setup I had a gamma correction setting that corrects the gamma on AA edges, blending them closer together for a smoother image. Is that pretty much what AAA is supposed to do?

If you're going to replay Far Cry with the HDR+AA, I highly recommend you add the 64-bit texture pack. The game will look great.

Is that the XP Pro x64 edition patch? Will it work on regular 32-bit XP?

Also, is CCC the regular Catalyst driver panel for the ATI card? I suspect it is, but I'm unfamiliar with the terms.

Can you apply Temporal AA and Adaptive AA at the same time? I haven't messed with that yet, and if you can, should you use both at once?


CCC = Catalyst Control Center. The interface you go into to tweak things.
Temporal AA = no one uses it, as far as I know
AAA = Adaptive AA, ATI's version of Nvidia's Transparency AA
http://forums.anandtech.com/messageview...atid=31&threadid=1762724&enterthread=y
Read my post(s) in there on how to properly get HDR+AA and the 64-bit content patch working on a 32-bit XP Pro install.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Cool, thanks for the help.

Also, I got the HDR+AA to work on Far Cry. It does take away from the AA effect a little bit, but you can tell that it is still on.

I thought the X1900XTX comes with a memory clock of 1550 MHz. The CCC ATI Overdrive panel shows the "Current Memory Clock" under "Graphics Memory Status" as 594 MHz, and the "Requested" clock as 774 MHz. I think I'm missing something pretty simple, but why can't I go up to 1550 if I want to? Why does the slider top out at 800?

Right now I've only done things through CCC.
 

Paratus

Lifer
Jun 4, 2004
17,583
15,708
146
Effective DDR MHz = actual clock x 2,
so

774 x 2 = 1548 ~ 1550.

The lower number (the 594 MHz "current" clock) is the 2D speed.


posted via Palm Life Drive
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Oh okay thanks.

Thanks, Joker, you've been a lot of help, especially with explaining things.

Everyone else has been a great help too. I am so surprised by the upgrade to this ONE card from my TWO 7800GTs. I'll probably ask some more questions later when my Zalman VF-900 gets here and I really start to overclock.

To sum up most of my experience with this card so far, it's better than 7800GT SLI:
- Tied, and sometimes even better, performance.
- Always better image quality, especially with that HQ AF.
- Fewer things to worry about -- one card instead of two possible problems.
- My chipset actually stays a little cooler now that I don't have two honker cards hanging over it.
- Added features the 7800s didn't have, like HDR+AA.
- Cheaper in the long run to get one of these than to buy two 7800s -- I found that out the hard way.
- Less hassle with vsync, though only because there's one card.
- Smaller performance hits when enabling special features.

Indeed, I'm very impressed. ATI done good.
 

Elfear

Diamond Member
May 30, 2004
7,164
821
126
I was impressed as well moving from two 7800GTs to my current card. They were even clocked at 550/1300.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
I am probably going to make the same video card switch soon, mainly so I can get vsync and TB working in everything. It's nice to hear of a good experience like this. :) (I was actually about to buy one last week but the latest ATI drivers are causing problems in one older game I play a lot, so I'm waiting to see what comes of that)
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: RobertR1
Force HQAF 16x in the CCC and leave the AA to application controlled. This has worked flawlessly for me in every game.

If you're going to replay Far Cry with the HDR+AA, I highly recommend you add the 64-bit texture pack. The game will look great.

Oblivion looks much better with HDR than with bloom + AA. At 1920x1200 I haven't noticed any need for AA, which is quite the opposite of BF2, which requires a fair amount of AA regardless of the resolution. There is probably a 0% chance of an HDR+AA patch for Oblivion. I hope to be proven wrong though!

As for the IQ, my X1900XTX seems to display a crisper picture with deeper colors than my friend's 7800GTX on the exact same monitor.

I second setting AA to app preference and AF to 16x High Quality, ALWAYS enabled.
I made the exact same switch as you and have WAY smoother gameplay with this one card than with SLI.

BTW, set adaptive AA to performance. In my experience every game I run can take it. High Quality adaptive AA is a REAL performance killer. It looks incredible, though.
My BF2 settings are: Ultra High (all 4s in the Video.con), 6x AA (in game), 16x HQ AF (CCC), 1280x960, and Performance adaptive AA. It RARELY dips below 60 fps, and when it does, sometimes it's actually lag from people who live in China :p
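
(For anyone who wants to try the "all 4s" thing: the Video.con is in My Documents\Battlefield 2\Profiles\ under your profile folder. Going purely from memory, the quality lines look something like the ones below -- don't take the exact option names as gospel, just find the Quality entries in your own file and bump the values to 4:

VideoSettings.setTerrainQuality 4
VideoSettings.setGeometryQuality 4
VideoSettings.setLightingQuality 4
VideoSettings.setEffectsQuality 4
VideoSettings.setTextureQuality 4)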

For Oblivion I use 1280x960, 2x AA with Performance adaptive AA, 16x HQ AF, and Bloom. I also use some small dual core tweaks (see below).
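
(The dual core tweaks are just edits to Oblivion.ini in My Documents\My Games\Oblivion, under the [General] section. From memory they're something like the lines below -- the exact names and values are off the top of my head, and people argue about whether some of them actually do anything, so double-check a dual core tweak guide:

bUseThreadedAI=1
iNumHWThreads=2)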

FEAR is the same as Oblivion, and so is COD2. In Far Cry I have a hard time even trying to max out my card ;-)
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: josh6079
Oh okay thanks.

Thanks, Joker, you've been a lot of help, especially with explaining things.

Everyone else has been a great help too. I am so surprised by the upgrade to this ONE card from my TWO 7800GTs. I'll probably ask some more questions later when my Zalman VF-900 gets here and I really start to overclock.

To sum up most of my experience with this card so far, it's better than 7800GT SLI:
- Tied, and sometimes even better, performance.
- Always better image quality, especially with that HQ AF.
- Fewer things to worry about -- one card instead of two possible problems.
- My chipset actually stays a little cooler now that I don't have two honker cards hanging over it.
- Added features the 7800s didn't have, like HDR+AA.
- Cheaper in the long run to get one of these than to buy two 7800s -- I found that out the hard way.
- Less hassle with vsync, though only because there's one card.
- Smaller performance hits when enabling special features.

Indeed, I'm very impressed. ATI done good.



Glad you're having a good experience so far Josh. If you need any more help feel free to PM me.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: josh6079
Oh okay thanks.

Thanks, Joker, you've been a lot of help, especially with explaining things.

Everyone else has been a great help too. I am so surprised by the upgrade to this ONE card from my TWO 7800GTs. I'll probably ask some more questions later when my Zalman VF-900 gets here and I really start to overclock.

To sum up most of my experience with this card so far, it's better than 7800GT SLI:
- Tied, and sometimes even better, performance.
- Always better image quality, especially with that HQ AF.
- Fewer things to worry about -- one card instead of two possible problems.
- My chipset actually stays a little cooler now that I don't have two honker cards hanging over it.
- Added features the 7800s didn't have, like HDR+AA.
- Cheaper in the long run to get one of these than to buy two 7800s -- I found that out the hard way.
- Less hassle with vsync, though only because there's one card.
- Smaller performance hits when enabling special features.

Indeed, I'm very impressed. ATI done good.

Glad to hear this.

I made the switch from two 7800GTXs to a single X1800XT in the past, and I wasn't disappointed. The performance wasn't quite as high, but neither was the cost. Although, at $650, the X1800XT was pretty steep, especially considering that the X1900s came out about a month after I bought it.

I'm actually in the process right now of switching over from dual 7900GTXes (I need to ship them to the buyer today) to a single X1900XT(X). The nice thing about the poor availability of the 7900s is that the resale value is pretty high -- without being a gouger, I only lost about 5% on the cards. I think I'm going to go with either the HIS X1900XT or XTX from Monarch. They both have a $50 rebate, which will make it the cheapest video card I've bought all year. It has taken me a bit of time, a considerable amount of money, and a lot of posts on AT to figure out that high-end dual card setups simply aren't worth it. Yes, they are certainly cool, and they do perform, but damn are they expensive.

Edit: when I say dual high-end cards, I mean the very high end... Dual 7900GTs aren't a bad bet for $750ish, but I still think a single $450ish X1900XT is a better value.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Yeah, I will put mine in when I get home. I really wish I could have found the HIS IceQ3 version of the XTX somewhere, but I wasn't able to find any retailer selling it. The only thing I'm not looking forward to is that stock ATI fan.