From X850XT to 7800GT to X1800XT to 7900GT?

imported_Airjarhead

Senior member
Dec 29, 2004
485
0
0
A brief History:
I have been an ATI fan for a long time.
I have also loathed Nvidia for:
1) putting 3dfx out of business
2) Their inferior IQ (in the past)
3) Their marketing strategies

These are the video cards I've owned since my beloved Voodoo5:
Geforce 3 (for about a week)
KyroII
Radeon 8500 (for about 2 months, before they had good drivers)
Geforce TI-4400
9700Pro (my reason for loving ATI and their AA, which was just as good as 3dfx's)
9800Pro
X800XL
X850XT
7800GT (for 2 weeks just as an experiment)
X1800XT 512MB

I have to admit, the 7800GT really surprised me. Nvidia has really improved their IQ and their AA! I thought BF2 looked MUCH better on the 7800GT than on my X850XT. The 7800's AA in BF2 was head and shoulders above ATI's (especially at long distances). For the most part the 7800GT was faster too. However, the 7800GT experienced a lot more slowdowns in BF2 than the X850XT (especially in smoke).
So, when the 7900GT came out, I researched A LOT to decide between the X1800XT (for $289 for the 256MB or $339 for the 512MB) and one of the 7900GTs. I can't tell you how many times I've had one of every kind in my shopping cart. I'm actually kicking myself with regret for not getting the EVGA CO for $299 at Newegg that I had in my cart on launch day. Anyway, from all of the benchmarks I decided the X1800XT 512MB was the way to go.
I have to say, I'm not impressed at all with the X1800XT. The IQ is exactly the same as the X850XT's, and while it is faster, it doesn't impress me. I play BF2 at 12x9 with 4xAA now vice 10x7 with 4xAA (with the X850XT), but it doesn't look that much better. I really think the 7800GT looked better. I'm also very disappointed with all the crap you have to go through just to play a damn game on the X1800XT. I have ATI Tool to adjust the fan speed (because it doesn't adjust on its own, and the card will overheat if I don't), to OC, and to set the voltage (for some reason this card is downclocked to 1.2V in 2D, and is supposed to go to 1.4V in games when using CCC). Then I use ATI Tray Tools to adjust settings (AA, AF, etc.). The fan literally sounds like a hairdryer on high. What the hell is going on? Is this what things have come to? Am I asking too much by wanting to boot up and just click on a game .exe and play? Don't even get me started on CCC - IT SUXXXXXX! Don't even try to say it's any good at all. It SUX, and ATI should be ashamed that they ever made it and tried to make us use it. It uses up at least 10MB of RAM, and how much disk space (including the POS .NET that's REQUIRED)?

That's my opinion. If I get flamed, I get flamed.


So now, I'm thinking about the 7900GT. The other games I play are NHL06, Lock-On, and Falcon 4 AF - all of which seem to get better performance with Nvidia cards.
I am worried that the 2nd and 3rd rounds of 7900GTs won't overclock as well as the first round (especially the memory).
I've read the threads about people going from Nvidia to ATI, and saying the IQ is better on the ATI, but I just don't see it. Maybe we value different things.
I will say this: AA quality and performance is one of THE most important things to me. I don't even look at benchmarks without AA.


Thanks,

 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Nvidia IQ better than ATI's? WTF. I came from a 7800GTX to an X1900, and let me tell you, you are mistaken. I'll let you have your opinion, though. Take a look at Far Cry screenies (shadow jaggies on Nvidia; better water rendering on ATI; HDR+AA on ATI).

Oh, and who was forcing you to change the voltages and s#it on your X1800? I don't see anywhere in my manual that you have to do that to play different games. Do you even have that option in your ForceWare drivers? Um, nope.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
So your card overheats in games, but it doesn't? You think the AA is better on the 7900GT, which has been shown to be wrong. You think the IQ is better, which has also been shown to be wrong. You think 10MB of RAM is too much, when I bet you, like a complete idiot, have the crazy Windows XP start bar and icon shadows turned on. I am sorry you are disappointed in your purchase, I am sorry you are not smart, I am sorry that you scream like a girl on a forum.
 

imported_Airjarhead

Senior member
Dec 29, 2004
485
0
0
The X1800XT I have runs at 1.2V in 2D and 1.4V in 3D. If I turn off the service that does this for me, I have to do it manually (ATI Tool won't work if you have the service turned on). Also, if I don't adjust the fan speed, it stays too low and the card will overheat in games. So I have to put it in hairdryer mode before I game.
No, I don't have all that XP ****** turned on. I have all shadows, sliding, etc. off and all unnecessary services off. Thanks for jumping to the wrong conclusions, though.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
ATI's fans on their X18xx and X19xx series are crap. Didn't you find the X850XT to be a mini-hairdryer as well? I went from an X850 to a 7800GT, and on both I went to aftermarket cooling. It's not much; $20-30 gets you near-silent operation with better cooling to boot.

For so many of these 'b1tch and moan' threads, I think the OP would be much happier if they just bit the bullet and bought either an Arctic Cooling solution or a Zalman solution for their video card.
 

Elfear

Diamond Member
May 30, 2004
7,164
821
126
Originally posted by: Airjarhead
The X1800XT I have runs at 1.2V in 2D and 1.4V in 3D. If I turn off the service that does this for me, I have to do it manually (ATI Tool won't work if you have the service turned on). Also, if I don't adjust the fan speed, it stays too low and the card will overheat in games. So I have to put it in hairdryer mode before I game.
No, I don't have all that XP ****** turned on. I have all shadows, sliding, etc. off and all unnecessary services off. Thanks for jumping to the wrong conclusions, though.


The only reason you'd have to mess with 2D/3D volts is if you're overclocking (which it looks like you're doing). For normal operation there is no need. I personally consider the voltage adjustments a boon. With Nvidia cards you don't get the option at all.

As far as the fan speed and overheating issue, how hot is your GPU getting? Do you have adequate airflow through your case? Most of the reports from users I've seen say that putting the GPU fan at ~40-50% will keep the GPU cool enough to game with. The exceptions have been users with poor airflow, users who live in Florida (i.e. somewhere very hot), or a few defective cards where the HSF didn't contact the GPU very well.

The X1800XTs do run a tad hotter than other high-end cards, but nothing to be worried about. My 7800GTX ran at ~80-85C under load, as did my X850XT PE.
 

Bull Dog

Golden Member
Aug 29, 2005
1,985
1
81
Originally posted by: Airjarhead
The X1800XT I have runs at 1.2V in 2D and 1.4V in 3D. If I turn off the service that does this for me, I have to do it manually (ATI Tool won't work if you have the service turned on). Also, if I don't adjust the fan speed, it stays too low and the card will overheat in games. So I have to put it in hairdryer mode before I game.
No, I don't have all that XP ****** turned on. I have all shadows, sliding, etc. off and all unnecessary services off. Thanks for jumping to the wrong conclusions, though.

Use the dynamic fan speed control. That way you won't have to manually set it to "hairdryer" mode.
 

russki

Senior member
Nov 7, 2000
640
0
0
I just bought a 7900GT from Newegg, a Biostar LMAO. I outfitted it with a Zalman VF700 and was able to overclock the card to 535/1568 easily. Haven't even tried to go higher. There also exists an easy voltmod for the 7900GT which can boost the clocks past 600MHz. This card is very nice! I used to own a 6800GS and this thing blows it out of the water. I just played about an hour of Half-Life 2 at the above clocks at 1440x900 resolution with 4xAA and everything is as smooth as silk. The previous card did not feel very smooth and I had to turn off AA. The ATI cards, I'm sure, are very good as well, but I couldn't see myself spending an extra $150. By the way, there are only two games that support HDR+AA anyway, and Far Cry is one of them. If you play Far Cry a lot then I would suggest you get the ATI card, but otherwise it's a no-brainer.
 

imported_Airjarhead

Senior member
Dec 29, 2004
485
0
0
Originally posted by: Elfear
The only reason you'd have to mess with 2D/3D volts is if you're overclocking (which it looks like you're doing). For normal operation there is no need. I personally consider the voltage adjustments a boon. With Nvidia cards you don't get the option at all.

As far as the fan speed and overheating issue, how hot is your GPU getting? Do you have adequate airflow through your case? Most of the reports from users I've seen say that putting the GPU fan at ~40-50% will keep the GPU cool enough to game with. The exceptions have been users with poor airflow, users who live in Florida (i.e. somewhere very hot), or a few defective cards where the HSF didn't contact the GPU very well.

The X1800XTs do run a tad hotter than other high-end cards, but nothing to be worried about. My 7800GTX ran at ~80-85C under load, as did my X850XT PE.

I'm sorry, but the X1800XT DOES have two different modes of operation (at least mine does): 2D and 3D. It is clocked at 594/693 @ 1.2V in 2D. Then when you start a 3D app, it is supposed to switch to 3D mode (625/750 at 1.4V). In order to OC with ATI Tool (and change the fan settings), I had to disable the service that does this switch. Trust me, if I didn't have to change the voltages, I wouldn't. I read about it in a bunch of forums after I installed the card and had problems.
My case airflow is excellent. My GPUs never overheated in the past, and my CPU idles in the high 20s.
As for aftermarket cooling, I used an Arctic Silencer 5 on my X850XT, but nothing by Arctic fits on the X1800XT. The Accelero is NOT an option as I have a DFI mobo, and it would blow hot air on my chipset. I have ordered an NV Silencer 5 that can fit both the X1800XT AND the 7900GT with some modification.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Airjarhead

I will say this: AA quality and performance is one of THE most important things to me. I don't even look at benchmarks without AA.

I think I read that NV has slightly better AA, and ATI has better HQ AF. So if you are that sensitive to slight variations in AA quality, then Nvidia is the better card for you. Although for all intents and purposes, once you crank the resolution to 1600x1200, I can hardly imagine you noticing the difference when you play games. In comparisons for BF2, the reviewers often zoom in at 100-200% just to show the difference. With the ATI card you can also crank AA to the 6xAA/8xAA modes, if I am not mistaken; it'll have less of a performance hit than NV's 8xAA mode.

As far as cooling goes, it sucks on ATI cards. But aftermarket fans like the Zalman VF700 tend to be universal for most cards, so it's a good long-term investment.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Yes, if you're OC'ing, then you have to manually adjust the voltages. But with ATI Tool you can create separate 2D and 3D profiles and have the 3D profile load automatically when it's needed, so after the initial setup there's no reason to be adjusting voltages and fan speed every time you want to run a game. Also, there's a BIOS available for the X1800XT that will set it to run at 700MHz, and you can buy an aftermarket cooler to lower the temps and noise. With my X1900XT, I just flashed the XTX BIOS, installed a Zalman VF900, and now it runs cooler and much quieter than with stock cooling, not to mention faster. I don't even do any manual overclocking because it won't go much past XTX speeds, but that's good enough for me.
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
Maybe you're not setting transparency AA on the X1800XT, hence it not looking as good...

No wonder you're surprised by Nvidia's quality years later, when you haven't bought an Nvidia card in so long.
 

imported_Airjarhead

Senior member
Dec 29, 2004
485
0
0
There is no transparency AA for ATI. There is Adaptive AA, and it does a better job of getting most jaggies, but it "crawls" more than the regular AA.
There is no 8xAA for ATI, and it doesn't matter how far you crank up AA in BF2 on an ATI card, it doesn't do as good a job as it does in other games. Maybe it's because you can only adjust AA settings in-game when using an ATI card (it won't work from CCC, Control Panel, or ATT), so you are using Dice's AA? However, if you play Gulf of Oman in single player (just to look around at things) with 4xAA at 12x9 resolution, it looks terrible:
the ground is jagged
the crane in the distance is jagged
Even at 16x12 with 4xAA, it's NOT much better.
As I said in my original post, I think the AA in BF2 is done better with Nvidia than ATI - I didn't say it was that way for all games (like some people seem to think I'm saying). But I did say I think Nvidia HAS improved its AA in all games.

In other games you can change the AA via drivers, and ATI's AA does perform better than Nvidia's.
 

imported_Airjarhead

Senior member
Dec 29, 2004
485
0
0
The more I think about it...
I think the bad AA in BF2 IS due to the fact that we can't enable it in the drivers (at least I have to enable it in-game), so we are using the built-in Dice AA vice ATI's AA. However, with Nvidia cards you can enable AA via the drivers for BF2.
 
Jun 14, 2003
10,442
0
0
While I respect your opinion, I guess you do value other things in IQ... beauty is always in the eye of the beholder, and what looks good to one person may look like a dog's ass to someone else. So all the people saying "You think the IQ is better, which is proven to be wrong" should really shut the hell up.

I mean, granted, NV has made some great strides in AA speed and quality, and transparency AA is an awesome feature to use too, but traditionally ATI is where it's at when it comes to faster AA... I think ATI's and NV's AA quality is on a par, but ATI has a slight lead in speed.

Also, have you tried your X1800's HQ AF setting? I'm not joking when I say the difference is night and day between that and Nvidia's filtering quality... it really is a nice feature.

Adaptive AA on ATI is very much the same deal as transparency AA: http://www.bit-tech.net/hardware/2006/0...nsparency_adaptive_aa_explained/1.html

Read that; it might clear some things up.


But I think the crux of your problem was buying an X1800. For me the card just wasn't anything special at all, and I don't think there's much longevity in it. It was late, only just as fast as the competition, very hot, noisy, and in short supply... just when things picked up, it got superseded by the X1900.

What you should have bought was the X1900... that was the card ATI should have released instead of the X1800.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Zstream
So your card overheats in games, but it doesn't? You think the AA is better on the 7900GT, which has been shown to be wrong. You think the IQ is better, which has also been shown to be wrong. You think 10MB of RAM is too much, when I bet you, like a complete idiot, have the crazy Windows XP start bar and icon shadows turned on. I am sorry you are disappointed in your purchase, I am sorry you are not smart, I am sorry that you scream like a girl on a forum.


10MB is just the CCC thing; there's all the other crap like .NET that has to be running as well... so the total memory usage is going to be a lot higher.

Granted, DDR is cheap as chips, and there's really no reason a gaming enthusiast shouldn't have 2GB in his machine.
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
Oh yeah, don't choose a horrible game to judge image quality... ATI is noticeably worse in BF2 (just look at the shadows).

Try playing a hardcore game (NOT Battlefield 2) to test IQ.
 

slatr

Senior member
May 28, 2001
957
2
81
Both companies make good cards, but you made a good point:

Lock-On and Falcon 4 AF.

Flight sims, particularly these, do better with Nvidia cards for some reason.

Also, I use some professional apps that require OpenGL, and Nvidia has a better OpenGL implementation.

 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: Airjarhead
The more I think about it...
I think the bad AA in BF2 IS due to the fact that we can't enable it in drivers (at least I have to enable it in-game), so we are using the built in Dice AA vice ATI's AA. However with Nvidia cards you can enable AA via drivers for BF2.

The reason you cannot set it in CCC is that you have Cat AI enabled. Please disable Catalyst AI and tell us the results.

Open up CCC, go under 3D, and disable Cat AI. This is the cause; BF2 does not accept any AA or AF settings in the game if you have it on.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Zstream
Originally posted by: Airjarhead
The more I think about it...
I think the bad AA in BF2 IS due to the fact that we can't enable it in the drivers (at least I have to enable it in-game), so we are using the built-in Dice AA vice ATI's AA. However, with Nvidia cards you can enable AA via the drivers for BF2.

The reason you cannot set it in CCC is that you have Cat AI enabled. Please disable Catalyst AI and tell us the results.


Yes, you should be able to set AA and AF levels in the drivers at any time, so no matter what is set in the game, it'll do what you told it to in the drivers.

Also... it's not Dice AA. All that happens when you set AA in the game is that the game tells the graphics driver to apply X amount of AA, and the driver tells the card.

All you're doing by setting it straight in the drivers from the start is skipping that first stage, since the driver will not be listening to what the game says.

Though I found something peculiar a while back: setting AA in-game (CS:S stress test) was actually about 8fps slower than setting the same AA in the control panel.
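
(For anyone curious what that "game tells the driver" step looks like, here is a minimal, hypothetical Direct3D 9 sketch, since BF2-era games use D3D9. The helper name is made up for illustration, and a forced control-panel setting can still override whatever the game requests here.)

#include <d3d9.h>

// Hypothetical helper: request 4x MSAA the way a game's "AA: 4x" menu option does.
IDirect3DDevice9* CreateDeviceWith4xAA(IDirect3D9* d3d, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed               = TRUE;
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;     // multisampling requires the DISCARD swap effect
    pp.BackBufferFormat       = D3DFMT_X8R8G8B8;
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;
    pp.MultiSampleType        = D3DMULTISAMPLE_4_SAMPLES;  // the in-game "4xAA" request, passed to the driver

    // Ask the driver whether 4x MSAA is supported for this format;
    // fall back to no AA if it is not.
    if (FAILED(d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                               pp.BackBufferFormat, pp.Windowed,
                                               D3DMULTISAMPLE_4_SAMPLES, NULL)))
        pp.MultiSampleType = D3DMULTISAMPLE_NONE;

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}

So "in-game AA" and "driver AA" end up at the same hardware; the control panel just forces the setting without the game asking for it.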
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Yes, that is how it should work, but in BF2 it does not work that way with Cat AI on. I have no idea why, but I turn it off and make profiles in ATI Tool.
 

Elfear

Diamond Member
May 30, 2004
7,164
821
126
Originally posted by: Airjarhead

I'm sorry, but the X1800XT DOES have two different modes of operation (at least mine does): 2D and 3D. It is clocked at 594/693 @ 1.2V in 2D. Then when you start a 3D app, it is supposed to switch to 3D mode (625/750 at 1.4V). In order to OC with ATI Tool (and change the fan settings), I had to disable the service that does this switch. Trust me, if I didn't have to change the voltages, I wouldn't. I read about it in a bunch of forums after I installed the card and had problems.
My case airflow is excellent. My GPUs never overheated in the past, and my CPU idles in the high 20s.
As for aftermarket cooling, I used an Arctic Silencer 5 on my X850XT, but nothing by Arctic fits on the X1800XT. The Accelero is NOT an option as I have a DFI mobo, and it would blow hot air on my chipset. I have ordered an NV Silencer 5 that can fit both the X1800XT AND the 7900GT with some modification.

I think you're missing my point. I realize that if you want to OC with ATI Tool you have to disable the ATI Smart and ATI Hotkey services (which effectively disables the 2D/3D clock speed switch). Your OP made it sound like you had to mess with voltages, no matter what, just to play games. My point was that even though it takes a little more effort initially to OC your ATI card vs. an Nvidia card, be grateful you have the option to adjust voltages through software, which Nvidia cards do not.

Like Munky said, just create a profile in ATI Tool for the voltages and clock speeds you want, and then you don't have to mess with it every time you want to game.

You still haven't answered the question about GPU temps. How hot is that thing getting?
 

imported_Airjarhead

Senior member
Dec 29, 2004
485
0
0
Thanks guys, the last few posts have been very useful.
Yes, I agree that it's not just CCC using resources; .NET does too. That's why I said at least 10MB.
I don't have CCC installed anymore, so I can't enable HQ AF. I just noticed ATI Tray Tools does have that option, so I'll try it.
I do have Catalyst AI off, and I still can't seem to get AA to work through the drivers.
 

imported_Airjarhead

Senior member
Dec 29, 2004
485
0
0
Originally posted by: Zstream
The reason you cannot set it in CCC is that you have Cat AI enabled. Please disable Catalyst AI and tell us the results.

Open up CCC, go under 3D, and disable Cat AI. This is the cause; BF2 does not accept any AA or AF settings in the game if you have it on.


Originally posted by: Zstream
Yes, that is how it should work, but in BF2 it does not work that way with Cat AI on. I have no idea why, but I turn it off and make profiles in ATI Tool.

You SOB! Thanks!
I made sure AI was disabled, then tried enabling AA in ATT, but it didn't work.
THEN...
I deleted the cached shader folder in the BF2 folder in My Documents and rebooted...
Then it worked!
And guess what? AA was MUCH better!
I'm so glad I posted here - it was worth being called stupid. ;)