X1900xt vs 7900GTX


fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
OP, the fanboys are going to be debating for a while and it's not going to get you anywhere but confused. You said you play a lot of Oblivion; Oblivion is faster on ATI cards, and you can do HDR+AA on the X1900XT, which provides better IQ. HQAF is just a higher-quality method of AF (texture filtering). AA quality is subjective, but nVidia's Transparency AA does a better job than ATI's (ATI's has a tendency to be a little overzealous). Performance is roughly equal, but the X1900XT takes less of a hit with AA.

Heat and noise are both issues with the X1900XT: it runs hot, so the cooler is loud. I have not heard of any heat problems with the stock cooler running at stock speeds, but if noise is a concern you'll have to buy an aftermarket cooler or go with the GTX.

On drivers, ATI's control panel, Catalyst Control Center (CCC), is bloated and adds time (around 10 seconds) to startup, but CCC can be replaced by ATI Tool Tray, which is much less resource intensive. I don't know that much about nVidia's drivers, but I have heard complaints about the new control panel being both bloated and hard to use.

Here's the $325 X1900XT
EDIT: Sorry OP, didn't catch that you were overseas and couldn't use Newegg.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Ackmed
Once again, you post your opinion as facts in most of these, when it's far from that.
Just because my statements don't follow your bias does not make them false. :roll:

And no, I don't do that; stop lying about it. Read my posts and show me once where I do that.
You just stated in another post that HL2's HDR is "watered down". Pure opinion, as another poster preferred it, and since it's supported by NVIDIA you had to use your "subjective" view to bash it. I personally think it looks better than Far Cry's, but that would be my "opinion".
What I did was refute some claims made by people as facts, when they are not. What I did was tell the OP to look at the games he plays now, and will play, to get a better idea of which is "better" for him. You posted opinionated blanket statements. All pro-NV, nothing negative. Imagine that.
Your posts are no different from mine; yours are all "pro-ATI". Part of a forum is having a different view; I'm not sure why this upsets you sooo much. Try posting in Rage3D's forum, where your slanted view will be better tolerated.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: Wreckage
Originally posted by: Ackmed
Once again, you post your opinion as facts in most of these, when it's far from that.
Just because my statements don't follow your bias does not make them false. :roll:

I never called them false. I simply said they are an opinion, and you presented them as facts. Some are 100% true, some are just opinion. Same for the other guy. He made it a point to show that two of his points were "subjective", when in fact far more were. And yet again, you show only good for NV, and nothing negative.

And no, I don't do that; stop lying about it. Read my posts and show me once where I do that.
You just stated in another post that HL2's HDR is "watered down". Pure opinion, as another poster preferred it, and since it's supported by NVIDIA you had to use your "subjective" view to bash it. I personally think it looks better than Far Cry's, but that would be my "opinion".

It is watered down. It's not at the same level; there are key differences between the two. That is a fact. What is up for discussion is which looks better. I didn't claim that Far Cry's did; I claimed that HL2's is watered down, which is true.

What I did was refute some claims made by people as facts, when they are not. What I did was tell the OP to look at the games he plays now, and will play, to get a better idea of which is "better" for him. You posted opinionated blanket statements. All pro-NV, nothing negative. Imagine that.
Your posts are no different from mine; yours are all "pro-ATI". Part of a forum is having a different view; I'm not sure why this upsets you sooo much. Try posting in Rage3D's forum, where your slanted view will be better tolerated.

I can admit I like ATi better, and do not try to hide it. I don't care if you like NV better; it's not like you use these high-end cards you talk about anyway. I responded to your post because you once again try to show your opinion as fact, when it's far from that. Saying NV has better AA is an opinion. Saying they have better drivers is an opinion. Why is it so hard for you to start out with "In my opinion..." and then follow up with your statements? You bring up what you see as good points for NV, and nothing negative. Then only bad points for ATi, and nothing positive. I don't have anything against you; I just don't understand why you try to pass off your opinion as fact every single time.

Then you try to use a benchmark from Rage3D, and don't even know the facts about the benchmark. ATi was running "HQ"; NV was running at Quality. Not the same thing.

Edit: fierydemise, he is overseas and cannot use Newegg. Apparently the difference between the XT and the GTX where he is comes to $50.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Well thanks to everyone for their opinions and the rather interesting flame war hehe.

So here is my decision.

The X1900XTs are extremely nice, but according to all the user reviews at Newegg they make a lot of noise, especially later in their careers. Even users here have said as much. Plus I can't find one that gives a warranty of more than a year, which sucks for resale.

Also they seem to be poor for overclocking, as they have been released pretty close to their potential.

My experience with the 9800 ruined ATi for me. I loved ATi; I was their biggest fanboy as a teenager. I had been using ATi since the days of the Rage Pro, until I moved to the 7500 and then the magnificent 8500, after which came the 9700 and then finally the 9800.

The 9800 was a nightmare: RMA'ed twice, and the drivers were poor. I had to swap drivers between games like KOTOR and NFS. Like a friend once said, for a card that owned a generation, it didn't feel right.

But on the upside, X1900XTs are fantastic for Oblivion and have a higher minimum frame rate than the 7900s. Plus I was getting it cheaper. The issue came down to warranty, though.

The 7900s all come with lifetime warranties which is awesome for resale. Where I live only NVIDIA is popular, which explains how I sold it for so much.

The dude who said SLI is more mature is correct. A lot of people are still having issues with Crossfire, while SLI is just easier to implement. Plus it helps your card hold its value: there is always bound to be someone looking for the exact same card you have to complete his SLI setup.

Plus I was reading up on the new Conroe mobos, and the budget ones have ditched Crossfire support. They are only going with SLI for those.

So even though I was going to enjoy the X1900XT more, I had decided on the 7900GTX.

Until I went to newegg and read that an above average percentage of buyers were getting artifacts.

Then this scared me:

http://enthusiast.hardocp.com/article.html?art=MTA2OSwxLCxoZW50aHVzaWFzdA==

Yes there is a lifetime warranty, but it is still a hassle.

So I am probably going to get a BFG 7900GTX...

Or...

I am going to get a cheap graphics card just to get a display, and wait a month or less till the G80 comes out with DX10 support.



 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Waiting for G80 might not be a bad idea; it could be a pretty awesome card if it manages the usual 100%-or-so gain over the previous generation. I mean, a GTX with a good OC beats two 6800 Ultras (the previous generation's top card).

Btw, IMO not going with ATI because of some card that came out years ago is a dumb way to make a decision
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Frackal
Waiting for G80 might not be a bad idea; it could be a pretty awesome card if it manages the usual 100%-or-so gain over the previous generation. I mean, a GTX with a good OC beats two 6800 Ultras (the previous generation's top card).

Btw, IMO not going with ATI because of some card that came out years ago is a dumb way to make a decision

It is his decision. This is the issue: when a person gets burned on something, it might be some time before he tries something from that company again.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Sure, sure, but I would go by current realities rather than a single isolated incident from years ago.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Perfectly good logic for your decision, and you'll be happy with the GTX. A few points though:
My experience of the 9800 ruined ATi for me. I loved ATi. I was their biggest fanboy as a teenager. I have been using ATi since the days Rage Pro, until I moved to the 7500 and then the magnificent 8500, after which came the 9700 and then finally the 9800.

The 9800 was a nightmare. RMA'ed twice and the drivers were poor. I had to swap drivers between games like KOTOR and NFS etc. Like a friend once said, for a card that owned a generation, it didn't feel right
ATI drivers (for Windows) are way better than they used to be. The CCC + .NET combo is a hog, but there are many third-party tools to get around it. That being said, the drivers are very stable and the CCC is very functional.
Plus I was reading up on the new conroe mobos, and the budget ones have ditched crossfire support. They are only going for SLI versions with those.
The "mainstream" Intel P965 Express chipset only supports dual physical PCIe x16 slots in an x16/x4 config, not true dual PCIe x16. Neither Crossfire nor SLI will work on P965 boards. Intel's 975X Express chipset will support Crossfire, but not SLI. SLI support will come from the nForce 590 SLI for Intel; it is debatable whether or not that chipset will be any cheaper. I have also heard that there will be an RD600 Crossfire chipset coming from ATI that will support Conroe as well.

Again, not finding any flaw in your conclusion, I just think you should be working from current, correct info. :)
 

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Originally posted by: Ackmed
Who said I'm worked up? It is subjective; the cards use power differently. The XT using more doesn't mean it's not as efficient. If both the GTX and XT had the same "engine", it would. Yet they do not; they are vastly different. What does it really matter anyway? You should be running a PCIe PSU anyway, and it's very unlikely the extra power needed would be a problem.

I said "mature" because you did. "More mature multi-card support" is what you said. And I said that's subjective.

It's not subjective. The cards don't use power differently; they both use it to switch transistors between 1s and 0s. It doesn't matter what "engine" they use; the power still goes to driving transistors. The fact is, in the same system, an XTX will use 60 more watts of power than a GTX at load. It's also a fact that the XTX leaks more power than the GTX, resulting in its much higher heat output and a much, much louder fan. Power-efficiency-wise, the XTX trails the GTX by a mile, much like how the Pentium 4 Prescott cores trailed the Athlon 64.

And again, the maturity of Nvidia's SLI solution isn't opinion; it's fact. There are more game profiles, and thus much higher compatibility with the games currently out. It's been out longer and is more widely used, resulting in more developers integrating support for it within their games. These are all facts, not opinions.

On the other hand, I would have to say the 1900 series is faster at higher settings in the majority of DX games, and adds the ability to play with FSAA and HDR together. You pay the price with heat and noise, as well as warranty/support (no company selling 1900s can come close to the American-based service of EVGA and BFG).
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: nitromullet
Perfectly good logic for your decision, and you'll be happy with the GTX. A few points though:
My experience of the 9800 ruined ATi for me. I loved ATi. I was their biggest fanboy as a teenager. I have been using ATi since the days Rage Pro, until I moved to the 7500 and then the magnificent 8500, after which came the 9700 and then finally the 9800.

The 9800 was a nightmare. RMA'ed twice and the drivers were poor. I had to swap drivers between games like KOTOR and NFS etc. Like a friend once said, for a card that owned a generation, it didn't feel right
ATI drivers (for Windows) are way better than they used to be. The CCC + .NET combo is a hog, but there are many third-party tools to get around it. That being said, the drivers are very stable and the CCC is very functional.
Plus I was reading up on the new conroe mobos, and the budget ones have ditched crossfire support. They are only going for SLI versions with those.
The "mainstream" Intel P965 Express chipset only supports dual physical PCIe x16 slots in an x16/x4 config, not true dual PCIe x16. Neither Crossfire nor SLI will work on P965 boards. Intel's 975X Express chipset will support Crossfire, but not SLI. SLI support will come from the nForce 590 SLI for Intel; it is debatable whether or not that chipset will be any cheaper. I have also heard that there will be an RD600 Crossfire chipset coming from ATI that will support Conroe as well.

Again, not finding any flaw in your conclusion, I just think you should be working from current, correct info. :)

:thumbsup:

I agree with nitro here. If that is what he wanted, good; I'm glad he found what he wanted. There are pros and cons to both sides, but not the same ones that existed in the past, so it is good to come to terms with how things are right now.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Just a couple things I'd like to add. Looks like you've made your choice and I'm sure you'll be happy with it.

Originally posted by: anandtechrocks
but I find that ATI's high heat, excessive power usage, and loud noise to be more of a deciding factor than HDR+AA or "higher quality AF."

Why would "high heat" by itself be a negative, unless you've decided to do ice carvings behind your rig? As others have mentioned, the X1*** series cards exhaust it out the back while the 7900 series exhausts it inside the case.

Excessive power usage might be an issue, but only if you're running a cheap PSU on the ragged edge in the first place, in which case the ATI card would overpower it before the Nvidia counterpart would. Most people who buy a high-end card probably have a decent PSU, so I imagine it would affect a small number of consumers.

As far as extra cost per month, how much do you think an extra 60W will cost you? With the cost of electricity at ~$0.08 per kWh, and assuming the OP games two hours per day (remember, the cards are fairly equal at idle; it's only at load that you'll see the "big" difference), your monthly bill will go up by about $0.29. Do you really think that would be a deciding factor between the two cards?
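That back-of-the-envelope figure is easy to sanity-check. A quick sketch of the arithmetic (the 60W load delta, ~$0.08/kWh rate, 2 hours/day, and a 30-day month are the post's assumptions, not measured values):

```python
# Monthly electricity cost of an extra 60 W at load, per the post's assumptions.
EXTRA_WATTS = 60       # assumed load-power difference between the two cards
RATE_PER_KWH = 0.08    # assumed electricity rate, $/kWh
HOURS_PER_DAY = 2      # assumed daily gaming time
DAYS_PER_MONTH = 30

extra_kwh = EXTRA_WATTS / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH  # 3.6 kWh/month
monthly_cost = extra_kwh * RATE_PER_KWH
print(f"${monthly_cost:.2f}")  # prints $0.29
```

Pennies per month either way, which is the point: the power bill itself is a rounding error next to the price difference between the cards.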

One more thing. Overclocking is always hit or miss, but with nVidia's current line of "cooler" GPUs it's definitely a much better option than ATI.

Most X1900XT/XTX cards I've seen will easily reach 690/800 with ATI's Overdrive. Most will go higher, but that's where most people decide to stop, as it becomes slightly (very slightly, IMO) more complicated without Overdrive. With an aftermarket cooler (which could easily fit into the money you'd save between the two cards), 700-720 is pretty easy to reach on the core. What will the majority of 7900GTXs do?

If you consider anything more than low-level overclocking, then the X1900XT has the 7900GTX beat hands down, IMO. Software voltage adjustments are SOOOOO much easier than hardware voltmods. I say this from experience, having voltmodded 7800GTs and my current card.

Originally posted by: Pugnate

The X1900XTs are extremely nice but according to all the user reviews at newegg, they make a lot of noise especially later in their careers. Even users here have said that as well.

Unless you are extremely sensitive to noise or live somewhere with very high ambient temps, the noise really isn't bad at all, as the card only goes to 100% fan speed for a few seconds at startup and generally doesn't go above 50% during gaming. It is louder than the 7900GTX for sure, but it isn't the 3000dB jet engine right next to your ear that some people make it out to be.

Plus I can't find one that gives a warranty of more than a year, which sucks for resale.

The only situation where this will matter is if you buy an XFX card, as they are the only ones who offer a double lifetime warranty (it covers the life of the original buyer and a second owner). With any other company, your warranty is void the instant you sell the card to someone else, so no one will care how long the warranty is good for.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: supastar1568
Originally posted by: Pugnate
So it's a choice for me between the two. What shall I go for? And for god's sake, no one say 7950GTX!


well, how bout a 7950GX2?


That's what I would go with if I were going nvidia (or wait for G80)

Anyway, the point I brought up is as important if not more important than any other point made so far, IMO.

And that is:


When both cards are using high-quality image settings, a GTX even at 700/1800 cannot compete, even in OpenGL, with an X1900XTX. Other tests have shown that nvidia loses quite a bit at its high-quality settings, whereas ATI defaults to those settings and tends to have better image quality to boot.

That's a big deal, because who buys a top end card to turn down image quality settings?


If a GTX at 700/1800 gets beaten by an average of 20% by an XTX when HQ settings are on for both cards, that's a big deal.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: josh6079
Originally posted by: nitromullet
Perfectly good logic for your decision, and you'll be happy with the GTX. A few points though:
My experience of the 9800 ruined ATi for me. I loved ATi. I was their biggest fanboy as a teenager. I have been using ATi since the days Rage Pro, until I moved to the 7500 and then the magnificent 8500, after which came the 9700 and then finally the 9800.

The 9800 was a nightmare. RMA'ed twice and the drivers were poor. I had to swap drivers between games like KOTOR and NFS etc. Like a friend once said, for a card that owned a generation, it didn't feel right
ATI drivers (for Windows) are way better than they used to be. The CCC + .NET combo is a hog, but there are many third-party tools to get around it. That being said, the drivers are very stable and the CCC is very functional.
Plus I was reading up on the new conroe mobos, and the budget ones have ditched crossfire support. They are only going for SLI versions with those.
The "mainstream" Intel P965 Express chipset only supports dual physical PCIe x16 slots in an x16/x4 config, not true dual PCIe x16. Neither Crossfire nor SLI will work on P965 boards. Intel's 975X Express chipset will support Crossfire, but not SLI. SLI support will come from the nForce 590 SLI for Intel; it is debatable whether or not that chipset will be any cheaper. I have also heard that there will be an RD600 Crossfire chipset coming from ATI that will support Conroe as well.

Again, not finding any flaw in your conclusion, I just think you should be working from current, correct info. :)

:thumbsup:

I agree with nitro here. If that is what he wanted, good; I'm glad he found what he wanted. There are pros and cons to both sides, but not the same ones that existed in the past, so it is good to come to terms with how things are right now.

Thanks for the vote of confidence. I have to make a correction, though... I was erroneous in implying that the 975X Express chipset supports true dual x16 PCIe, when in fact it only supports Crossfire at 8x/8x. This is a bit of a disappointment to me, since currently the only Crossfire/Conroe board I've seen for sale anywhere is the Asus P5W DH, based on 975X, and it's usually around $250. IMO, that is way too high for an 8x/8x PCIe setup these days; I don't care what CPU it supports. So far, the Conroe motherboards aren't very exciting, IMO. I'm sure that will change though.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
It is I again.

Sorry, I think I got incorrect info on the Intel mobos then. And I've slept on it; while not having a graphics card for a month or so is definitely depressing, I can live without gaming till the G80 is released.

Why would "high heat" by itself be a negative unless you've decided to do ice carvings behind your rig. As others have mentioned, the X1*** series cards exhaust it out the back while the 7900 series exhaust it inside the case.

A lot of people have asked a fair question: why let the past bother you so much? Drivers are better now, and heat isn't that big an issue. I guess it would be like going to a foreign country, having one bad experience, and stereotyping the whole place. OK, that would be different. :p

For the record, I never had a problem with ATi drivers till the 9800. I think there were lots of issues then because the Xbox was powered by NVIDIA, and games like KOTOR, Morrowind, Splinter Cell, etc. were all being designed with the green team in mind and not the red. So I had to do a lot of driver swapping.

My experience with the 8500 was excellent. It was faster than the GeForce 4 series and a lot cheaper. All the Nvidia fanboys shouted "drivers!" even though they were excellent.

But in the end, I am still a touch wary of ATi.

The heat was the reason my 9800 would restart so many times. It wasn't just me; my friends had the same issue as well. My brother has the same issue to this day. The ATi cards were pushed to their limits, and that affected performance.

Would you believe I underclocked the card?

Look, I know that things are different. But graphics cards are a big industry now, and I just want to play my F.E.A.R., Oblivion, and Prey without crashing.

For the record, if I were in North America I'd buy ATi. Also, to me ATi definitely has better image quality.

Their anti-aliasing just looks a lot better to me; always has. Though the best was 3dfx's, haha. Plus their DVD software is unparalleled.

But I am going to wait. Hey maybe a period without gaming will be good for me. :)



 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: nitromullet
So far, the Conroe motherboards aren't very exciting IMO. I'm sure that will change though.

It will when DFI comes out with one. :)
 

akshayt

Banned
Feb 13, 2004
2,227
0
0
1900XT. Check the thread in the Video section comparing the 1900XTX to an OC'ed 7900GTX.

Also, performance-wise, compare a 1900XT to a 7900GTX and buy the 1900XT.

Price-wise, compare a 1900XTX to a 7900GTX and buy the 1900XTX or 1900XT.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Originally posted by: Elfear
Why would "high heat" by itself be a negative, unless you've decided to do ice carvings behind your rig? As others have mentioned, the X1*** series cards exhaust it out the back while the 7900 series exhausts it inside the case.

Excessive power usage might be an issue, but only if you're running a cheap PSU on the ragged edge in the first place, in which case the ATI card would overpower it before the Nvidia counterpart would. Most people who buy a high-end card probably have a decent PSU, so I imagine it would affect a small number of consumers.

As far as extra cost per month, how much do you think an extra 60W will cost you? With the cost of electricity at ~$0.08 per kWh, and assuming the OP games two hours per day (remember, the cards are fairly equal at idle; it's only at load that you'll see the "big" difference), your monthly bill will go up by about $0.29. Do you really think that would be a deciding factor between the two cards?

Unless you are extremely sensitive to noise or live somewhere with very high ambient temps, the noise really isn't bad at all, as the card only goes to 100% fan speed for a few seconds at startup and generally doesn't go above 50% during gaming. It is louder than the 7900GTX for sure, but it isn't the 3000dB jet engine right next to your ear that some people make it out to be.

High heat is a negative because that's extra heat in your house. Last I checked, extra heat in your house is bad during summer. 60 extra watts coming out of your computer in one room will make a noticeable difference in the rate of temperature rise. While the extra cost of electricity from usage might be minimal, the electricity cost needed to cool that room back down isn't. Not to mention, with central air it's difficult to cool one room in the house without cooling the others.

As for the noise issue, I would have to say you are in the minority if you consider the noise really not that bad. The thing is insanely loud, and it only gets louder when you overclock. Even at stock load, with the fan at 50%, the card is noisy and definitely audible in most systems, as many reviews have confirmed. It is a jet engine compared to the 7900GTX's cooler, but that's necessary since it puts out so much more heat. Replacing the cooler with an Accelero voids the already-short warranty, something you wouldn't have to do with the 7900GTX.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
One other issue with heat: I live in a pretty hot climate. My 7800GTX used to run at 80C during games.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Sorry, I think I got incorrect info on the Intel mobos then. And I've slept on it; while not having a graphics card for a month or so is definitely depressing, I can live without gaming till the G80 is released.

One other issue with heat: I live in a pretty hot climate. My 7800GTX used to run at 80C during games.

Just so you know, G80 and R600 are rumored to be hot and hungry.
 

ss284

Diamond Member
Oct 9, 1999
3,534
0
0
Originally posted by: Pugnate
One other issue with heat: I live in a pretty hot climate. My 7800GTX used to run at 80C during games.

If your 7800GTX ran at 80C, then the X1900XT is going to run much hotter.
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Just so you know, G80 and R600 are rumored to be hot and hungry.

If it is worth the pleasure, then fine. But I read that's why the G80 wasn't released in July: its power efficiency was being worked on.