thilanliyan
Lifer
- Jun 21, 2005
Well, I can tell you this much... HIS definitely sucks more (way more) than eVGA, which are the two companies I've dealt with. That being said, I think there are more 7900GT hardware failures in general than there are X1900XT(X) failures. Someone should ask nitromullet which company sucks more; he's had bad luck with both ATI and Nvidia and could probably tell ya.
And I have no idea why he was saying it was the best bang for buck, when clearly in that area the X1900XTX Xfire setup is best bang for buck.
That's quite a stretch IMO. Personally, I think that an X1900XT Crossfire rig is probably the overall fastest dual card rig you can build, and is the undisputed 8xAA-and-up king, which makes it the IQ king as well. However, an X1900XT + X1900 CF + Crossfire motherboard isn't really what I would call a good bang for the buck setup. First of all, the two cards are pretty expensive in their own right. Then you have to consider the motherboards, which tend to be pricier than SLI boards with similar features, and have a much smaller selection to choose from. IMO, 7900GT SLI or the 7950GX2 are a better bang for the buck than an X1900XT Crossfire rig. That being said, you get what you pay for...
Originally posted by: Nvidiot
if you go x1900 crossfire you are still going to get subpar performance, and only get HDR+AA in one currently popular game which is Oblivion. And that even requires a 3rd party hack to get working.
Subpar compared to what? For there to be subpar, there has to be par first. Seeing as how there is no card out that can do HDR+AA faster, it would appear that the Radeon sets the standard. Also, based on the 8xAA and 14xAA/16xAA numbers in xbit's review of the 7950GX2 and the absolute unbridled ass beating the Crossfire rig gives NV's finest at higher AA levels (weren't you complaining about ATI's AA earlier this week?), words like "subpar" should probably be used sparingly when describing Crossfire.
Originally posted by: GundamSonicZeroX
Gee, thanx for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."
Originally posted by: GundamSonicZeroX
I wish I knew the release date of that card though!
Originally posted by: Lonyo
Originally posted by: GundamSonicZeroX
Gee, thanx for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."
Make sure you don't buy an EVGA card then. Or BFG or XFX.
Originally posted by: munky
Originally posted by: Nvidiot
the GX2 is a killer deal for bang/buck and the implementation is very slick.
if you go x1900 crossfire you are still going to get subpar performance, and only get HDR+AA in one currently popular game which is Oblivion. And that even requires a 3rd party hack to get working.
If you appreciate Nvidia software support and hardware, I'd get yourself a GX2 and when they enable quadSLI you will have some true stomping power.
You could buy two X1900XTs, a Crossfire motherboard and use that dongle to connect the very loud and noisy, high power consumption video cards and still get beat by 10FPS in FEAR@[url="http://www.legitreviews.com/article/354/8/"]19x12[/url] or just get yourself a GX2 and enjoy life.
They say two GX2s in quadSLI would take less power than X1900 Crossfire!
Waiting for G80 isn't a bad idea either. But the reality on the HDR+AA game situation is harsh, and current hardware besides the GX2 isn't fast enough to push it at high resolutions.
You forgot to mention that, according to you, in the only game that supports HDR+AA, the gx2 still loses to a single XTX. What were you saying again about current hardware and HDR+AA?
And the gx2 is just about as loud, hot, and power-hungry as an XTX. So does that mean you're now gonna bash it too for sounding like a jet engine, melting your desk, and requiring its own power plant, or have you shifted that definition to dual x1900s now that the green team isn't running so cool anymore?
Originally posted by: nitromullet
Subpar compared to what? For there to be subpar, there has to be par first. Seeing as how there is no card out that can do HDR+AA faster, it would appear that the Radeon sets the standard. Also, based on the 8xAA and 14xAA/16xAA numbers in xbit's review of the 7950GX2 and the absolute unbridled ass beating the Crossfire rig gives NV's finest at higher AA levels (weren't you complaining about ATI's AA earlier this week?) words like "subpar" should probably be used sparingly when describing Crossfire.
Originally posted by: Crusader
There's no point to ATI's HDR+AA with only one attractive game and not enough horsepower to push it at 16x10 or 19x12.
See above reply with benchmarks showing the X1900's extreme power consumption, lackluster HDR+AA performance and deafening noise output.
Originally posted by: GundamSonicZeroX
Well, it is 2 cards in one.
Originally posted by: GundamSonicZeroX
Oh well, I'll just wait for the G80 then.
Originally posted by: GundamSonicZeroX
I don't really like ATI.
Originally posted by: GundamSonicZeroX
Thanx for the thread hijacking. Anyway, I always hated ATI ever since I got 3 DOA 9200SEs.
Originally posted by: GundamSonicZeroX
I have an Nvidia board anyway. I won't be able to use crossfire.
Originally posted by: GundamSonicZeroX
ATI branded cards. And THEN a DOA from Xtasy. ATI has to really PWN Nvidia before I change sides.
Originally posted by: GundamSonicZeroX
My loss? More like my gain. I don't want any more burnt out cards!
Originally posted by: GundamSonicZeroX
I wish I knew the release date of that card though!
Originally posted by: GundamSonicZeroX
Gee, thanx for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."
Originally posted by: GundamSonicZeroX
Originally posted by: Lonyo
Originally posted by: GundamSonicZeroX
Gee, thanx for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."
Make sure you don't buy an EVGA card then. Or BFG or XFX.
I only buy Gigabyte or eVGA cards.
Originally posted by: Crusader
I'm not recommending a different card. I'm suggesting he get the 7950 he's asking about
because it's an industry-leading product that is simply unmatched and unanswered by ATI.
There's no point to ATI's HDR+AA with only one attractive game and not enough horsepower to push it at 16x10 or 19x12.
See above reply with benchmarks showing the X1900's extreme power consumption, lackluster HDR+AA performance and deafening noise output.
Where is he asking about it? Truth is, he's waiting for G80 like a lot of us.
Originally posted by: akugami
Originally posted by: Rangoric
Originally posted by: fierydemise
Originally posted by: GundamSonicZeroX
Thanx for the thread hijacking. Anyway, I always hated ATI ever since I got 3 DOA 9200SEs.
So you had some really bad luck and now you limit yourself to one company who may or may not be best. Logic FTW
I dunno sounds reasonable.
How many times must a company burn you before you won't use them anymore?
Not saying he's in the right about ATI but seriously that was a pretty good reason for him not to bother with them.
I had 3, yes 3, DOA Seagate laptop HDs. The last one worked great and is running fine. Musta been a glitch with a bad batch, but I still buy Seagate HDs and have had zero problems with any other Seagate HD I own. Manufacturing processes are not 100% perfect, and even rigorous testing will let some faulty products slip through the cracks.
Having bought 7 or 8 ATI cards over the last few years just for systems in my house, and about 3 nVidia cards, I can say both ATI and nVidia are relatively decent quality-control-wise. And on the flip side, EVGA had problems with their initial runs of 7900GTs. Does that mean EVGA (one of the better video card companies) sucks? Should everyone now stay away from EVGA hardware?
Originally posted by: GundamSonicZeroX
I have an Nvidia board anyway. I won't be able to use crossfire.
You have guys arguing "bang for the buck" here and mentioning SLI, GX2s, Crossfire. News flash: they all suck when considering bang for the buck. Why? For twice the cost (assuming you already have a compatible motherboard), you do not get twice the performance. My personal advice if you were upgrading now: a 7900GT if you want to take the risk of volt modding and overclocking your card to heck, an X1800XT 512MB if you can find one at a good deal and do the same, or an X1900XT if you don't want to overclock. The 7900GTX performs about on par with the X1900XT in most things but costs more, so it's not even considered.
Originally posted by: GundamSonicZeroX
My loss? More like my gain. I don't want any more burnt out cards!
What gain? You're limiting yourself to products from only half of the industry. Why would it be your gain? I understand you had bad experiences with some ATI cards, but even nVidia has had bad runs, as evidenced by the recent EVGA and XFX woes. Those have since been solved, but tell that to all the 7900GT buyers who had to wait in line for an RMA card. Yes, there were so many bad nVidia cards that people had to wait in a queue for replacements. I'm not trying to scare you off nVidia hardware; the long wait for RMA cards was compounded by high initial demand for the 7900GT as well as, in EVGA's case, their step-up program. I'm not trying to push ATI hardware on you either, just saying to keep an open mind. Ultimately the decision on what to buy is yours, and if you're happy with it, who am I to argue that your decision is wrong?
Originally posted by: GundamSonicZeroX
I wish I knew the release date of that card though!
It was on one of the rumor sites, but someone said they saw an ATI roadmap up to Nov and the R600 core was nowhere in sight. Usually nVidia and ATI release hardware within a month of each other unless a manufacturing or other glitch comes up, as was the case with the X1800 launch. There are refresh cards from ATI and nVidia coming in late summer I believe, which doesn't bode well for a quick release of the next series of cards; figure at least 3 months after the refresh release. So... Nov at the earliest, and quite possibly not until early 2007 to coincide with the launch of Windows Vista, since both the G80 and R600 are touting DX10 support.
A single X1900XTX is only powerful enough to run Oblivion with HDR+AA at 12x10, and very few people with X1900XTXs have anything but LCDs with 16x10+ native resolution.
So they have the choice of interpolating the resolution down, which causes distortion, or not using part of their screen, which causes anger.
See above reply for more examples of subpar X1900 performance.
Originally posted by: redbox
Ok Josh, cool it. The OP put in the topic heading "Can a GeForce 7950 do HDR+AA". We all said it cannot. The OP then said oh well, I will just wait for the G80. The OP clearly likes Nvidia stuff better, so why don't we help him with that. Crusader pretty much said don't wait for the G80 because it looks to be a ways off; go with the 7950GX2, which is the most powerful card right now. The only ones really pushing cards the OP won't care about are you and the rest of the ATI fans.
I do like the x1900xt/xtx, but that doesn't mean I am going to shove it down everyone's throat like you seem to want to do. The fact is the OP probably will never switch to ATI until an Nvidia card or cards let him down. So instead of fighting an uphill battle, try and point him in the best direction.
To the OP:
I usually don't agree with Crusader, but for this OP the 7950GX2 would be a good card to enjoy for about 6 or so months until the G80 is here. It will have speed like you won't believe, but it will look just like any other 7-series card.
If you want next-gen hardware, that is ATI right now, but if you would rather lick the wet crack of Roseanne's rear end...
then I would have to tell you to wait it out until the G80. If you can't wait for about 6 months, then go with the 7950GX2. I hope this helps.
Originally posted by: secretanchitman
only reason i prefer nvidia over ati is their superior drivers. i absolutely cannot stand the bloated ati catalyst drivers... yes, i know there is ati tray tools or the omega drivers and whatnot, but those are extra stuff i have to install, unlike nvidia's, where it's just 1 driver for every card (minus the TNT2, geforce 2, etc) and their drivers are clean, streamlined and easy to use.
that's why.
Originally posted by: hemmy
Originally posted by: munky
Originally posted by: Nvidiot
the GX2 is a killer deal for bang/buck and the implementation is very slick.
if you go x1900 crossfire you are still going to get subpar performance, and only get HDR+AA in one currently popular game which is Oblivion. And that even requires a 3rd party hack to get working.
If you appreciate Nvidia software support and hardware, I'd get yourself a GX2 and when they enable quadSLI you will have some true stomping power.
You could buy two X1900XTs, a Crossfire motherboard and use that dongle to connect the very loud and noisy, high power consumption video cards and still get beat by 10FPS in FEAR@[url="http://www.legitreviews.com/article/354/8/"]19x12[/url] or just get yourself a GX2 and enjoy life.
They say two GX2s in quadSLI would take less power than X1900 Crossfire!
Waiting for G80 isnt a bad idea either. But the reality on the HDR+AA game situation is harsh, and current hardware besides the GX2 isnt fast enough to push it at high resolutions.
You forgot to mention that, according to you, in the only game that supports HDR+AA, the gx2 still loses to a single XTX. What were you saying again about current hardware and HDR+AA?
And, the gx2 is just about as loud, hot and powerhungry as a XTX. So, does that mean you're now gonna bash it too for sounding like a jet engine, melting your desk, and requiring its own powerplant, or have you now shifted that definition to dual x1900's now that the green team isn't running so cool anymore?
http://www.xbitlabs.com/articles/video/display/nvidia-gf7950gx2_15.html
