Can a GeForce 7950 do HDR+AA?


LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
He will get a bad NV card sooner or later.

He must not frequent the boards much, as there are plenty of reports of bad cards from both sides.

Reminds me of the "XXX hard drives suck! I'm never buying another one!" threads.....

Many people would say it wasn't the cards anyway, when a person claims 3 or 4 bad ones in a row.....
 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Gee, thanks for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Someone should ask nitromullet which company sucks more; he's had bad luck with both ATI and Nvidia and could probably tell ya.
Well, I can tell you this much... HIS definitely sucks more (way more) than eVGA, which are the two companies I've dealt with. That being said, I think there are more 7900GT hardware failures in general than there are X1900XT(X) failures.

And I have no idea why he was saying it was the best bang for buck, when clearly in that area the X1900XTX Xfire setup is the best bang for buck.
That's quite a stretch IMO. Personally, I think that an X1900XT Crossfire rig is probably the overall fastest dual card rig you can build, and is the undisputed 8xAA-and-up king, which makes it the IQ king as well. However, an X1900XT + X1900 CF + Crossfire motherboard isn't really what I would call a good bang-for-the-buck setup. First of all, the two cards are pretty expensive in their own right. Then you have to consider the motherboards, which tend to be pricier than SLI boards with similar features, and have a much smaller selection to choose from. IMO, 7900GT SLI or the 7950GX2 is a better bang for the buck than an X1900XT Crossfire rig. That being said, you get what you pay for...

If you go X1900 Crossfire you are still going to get subpar performance, and only get HDR+AA in one currently popular game, which is Oblivion. And that even requires a third-party hack to get working.
Subpar compared to what? For there to be subpar, there has to be par first. Seeing as how there is no card out that can do HDR+AA faster, it would appear that the Radeon sets the standard. Also, based on the 8xAA and 14xAA/16xAA numbers in Xbit's review of the 7950GX2, and the absolute unbridled ass-beating the Crossfire rig gives NV's finest at higher AA levels (weren't you complaining about ATI's AA earlier this week?), words like "subpar" should probably be used sparingly when describing Crossfire.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: GundamSonicZeroX
Gee, thanks for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."

Make sure you don't buy an EVGA card then. Or BFG or XFX.
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
Originally posted by: GundamSonicZeroX
I wish I knew the release date of that card though :(

Late this year to early next year. You'll be doing quite a bit of waiting...
 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Originally posted by: Lonyo
Originally posted by: GundamSonicZeroX
Gee, thanks for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."

Make sure you don't buy an EVGA card then. Or BFG or XFX.

I only buy Gigabyte or eVGA cards.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: munky
Originally posted by: Nvidiot
the GX2 is a killer deal for bang/buck and the implementation is very slick.

If you go X1900 Crossfire you are still going to get subpar performance, and only get HDR+AA in one currently popular game, which is Oblivion. And that even requires a third-party hack to get working.

If you appreciate Nvidia software support and hardware, get yourself a GX2, and when they enable quad SLI you will have some true stomping power.

You could buy two X1900XTs and a Crossfire motherboard, use that dongle to connect the very loud, high-power-consumption video cards, and still get beat by 10 FPS in FEAR @ [url="http://www.legitreviews.com/article/354/8/"]19x12[/url], or just get yourself a GX2 and enjoy life.
They say two GX2s in quadSLI would take less power than X1900 Crossfire!

Waiting for G80 isn't a bad idea either. But the reality of the HDR+AA game situation is harsh, and current hardware besides the GX2 isn't fast enough to push it at high resolutions.

You forgot to mention that, according to you, in the only game that supports HDR+AA, the GX2 still loses to a single XTX. What were you saying again about current hardware and HDR+AA?
And the GX2 is just about as loud, hot and power-hungry as an XTX. So, does that mean you're now gonna bash it too for sounding like a jet engine, melting your desk, and requiring its own power plant, or have you now shifted that definition to dual X1900s now that the green team isn't running so cool anymore?

As far as Oblivion HDR goes, the GX2 still beats the XTX at 19x12, and absolutely owns at 20x15:
http://www.motherboards.org/reviews/hardware/1624_7.html

As far as power goes, the GX2 draws less at load than a single XTX:
http://techreport.com/reviews/2006q2/geforce-7950-gx2/index.x?pg=8

GX2 Noise:
http://www.guru3d.com/article/Videocards/353/4/

"We startup a benchmark, we take the dBA meter, move away 75 CM and then aim the device at the active fan on the graphics card. We measure roughly 42-43 dBa, which is to be considered a quiet to moderate noise level coming from the PC. "

X1900XTX Crossfire Noise:
http://www.guru3d.com/article/Videocards/332/5/
"We startup a benchmark, we take the dBA meter, move away 75 CM and then aim the device at the active fan on the graphics card. We measure almost 58 dBa which is to be considered a moderate to noisy noise level coming from the PC. Again, this is a very subjective test, but hey .. that's a lot. "

X1900XTX Noise:
http://www.guru3d.com/article/Videocards/329/4/
"We startup a benchmark and leave it running for a while. The fan starts to rotate faster and makes a moderate noise. We take the dBA meter, move away 75 CM and then aim the device at the active fan on the graphics card. We measure roughly 50-53 dBa which is to be considered a quiet to moderate noise level coming from the PC yet also a norm as most of our tests end up at this sound level."

FEAR @ 19x12, 4xAA/16xAF:
http://www.guru3d.com/article/Videocards/353/11/
72 fps for the XFX GX2 with an FX-62 CPU (78 fps for 7900GTX SLI)
http://www.guru3d.com/article/Videocards/332/14/
And there's a $900+ Crossfire rig running the same bench at 59 fps on an FX-57. P3wned.
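
Run the raw price-to-performance math on those two results yourself if you want -- note the ~$600 GX2 price below is my ballpark assumption for a launch street price, not a figure from the links above:

[code]
# Rough fps-per-dollar comparison using the FEAR 19x12 numbers quoted above.
# The ~$600 GX2 price is an assumed ballpark street price; the $900+
# Crossfire rig price comes from the post above.

def fps_per_dollar(fps, cost):
    return fps / cost

gx2 = fps_per_dollar(72, 600)        # XFX 7950GX2, assumed ~$600
crossfire = fps_per_dollar(59, 900)  # X1900 Crossfire rig, $900+

print(f"GX2:       {gx2:.3f} fps per dollar")  # ~0.120
print(f"Crossfire: {crossfire:.3f} fps per dollar")  # ~0.066
[/code]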

Originally posted by: nitromullet
Subpar compared to what? For there to be subpar, there has to be par first. Seeing as how there is no card out that can do HDR+AA faster, it would appear that the Radeon sets the standard. Also, based on the 8xAA and 14xAA/16xAA numbers in Xbit's review of the 7950GX2, and the absolute unbridled ass-beating the Crossfire rig gives NV's finest at higher AA levels (weren't you complaining about ATI's AA earlier this week?), words like "subpar" should probably be used sparingly when describing Crossfire.

With one X1900XTX, it's only powerful enough to run Oblivion w/ HDR+AA at 12x10, and very few people with X1900XTXs have monitors that are not 16x10+ LCDs.

So they have the choice of interpolating the resolution down, which causes distortion, or not using part of their screen, which causes anger.
See above reply for more examples of subpar X1900 performance.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Like I said, it should be faster, Crusader; it just doesn't look as good. Period. If I were to disable my HQ AF and my HDR+AA, my X1900XTX would be getting along better than a GTX. However, I choose to have my game look better and still play fine. Yes, it will not get frames like the 7950, because the 7950 has two GPUs--it should win, for crying out loud. Fact is, it is the 6800 with a smaller die, higher clocks (but not as high as other 7900 cores), and another butt buddy attached to it to help it do its work. I've already seen what this card can do. My volt-modded 7800GTs performed on par with stock 7900s, so I basically had a stock 7900GT SLI setup, which is very comparable to this card. I'm still more impressed with my X1900 than that, simply because I get very playable frame rates and better graphics.

It's nice that you can argue with munky, but do realize that you're the first one who came into a thread where the OP was asking a technical question and began recommending a different card. Do you ever answer someone's question without telling them to bleed their bank account? That says something, you know.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
I'm not recommending a different card. I'm suggesting he get the 7950 he's asking about, because it's an industry-leading product that is simply unmatched and unanswered by ATI. There's no point to ATI's HDR+AA with only one attractive game and not enough horsepower to push it at 16x10 or 19x12.

See above reply with benchmarks showing the X1900's extreme power consumption, lackluster HDR+AA performance and deafening noise output.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
I think it has something to do with the drivers, but both work on the Source engine, so I guess that's the only engine with that feature.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Originally posted by: Crusader
There's no point to ATI's HDR+AA with only one attractive game and not enough horsepower to push it at 16x10 or 19x12.

You mean sort of like how the GeForce 6 series was dog slow when HDR was enabled? I didn't see people having trouble recommending those cards for the advanced featureset, even though they were beaten by the X850 in many games...

See above reply with benchmarks showing the X1900's extreme power consumption, lackluster HDR+AA performance and deafening noise output.

Which of the links you provided tests HDR+AA, to back up your claim of lacklustre HDR+AA performance? There are plenty of people who do HDR+AA in more than one game just on this board, and several of them have said it is very playable.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: GundamSonicZeroX
Well, it is 2 cards in one.

Originally posted by: GundamSonicZeroX
Oh well, I'll just wait for the G80 then.

Originally posted by: GundamSonicZeroX
I don't really like ATI.

Originally posted by: GundamSonicZeroX
Thanks for the thread hijacking. Anyway, I always hated ATI ever since I got 3 DOA 9200SEs.

Originally posted by: GundamSonicZeroX
I have an Nvidia board anyway. I won't be able to use Crossfire.

Originally posted by: GundamSonicZeroX
ATI branded cards. And THEN a DOA from Xtasy. ATI has to really PWN Nvidia before I change sides.

Originally posted by: GundamSonicZeroX
My loss? More like my gain. I don't want any more burnt out cards!

Originally posted by: GundamSonicZeroX
I wish I knew the release date of that card though :(

Originally posted by: GundamSonicZeroX
Gee, thanks for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."

Originally posted by: GundamSonicZeroX
Originally posted by: Lonyo
Originally posted by: GundamSonicZeroX
Gee, thanks for the good luck. And the cards were tested on many, many systems (I work at a locally owned PC repair shop), but like my relationship with AMD: "The competition will have to put out a product that really kicks some ass before I switch over."

Make sure you don't buy an EVGA card then. Or BFG or XFX.

I only buy Gigabyte or eVGA cards.

And you say

Originally posted by: Crusader
I'm not recommending a different card. I'm suggesting he get the 7950 he's asking about
Where is he asking about it? Truth is, he's waiting for G80 like a lot of us.

because it's an industry-leading product that is simply unmatched and unanswered by ATI.

Gotta make your quota? It isn't unmatched; in fact, ATI's is right now unmatched, considering Nvidia cannot do HDR+AA or angle-independent AF. Until Nvidia begins supporting current features, they are not comparable in any benchmark, because they're running dated AF and an almost unusable HDR+AA method (Source is the only engine where both can be used). I wouldn't buy a game that cannot use AA even if people say that game will give you better frames. Of course it would; it isn't pushing modern features. Same thing with Nvidia here. It's pretty sad when it can only do what the 6800 does, only with better clocks and performance.

There's no point to ATI's HDR+AA with only one attractive game and not enough horsepower to push it at 16x10 or 19x12.

See above reply with benchmarks showing the X1900's extreme power consumption, lackluster HDR+AA performance and deafening noise output.

Once again you distort the truth. You don't own an ATI card; I do. 1680x1050 is fine with HDR+AA. I get just about the same frames as if I disable AA and only run HDR. 20 is the lowest it really gets outdoors. After taking a Sigil stone is when it gets painful. Care to tell us more lies, Crusader?
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: akugami
Originally posted by: Rangoric
Originally posted by: fierydemise
Originally posted by: GundamSonicZeroX
Thanks for the thread hijacking. Anyway, I always hated ATI ever since I got 3 DOA 9200SEs.
So you had some really bad luck and now you limit yourself to one company who may or may not be best.
Logic FTW

I dunno, sounds reasonable.

How many times must a company burn you before you won't use them anymore?

Not saying he's in the right about ATI, but seriously, that was a pretty good reason for him not to bother with them.

I had 3, yes 3, DOA Seagate laptop HDs. The last one worked great and is running fine. Must have been a glitch with a bad batch, but I still buy Seagate HDs and have had zero problems with any other Seagate HD I own. Manufacturing processes are not 100% perfect, and even rigorous testing will let some faulty products slip through the cracks.

Having bought 7 or 8 ATI cards over the last few years for just the systems in my house, and about 3 nVidia cards, I can say both ATI and nVidia are relatively decent quality-control-wise. And on the flip side, EVGA had problems with their initial runs of 7900GTs; does that mean EVGA (one of the better video card companies) sucks? Should everyone now stay away from EVGA hardware?

Originally posted by: GundamSonicZeroX
I have an Nvidia board anyway. I won't be able to use Crossfire.

You have guys arguing "bang for the buck" here and mentioning SLI, GX2s and Crossfire. News flash: they all suck when considering bang for the buck. Why? For twice the cost (assuming you already have a compatible motherboard), you do not get twice the performance. My personal advice, if you were upgrading now: a 7900GT if you want to take the risk of volt modding and overclocking your card to heck, an X1800XT 512MB (doing the same) if you can find one at a good deal, or an X1900XT if you don't want to overclock. The 7900GTX performs about on par with the X1900XT in most things but costs more, so it's not even considered.
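
To put the "twice the cost, not twice the performance" point in back-of-the-envelope terms, here's a sketch -- the scaling figures are assumed typical values for illustration, not benchmarks:

[code]
# Why doubling up on cards rarely doubles your value. The scaling factors
# below are assumed illustrative values (dual-GPU setups of this era
# commonly scaled somewhere around 1.5-1.7x), not measured results.

def value_ratio(scaling, cost_multiplier=2.0):
    """Performance-per-dollar of a dual-card setup relative to one card."""
    return scaling / cost_multiplier

for scaling in (1.5, 1.7, 2.0):
    print(f"{scaling:.1f}x scaling at 2x cost -> "
          f"{value_ratio(scaling):.2f}x the bang for the buck")
# Only perfect 2.0x scaling preserves fps-per-dollar; anything less means
# the second card lowers your bang for the buck.
[/code]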

Originally posted by: GundamSonicZeroX
My loss? More like my gain. I don't want any more burnt out cards!

What gain? You're limiting yourself to products from only half of the industry. Why would it be your gain? I understand you had bad experiences with some ATI cards, but even nVidia has had bad runs, as evidenced by the recent EVGA and XFX woes. Which they already solved, but tell that to all the 7900GT buyers who had to wait in line for an RMA card. Yes, there were so many bad nVidia cards that people had to wait in a queue for replacements. I'm not trying to scare you off nVidia hardware; the long wait for RMA cards was compounded by high initial demand for the 7900GT as well as, in EVGA's case, their step-up program. I'm not trying to push ATI hardware on you either, just saying to keep an open mind. Ultimately the decision on what to buy is yours, and if you're happy with it, who am I to argue that your decision is wrong?

Originally posted by: GundamSonicZeroX
I wish I knew the release date of that card though :(

On one of the rumor sites, someone said they saw an ATI roadmap going up to November, and the R600 core was nowhere in sight. Usually nVidia and ATI release hardware within a month of each other unless a manufacturing or other glitch comes up, as was the case with the X1800 launch. There are refresh cards from ATI and nVidia coming in late summer, I believe, which doesn't bode well for a quick release of the next series of cards; figure at least 3 months after the refresh release. So... November at the earliest, and quite possibly not until early 2007 to coincide with the launch of Windows Vista, since both the G80 and R600 are touting DX10 support.

EDIT: Corrected a typo which made a sentence look funny.
 

secretanchitman

Diamond Member
Apr 11, 2001
9,353
23
91
The only reason I prefer nVidia over ATI is their superior drivers. I absolutely cannot stand the bloated ATI Catalyst drivers... yes, I know there are ATI Tray Tools or the Omega drivers and whatnot, but those are extra stuff I have to install, unlike nVidia's, where it's just one driver for every card (minus the TNT2, GeForce 2, etc.) and their drivers are clean, streamlined and easy to use.

That's why.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
OK Josh, cool it. The OP put in the topic heading "Can a GeForce 7950 do HDR+AA". We all said it cannot. The OP then said, oh well, he will just wait for the G80. The OP clearly likes Nvidia stuff better, so why don't we help him with that? Crusader pretty much said don't wait for the G80 because it looks to be a ways off; go with the 7950GX2, which is the most powerful card right now. The only ones really pushing cards the OP won't care about are you and the rest of the ATI fans.

I do like the X1900XT/XTX, but that doesn't mean I am going to shove it down everyone's throat like you seem to want to do. The fact is the OP probably will never switch to ATI until an Nvidia card or cards lets him down. So instead of fighting an uphill battle, try and point him in the best direction.


To the OP:
I usually don't agree with Crusader, but for this OP the 7950GX2 would be a good card to enjoy for about 6 or so months until the G80 is here. It will have speed like you won't believe, but it will look just like any other 7-series card.

If you want next-gen hardware, that is ATI right now; but if you would rather lick the wet crack of Roseanne's rear end, then I would have to tell you to wait it out until the G80. If you can't wait for about 6 months, then go with the 7950GX2. I hope this helps.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
With one X1900XTX, it's only powerful enough to run Oblivion w/ HDR+AA at 12x10, and very few people with X1900XTXs have monitors that are not 16x10+ LCDs.

So they have the choice of interpolating the resolution down, which causes distortion, or not using part of their screen, which causes anger.
See above reply for more examples of subpar X1900 performance.

That's flat-out bullsht. You are lying. The XTX runs Oblivion smooth as hell at 1680x1050, and has fantastic minimum framerates as well.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: redbox
OK Josh, cool it. The OP put in the topic heading "Can a GeForce 7950 do HDR+AA". We all said it cannot. The OP then said, oh well, he will just wait for the G80. The OP clearly likes Nvidia stuff better, so why don't we help him with that? Crusader pretty much said don't wait for the G80 because it looks to be a ways off; go with the 7950GX2, which is the most powerful card right now. The only ones really pushing cards the OP won't care about are you and the rest of the ATI fans.

I never suggested the X1900 to the OP. I simply brought up the fact that Crusader's remarks concerning its performance in Oblivion were wrong. You know I get good frames in Oblivion with HDR+AA @ 1680x1050. I was simply talking about it in regards to Crusader's comments. Whether the OP buys a 7950 or a G80 or an X1900, I couldn't care less.

I do like the X1900XT/XTX, but that doesn't mean I am going to shove it down everyone's throat like you seem to want to do. The fact is the OP probably will never switch to ATI until an Nvidia card or cards lets him down. So instead of fighting an uphill battle, try and point him in the best direction.

I'm not trying to shove it down his throat. Once again, I was using it in a context that was a response to Crusader.


To the OP:
I usually don't agree with Crusader, but for this OP the 7950GX2 would be a good card to enjoy for about 6 or so months until the G80 is here. It will have speed like you won't believe, but it will look just like any other 7-series card.

I agree with that, just not with the idea that it's better than the X1900, because I believe they offer different things. Current features that are most relevant to games are not supported on it, and therefore it does not get my vote, no matter how fast it performs.

If you want next-gen hardware, that is ATI right now; but if you would rather lick the wet crack of Roseanne's rear end...

That's just wrong and you know it :p

then I would have to tell you to wait it out until the G80. If you can't wait for about 6 months, then go with the 7950GX2. I hope this helps.

I didn't know the OP wanted to buy a card right now (he was simply asking whether Nvidia, this being a new high-end card, has done anything different from its normal 7-series and enabled HDR+AA--which they haven't). From the sounds of it, he is content with waiting for the G80, like I'm going to do. If the guy wants to wait, then let him wait... I'm not trying to shove a card down his throat or tell him Nvidia is worthless (they're certainly not). I'm simply saying that the X1900 is a much better card than Crusader makes it out to be, so take his advice with a big bag of salt. The 7950 is impressive, but when I look at how many other options Nvidia has provided, I can't see it being worth its weight in gold.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: secretanchitman
The only reason I prefer nVidia over ATI is their superior drivers. I absolutely cannot stand the bloated ATI Catalyst drivers... yes, I know there are ATI Tray Tools or the Omega drivers and whatnot, but those are extra stuff I have to install, unlike nVidia's, where it's just one driver for every card (minus the TNT2, GeForce 2, etc.) and their drivers are clean, streamlined and easy to use.

That's why.

Correction: ATI's drivers are roughly on par with nVidia's. It's ATI's CCC that is bloated. Big difference. If you download just the drivers alone, they are not that bad. But of course you lose the ability to change your graphics card's settings unless you choose to install something like ATI Tray Tools.
 

hemmy

Member
Jun 19, 2005
191
0
0
Originally posted by: munky
Originally posted by: Nvidiot
the GX2 is a killer deal for bang/buck and the implementation is very slick.

If you go X1900 Crossfire you are still going to get subpar performance, and only get HDR+AA in one currently popular game, which is Oblivion. And that even requires a third-party hack to get working.

If you appreciate Nvidia software support and hardware, get yourself a GX2, and when they enable quad SLI you will have some true stomping power.

You could buy two X1900XTs and a Crossfire motherboard, use that dongle to connect the very loud, high-power-consumption video cards, and still get beat by 10 FPS in FEAR @ [url="http://www.legitreviews.com/article/354/8/"]19x12[/url], or just get yourself a GX2 and enjoy life.
They say two GX2s in quadSLI would take less power than X1900 Crossfire!

Waiting for G80 isn't a bad idea either. But the reality of the HDR+AA game situation is harsh, and current hardware besides the GX2 isn't fast enough to push it at high resolutions.

You forgot to mention that, according to you, in the only game that supports HDR+AA, the GX2 still loses to a single XTX. What were you saying again about current hardware and HDR+AA?
And the GX2 is just about as loud, hot and power-hungry as an XTX. So, does that mean you're now gonna bash it too for sounding like a jet engine, melting your desk, and requiring its own power plant, or have you now shifted that definition to dual X1900s now that the green team isn't running so cool anymore?

http://www.xbitlabs.com/articles/video/display/nvidia-gf7950gx2_15.html
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
No AA benches in that link, hemmy. A lot of people will choose AA over HDR, given the choice between the two. Of course, ATi X1K owners do not need to decide, as they can do both, with HQ AF to give an even better picture. All with playable frames.

Showing only HDR numbers, and not AA numbers, is only giving half the story. People need more numbers, and deserve more, and reviewers need to be less lazy about it.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: hemmy
Originally posted by: munky
Originally posted by: Nvidiot
the GX2 is a killer deal for bang/buck and the implementation is very slick.

If you go X1900 Crossfire you are still going to get subpar performance, and only get HDR+AA in one currently popular game, which is Oblivion. And that even requires a third-party hack to get working.

If you appreciate Nvidia software support and hardware, get yourself a GX2, and when they enable quad SLI you will have some true stomping power.

You could buy two X1900XTs and a Crossfire motherboard, use that dongle to connect the very loud, high-power-consumption video cards, and still get beat by 10 FPS in FEAR @ [url="http://www.legitreviews.com/article/354/8/"]19x12[/url], or just get yourself a GX2 and enjoy life.
They say two GX2s in quadSLI would take less power than X1900 Crossfire!

Waiting for G80 isn't a bad idea either. But the reality of the HDR+AA game situation is harsh, and current hardware besides the GX2 isn't fast enough to push it at high resolutions.

You forgot to mention that, according to you, in the only game that supports HDR+AA, the GX2 still loses to a single XTX. What were you saying again about current hardware and HDR+AA?
And the GX2 is just about as loud, hot and power-hungry as an XTX. So, does that mean you're now gonna bash it too for sounding like a jet engine, melting your desk, and requiring its own power plant, or have you now shifted that definition to dual X1900s now that the green team isn't running so cool anymore?

http://www.xbitlabs.com/articles/video/display/nvidia-gf7950gx2_15.html

And sadly, like the majority of review sites, xbitlabs doesn't know the difference between the Quality and High Quality driver settings, or what kind of IQ optimizations are enabled when running NV cards at the default driver settings. The article I linked to not only set the driver options to HQ for all cards, but also used AAA or transparency SSAA in its benches, so you can get a clear picture of how the cards really perform with all IQ options enabled. Neither xbitlabs nor techreport seems to be aware of that.