Can a GeForce 7950


redbox

Golden Member
Nov 12, 2005
1,021
0
0
Ok, this is to help the OP and anyone else who is trying to pick a card right now. I believe users fall into certain categories, and I will list those with a description for each.

The Budget Speed segment (not to be confused with the Budget IQ segment)
These users want top-end speed and are willing to sacrifice IQ to get it, just not money. They should have a monitor capable of at least 1280x1024.

The cards that would be good for this user are the:

Nvidia: 7900GT for around $300, or the 7600GT for $150-200.
ATI: If that is too much, step down to the X1800XL 512MB from MSI; there is a thread in Hot Deals about it, and it uses the same memory the XT uses. The price on it is $200.

The Budget IQ segment (not to be confused with the Budget Speed segment)
These users want their games to look pretty and don't care if that comes with a speed hit; they just don't want a hit in the wallet. They should have a monitor capable of at least 1280x1024.

The cards that would be good for this user are the:

Nvidia: 7900GT for around $300. Good for turning up the AA and AF and still getting some speed, but it misses out on HDR+AA.
ATI: That X1800XL 512MB mentioned in the Budget Speed segment: $200 for a card that will do HDR+AA at pretty good speeds. The downside is you have to mess with the voltage and overclock to get X1800XT performance. The X1800XT for around $300 makes a really nice Budget IQ card: HDR+AA and the horsepower to pull it off nicely.

The Highend Speed segment
These users will have speed no matter what the cost. They should have monitors able to do 1680x1050 or higher.

Nvidia: 7900GTX or 7950GX2. Both will give excellent speed, with the 7950GX2 beating everything at the same IQ settings. They sell for between the high $400s and low $600s. The 7950GX2 does HDCP but not HDR+AA; end users will have to evaluate what is important to them. HDR+AA is useful now, whereas HDCP has yet to see any use.
ATI: The X1900XT and X1900XTX. These are just as fast in some games as the 7900GTX, but the 7950GX2 beats them both at the same settings. They sell for the high $300s to high $400s depending on the card. They do offer HDR+AA but not HDCP; users need to decide which they value more.

The Highend IQ segment
These users have to have the best picture no matter what the cost. They should have monitors able to do 1680x1050 or higher.

Nvidia: I can't wholeheartedly give a suggestion for Nvidia, because they lack IQ features that users in this category really want to have. That being said, Nvidia looks to change this with their upcoming G80.
ATI: The X1900XT and X1900XTX hold the high-end IQ crown. With excellent features like HQ AF and HDR+AA, they make games look really pretty. They aren't as fast as their Nvidia counterparts, but they make up for it in glamour and still provide playable frame rates.

THE GAMES
FPS games like FEAR need about 40 fps to play nicely and about 60 to enjoy, depending on how well you can see frames.
RPGs like Oblivion need about 20 fps to play nicely and about 40-50 to enjoy, depending on how well you can see frames.

The user needs to keep this in mind when looking at benchmarks. The most important thing to check is whether your favorite game gets playable frame rates at your desired settings. That said, it is also important to look at the newest games and see whether you will get playable frame rates there too; that tells you how long your hardware purchase will last.
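If it helps to make those rules of thumb concrete, here is a rough sketch in Python. The genre thresholds are just the numbers above, and the function name and table are made up for illustration, nothing official:

```python
# Rough playability thresholds from the rules of thumb above (illustrative only).
PLAYABLE_FPS = {"fps": 40, "rpg": 20}   # minimum to "play nicely"
ENJOYABLE_FPS = {"fps": 60, "rpg": 45}  # roughly where it starts to feel smooth

def rate_benchmark(genre: str, avg_fps: float) -> str:
    """Classify a benchmark average against the genre's rough thresholds."""
    if avg_fps >= ENJOYABLE_FPS[genre]:
        return "enjoyable"
    if avg_fps >= PLAYABLE_FPS[genre]:
        return "playable"
    return "too slow"

print(rate_benchmark("rpg", 25))  # Oblivion at 25 fps -> playable
print(rate_benchmark("fps", 25))  # FEAR at 25 fps -> too slow
```

The point is just that the same benchmark number means different things in different genres, so always check the genre before judging a card by a bar chart.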

Last but not least, the VENDORS.
eVGA is the best I have encountered. BFG and XFX are also good.
ATI, Sapphire, and PowerColor are also good.
The one thing I want to point out is that with any vendor there is a possibility of receiving bad hardware. What you need to focus on is how that vendor makes good on it (i.e. quick replacements, easy tech support). With so many board partners, a user should not be turned off a certain brand; rather, they should ask around and see what has been performing well. That is most often the best way to gauge which cards are good at a given time.

I hope this clears things up for the OP and also some of our posters.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Munky.. the GX2 is a better performer than the 1900XTX any way you put it; 99% of reviews on the web agree. Oblivion is the ONLY game where the 1900XTX performs close to the GX2; in everything else the 1900XTX just couldn't keep up with the 7950GX2.

redbox: good job.




 

SonicIce

Diamond Member
Apr 12, 2004
4,771
0
76
Can someone show me HDR+AA screenshots in Far Cry? All I find is some cello site
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Frackal
With one X1900XTX it's only powerful enough to run 12X10 resolution in Oblivion w/HDR+AA, and very few people with X1900XTXs have monitors that are not LCDs with 16X10+ resolution.

So they have the choice of interpolating the resolution down, which causes distortion, or not using part of their screen, which causes anger.
See above reply for more examples of subpar X1900 performance.

That's flat-out bullsht. You are lying. The XTX runs Oblivion smooth as hell at 1680x1050, and has fantastic minimum framerates as well.

Originally posted by: Ackmed
No AA benches in that link, hemmy. A lot of people will choose AA over HDR, given the choice between the two. Of course, ATi X1K owners do not need to decide, as they can do both, with HQ AF to give an even better picture. All with playable frames.

Showing only HDR numbers, and not AA numbers, is only giving half the story. People need more numbers, and deserve more, and reviewers need to be less lazy about it.

http://www.xbitlabs.com/articles/video/display/nvidia-gf7950gx2_15.html
I don't consider 22 fps minimums at 16x12 with HDR, 0xAA/16xAF playable, but hey, if you like to count the frames.....
BTW, think that heads into the teens if you add AA? I do.

http://www.firingsquad.com/hardware/sapphire_radeon_x1900_gt/page11.asp
Darn! The X1900XT can only muster a 25 fps average outdoors at 16x12 with HDR, 0xAA/8xAF, so you know that one is already in the teens for minimums. (Where does AA put it? Single digits?)

So while Ackmed is right that ATI users don't have to choose between HDR and AA on current ATI products, they do have to choose between HDR+AA and stuttering, jerky video.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: munky
And sadly, like the majority of review sites, xbitlabs doesn't know the difference between Quality and High Quality driver settings, and what kind of IQ optimizations are enabled when running NV cards at default driver settings. The article I linked to not only set the driver options to HQ for all cards, but also used AAA or transparency SSAA in their benches. So you can get a clear picture of how the cards really perform when running with all IQ options enabled; neither xbitlabs nor techreport seems to be aware of that.

Yes, it is sad. Especially when NV takes a larger hit going from Q to HQ than ATi does. And NV shimmers pretty badly, depending on the setup, with Q selected instead of HQ.

Originally posted by: Crusader

So while Ackmed is right that ATI users don't have to choose between HDR and AA on current ATI products, they do have to choose between HDR+AA and stuttering, jerky video.

False; as I, and many others (even in this topic) have stated, we get very playable frames with HDR+AA and other options maxed. The link you dropped, Foliage, is probably the most demanding part of the game; of course you would show that. It's also not a heavily played one, so good effort on the misdirection. That link also shows the XT, not the XTX or Crossfire numbers. Yet you claim that all ATi users do not get playable frames, based on the XT and the lowest-number benchmark you could find. The previous page shows 43 frames, much higher than the one you linked. Imagine that. Lookie here: Foliage very playable, and faster than quad SLI in Oblivion, at 1920x1200 with HDR and AA, at 44 fps. The mountains and indoor sections (most of what is played) of FS's Oblivion benchmarks show much, much higher numbers, yet you only provide a link to the lowest frames you can find. Again, imagine that. Oblivion is not a typical FPS shooter; you do not need 60+ frames for an optimal game.

The simple fact is, ATi can do it, and NV can't, and that's why you're upset and try to downplay it. If it were the other way around, you would be all for it.

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Crusader
Originally posted by: Frackal
With one X1900XTX it's only powerful enough to run 12X10 resolution in Oblivion w/HDR+AA, and very few people with X1900XTXs have monitors that are not LCDs with 16X10+ resolution.

So they have the choice of interpolating the resolution down, which causes distortion, or not using part of their screen, which causes anger.
See above reply for more examples of subpar X1900 performance.

That's flat-out bullsht. You are lying. The XTX runs Oblivion smooth as hell at 1680x1050, and has fantastic minimum framerates as well.

Originally posted by: Ackmed
No AA benches in that link, hemmy. A lot of people will choose AA over HDR, given the choice between the two. Of course, ATi X1K owners do not need to decide, as they can do both, with HQ AF to give an even better picture. All with playable frames.

Showing only HDR numbers, and not AA numbers, is only giving half the story. People need more numbers, and deserve more, and reviewers need to be less lazy about it.

http://www.xbitlabs.com/articles/video/display/nvidia-gf7950gx2_15.html
I don't consider 22 fps minimums at 16x12 with HDR, 0xAA/16xAF playable, but hey, if you like to count the frames.....
BTW, think that heads into the teens if you add AA? I do.

http://www.firingsquad.com/hardware/sapphire_radeon_x1900_gt/page11.asp
Darn! The X1900XT can only muster a 25 fps average outdoors at 16x12 with HDR, 0xAA/8xAF, so you know that one is already in the teens for minimums. (Where does AA put it? Single digits?)

So while Ackmed is right that ATI users don't have to choose between HDR and AA on current ATI products, they do have to choose between HDR+AA and stuttering, jerky video.

I think, Crusader, that you feel 25 fps is not playable for this game. The fact is, it is NOT like a normal first-person shooter, where you really want at least 40-60. 25-30 is very playable and nice in Oblivion.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Ackmed
Originally posted by: Crusader

So while Ackmed is right that ATI users don't have to choose between HDR and AA on current ATI products, they do have to choose between HDR+AA and stuttering, jerky video.

False, as I, and many others (even in this topic) have stated, we get very playable frames with HDR+AA, and other options maxed. The link you dropped, Foliage, is probably the most demanding part of the game, of course you would show that. Its also not a very played one, so good effort on the misdirection. That link also shows the XT, not the XTX, or Crossfire numbers. Yet you claim that all ATi users do not get playable frames, based on the XT and the lowest number benchmark you could find. The previous page shows 43 frames, much higher than the one you linked. Imagine that. Lookie here Foliage very playable, and faster than quad SLI in Oblivion, at 1920x1200, HDR and AA, at 44fps. The mountains and indoor section (most of what is played) of FS's Oblivion benchmarks show much, much higher numbers, yet you only provide a link to the lowest frames you can find. Again, imagine that. Oblivion is not a typical FPS shooter. You do not need 60+ frames to get an optimal game.

The simple fact is, ATi can do it, and NV cant, and thats why you're upset, and try to downplay it. If it was the other way around, you would be all for it.

Taking the BEST-case scenario is MUCH more misleading to a consumer than taking the WORST-case scenario. The simple fact is that you play ALL of the game, not just the parts where ATI can push HDR+AA. Tough scrutiny and facing the truth of the matter is the best representation of a product, not letting it off easy and turning a blind eye to cases where your favored product does poorly.
No one gives any other technology or product a free pass, so why do you give ATI's disgustingly slow HDR+AA a pass? You are trying to misconstrue the facts to sell more ATI cards on a checkbox feature.

43 fps in Oblivion with HDR+AA wouldn't convince me to spend $900 on two X1900XTXs, buy a Crossfire motherboard, void my warranty with a couple of aftermarket coolers to escape the 58 dB hair-dryer noise, and at the end of it all be left with a solution that does not scale well in games, especially with tiling. The one thing I'll give you is that Super AA works well, but the rest of the Crossfire mess is just substandard.
If nVidia put out a solution as bizarre and limited as Crossfire, you would be on every board, all day long, talking about it, Ackmed.
Besides, this thread is about the 7950GX2 vs. the X1900, not Crossfire.

Originally posted by: redbox
I think, Crusader, that you feel 25 fps is not playable for this game. The fact is, it is NOT like a normal first-person shooter, where you really want at least 40-60. 25-30 is very playable and nice in Oblivion.

I'm agreeing with you: 25-30 is playable. But the link I provided is running HDR without AA applied, at 25 fps (which is at the bottom of your 25-30 fps playable range). Note that the 25 fps score is an average, not a minimum.
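To make the average-vs-minimum distinction concrete, here is a rough sketch; the per-frame times are made-up numbers, purely illustrative:

```python
# Hypothetical per-frame render times in milliseconds (made-up numbers).
frame_times_ms = [30, 32, 31, 70, 33, 31, 85, 30]

# Average fps = total frames over total time; minimum fps comes from the
# single slowest frame. A decent average can hide stuttering minimums.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
# average: 23.4 fps, minimum: 11.8 fps
```

A run that averages in the mid-20s can still dip well into the teens during the worst frames, which is exactly why a 25 fps average outdoors implies much lower minimums.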
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
You can provide all the links you want, but when you openly state that the X1900 cannot get good frames with HDR+AA enabled at 1680x1050 while I am sitting here doing just the opposite of what you claim, I can't help but tell you that you are wrong. I don't care if you believe me or not, and I'm not saying those sites didn't get those frames, but I do know that your biased summarization of something you have not experienced is something that needs to be commented on.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
WOW, does everybody here only game Oblivion? I think there are way better games out there, and I certainly wouldn't buy a card just for its performance in one game!!

Crusader, stop fcuking dribbling about the 7950/X1900 HDR+AA shite. Josh, you are just as bad.

FTR, I too balk at ATI, as I have had 15 years of IT experience supporting their ****** (>4 yrs) hardware and crap drivers.

ATI, however, have now got some wicked kit. Unfortunately their software is still ******, but there are plenty of custom driver sets out there to use.

My 2c
 

Sonikku

Lifer
Jun 23, 2005
15,908
4,940
136
Originally posted by: fierydemise
Originally posted by: GundamSonicZeroX
ATI branded cards. And THEN a DOS from Xtasy. ATI has to really PWN Nvidia before I change sides.

It doesn't look like anything is going to change your mind so all I can say is your loss

You said it. :thumbsup:
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: SolMiester
WOW, does everybody here only game Oblivion? I think there are way better games out there, and I certainly wouldn't buy a card just for its performance in one game!!

Crusader, stop fcuking dribbling about the 7950/X1900 HDR+AA shite. Josh, you are just as bad.

FTR, I too balk at ATI, as I have had 15 years of IT experience supporting their ****** (>4 yrs) hardware and crap drivers.

ATI, however, have now got some wicked kit. Unfortunately their software is still ******, but there are plenty of custom driver sets out there to use.

My 2c

It's naive of you to think I just game Oblivion. That was simply a point in the discussion between Crusader and me, which I am done with now. I only wanted to make it clear that the performance he was linking, and suggesting one will get with those settings and an X1900, was false. It's fine if you think I'm just as bad as him; that's your opinion. However, I will not sit idle while someone passes complete FUD around in a thread. If he wants to continue further with it, he can PM me. Sorry for trailing off. You're right, arguing with him is useless.

I think you're also getting the ATI drivers confused with ATI's CCC, but correct me if I'm wrong, because I'm only assuming. They are not the same thing. In fact, ATI's drivers are doing pretty decently right now, and Nvidia's latest drivers are beginning to function more like ATI's CCC. Really though, I liked Nvidia's streamlined drivers and how easily they meshed with the OS, yet I can't complain about ATI's either, simply because I haven't had a problem with their drivers. Heck, with any card I end up getting some enhanced driver from a third-party source anyway, so it doesn't really matter for me. Although I know some people only want well-supported things, so.......
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: GundamSonicZeroX
My loss? More like my gain. I don't want any more burnt out cards!

ATI cards work fine. You just got bad luck.

Look at all the problems people have with 7900GT's and 7900GTX's. I haven't had a single problem w/ my X1900XT.
 

fbrdphreak

Lifer
Apr 17, 2004
17,555
1
0
Originally posted by: Crusader
the GX2 is a killer deal for bang/buck, and the implementation is very slick.

If you go X1900 Crossfire you are still going to get subpar performance, and you only get HDR+AA in one currently popular game, which is Oblivion. And even that requires a 3rd-party hack to get working.

If you appreciate Nvidia software support and hardware, I'd get yourself a GX2, and when they enable quad SLI you will have some true stomping power.

You could buy two X1900XTs and a Crossfire motherboard, use that dongle to connect the very loud, noisy, high-power-consumption video cards, and still get beat by 10 fps in FEAR at [url="http://www.legitreviews.com/article/354/8/"]19x12[/url], or just get yourself a GX2 and enjoy life.
They say two GX2s in quad SLI would take less power than X1900 Crossfire!

Waiting for G80 isn't a bad idea either. But the reality of the HDR+AA game situation is harsh, and current hardware besides the GX2 isn't fast enough to push it at high resolutions.
I hope dearly that no one is listening to this guy; after all, his sig proudly explains his fanboyism:

Geforce 7- The cool, quiet, low power consumption solution with the best single slot and dual slot card available, superior multiGPU implementation and driver support in Windows and Linux, all while holding the performance lead.
As of today, both ATI and NV cards still shimmer.

Riva128/TNT/TNT2/GF/GF2/GF3/GF4/GF6/GF7=NV Domination over ATI. Sorry!

2.5ghz A64 - BFG 7900GTX - SATA Seagate RAID0 - SATA Plextor DL DVD Burner - NF4 DFI SLI - 2005FPW