GeForce FX 5950


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: jgravance
I've been reading up on this and it looks like Rollo is right: because the FX series uses 32-bit, it has to go down to 16-bit to run HL2, since it can't run 24-bit. That's just what I've gotten from reading; I don't know if it's true or not.

You'd think with 32 bits they would get stuff to render correctly at least, but I doubt it would make a difference.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: jgravance
I've been reading up on this and it looks like Rollo is right: because the FX series uses 32-bit, it has to go down to 16-bit to run HL2, since it can't run 24-bit. That's just what I've gotten from reading; I don't know if it's true or not.

No, it doesn't go down to 16-bit FP precision; it stays at 32. However, if you want full DX9 at a decent framerate and decent quality settings, then forcing it down to 16-bit isn't a bad idea, though there are some noticeable quality differences.
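(A quick sketch of my own, not anything from the thread or the drivers: the practical difference between these precisions comes down to mantissa width. FX FP32 is s23e8, ATI's FP24 is s16e7, and the FP16 partial-precision mode is s10e5. The smallest relative step each format can represent follows directly from the mantissa bits.)

```python
# Illustration only: relative precision implied by each format's mantissa width.
# Format names and bit widths are the commonly cited DX9-era shader formats.
def machine_epsilon(mantissa_bits):
    """Smallest relative step between adjacent representable values."""
    return 2.0 ** -mantissa_bits

for name, bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    print(f"{name}: relative step ~ {machine_epsilon(bits):.1e}")
# FP16: relative step ~ 9.8e-04
# FP24: relative step ~ 1.5e-05
# FP32: relative step ~ 1.2e-07
```

So FP16 is roughly 60x coarser than FP24 per operation, which is why forcing it can show up visibly while 24-vs-32 usually doesn't.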

-Kevin
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Gamingphreek
Originally posted by: jgravance
I've been reading up on this and it looks like Rollo is right: because the FX series uses 32-bit, it has to go down to 16-bit to run HL2, since it can't run 24-bit. That's just what I've gotten from reading; I don't know if it's true or not.

No, it doesn't go down to 16-bit FP precision; it stays at 32. However, if you want full DX9 at a decent framerate and decent quality settings, then forcing it down to 16-bit isn't a bad idea, though there are some noticeable quality differences.

-Kevin

IMO, using DX8.1 is better than this. Seems to me there are some "irregularities" when you force 16-bit.

Anyway, that's part of the big myth of the "terrible FX shaders": they're not as bad as they seem, they just have to do much more work.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: jgravance
I want VIVO because I'm looking to put all my old VHS tapes on DVDs, so it is pretty important.

Do you have a digital camcorder? If so, it may have digital pass-through: it lets you play a VCR tape, send it through the camcorder, and out over FireWire to your PC. That makes bringing over VCR video pretty easy.
 

Fenuxx

Senior member
Dec 3, 2004
907
0
76
Don't do it unless you know what you're getting yourself into. The FX cards don't do DX9 very well, which may make you steer away from them.


P.S.

POST #200:beer: :D :p :) ;) :cookie:
 

blckgrffn

Diamond Member
May 1, 2003
9,687
4,348
136
www.teamjuchems.com
Wow!

Have any of you used a 5900 and a 9800 Pro side by side? I have! There are times when I preferred the Nvidia card, and times when I would have preferred the ATI. For instance, my dad's 5900 had far fewer driver issues, especially with older games, than my 9800 Pro. By no means are the 5900s "pooh". Frankly, the newest games don't run well on either of them, by which I mean high resolutions or lots of AA/AF at 1024x768. Both of them might as well never have had DX9 support, for all it brings to the table and how much performance suffers on both when it's enabled.

Bottom line - you need to buy current generation to get worthwhile DX9 support. That 5950 will be great in most games, and decent in the latest. For $145, that is a decent deal. If I were in need of a video card for a LAN box or something of the sort, I would not hesitate to snap it up.

My $.02. :)
 

eBauer

Senior member
Mar 8, 2002
533
0
76
Originally posted by: Rollo
Originally posted by: gobucks
actually, the FX series does have problems with full DX9 compliance. That is why HL2 runs in DX8.1. It is possible to run in DX9 mode, but things like water don't render correctly, and performance is abysmal. I think the FX series uses 16-bit pixel shaders that emulate 2.0, while ATI uses the official 24-bit shaders, and nvidia's 6 series uses full 32-bit shaders (3.0).

As for VIVO, aren't there 9800 Pros that have VIVO? I'd rather have a 9800 Pro than a 5950 Ultra, considering how sloppy the FX architecture is, and how horribly it holds up in newer games compared to the 9800 series. Another great suggestion would be kmmatney's since apparently MSI makes a 6600GT VIVO.

Sigh.

The FX series uses 32-bit precision and ATI uses 24-bit, which results in lower performance for the FX. It only uses 16-bit when specifically coded to do so.

I feel your pain - this misinformation gets sickening.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: eBauer
Originally posted by: Rollo
Originally posted by: gobucks
actually, the FX series does have problems with full DX9 compliance. That is why HL2 runs in DX8.1. It is possible to run in DX9 mode, but things like water don't render correctly, and performance is abysmal. I think the FX series uses 16-bit pixel shaders that emulate 2.0, while ATI uses the official 24-bit shaders, and nvidia's 6 series uses full 32-bit shaders (3.0).

As for VIVO, aren't there 9800 Pros that have VIVO? I'd rather have a 9800 Pro than a 5950 Ultra, considering how sloppy the FX architecture is, and how horribly it holds up in newer games compared to the 9800 series. Another great suggestion would be kmmatney's since apparently MSI makes a 6600GT VIVO.

Sigh.

The FX series uses 32-bit precision and ATI uses 24-bit, which results in lower performance for the FX. It only uses 16-bit when specifically coded to do so.

I feel your pain - this misinformation gets sickening.

Read this article on DX9 performance: Linky

Then tell me who's spreading misinformation.
 

Gentle

Senior member
Feb 28, 2004
233
0
0
I think the article pretty much says that Nvidia's NV3x hardware (the FX 5x00 series) runs DirectX 9 software slowly when it adheres to the DirectX 9 specifications.

Gentle
 

eBauer

Senior member
Mar 8, 2002
533
0
76
Originally posted by: munky
Originally posted by: eBauer
Originally posted by: Rollo
Originally posted by: gobucks
actually, the FX series does have problems with full DX9 compliance. That is why HL2 runs in DX8.1. It is possible to run in DX9 mode, but things like water don't render correctly, and performance is abysmal. I think the FX series uses 16-bit pixel shaders that emulate 2.0, while ATI uses the official 24-bit shaders, and nvidia's 6 series uses full 32-bit shaders (3.0).

As for VIVO, aren't there 9800 Pros that have VIVO? I'd rather have a 9800 Pro than a 5950 Ultra, considering how sloppy the FX architecture is, and how horribly it holds up in newer games compared to the 9800 series. Another great suggestion would be kmmatney's since apparently MSI makes a 6600GT VIVO.

Sigh.

The FX series uses 32-bit precision and ATI uses 24-bit, which results in lower performance for the FX. It only uses 16-bit when specifically coded to do so.

I feel your pain - this misinformation gets sickening.

Read this article on DX9 performance: Linky

Then tell me who's spreading misinformation.

I'm talking along the lines of "problems with full DX9 compliance" (it is fully compliant, just slow), and "ATI Radeon 9800 Pro 128MB that runs DirectX 9 games 10x better than the FX 5950".

Yes, slower, but not 10x slower :) The link you gave showed a 5900 at 30 FPS and a 9800 Pro at 60 FPS. That's 2x slower.

 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
You are trying to argue a moot point.

The NV3x generation of graphics cards defaults to 32-bit FP (floating point) precision, while ATI defaults to 24-bit FP precision. For all intents and purposes there is no IQ difference, but 32-bit FP precision lets Nvidia comply with part of the SM3 spec and support a few other features, nothing spectacular for that generation. However, when Nvidia built the NV3x cards they centered them around the GeForce 4 architecture, which ran exceptionally well in DX8.1 and OpenGL. Additionally, as a trade-off, Nvidia opted for a method of anisotropic filtering in which they lose an ALU (Arithmetic Logic Unit) in the process, whereas ATI does not. The result is not only poorer pure AF performance but also, because of the former problem, poor DX9 performance.

One "tweak" that was found for the NV3x cards was a compatibility mode. By using this mode, the card was forced to run at 16-bit FP precision. Unlike the 24-vs-32 case, however, this did result in an IQ loss in some areas. What did you seek to accomplish by posting that link? All it did was reinforce the point we have been making all along.
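(My own rough illustration of why forcing 16-bit costs image quality while 24-vs-32 mostly doesn't: here I emulate reduced precision by simply truncating a double's mantissa to a given bit count. The value 0.123456789 is an arbitrary stand-in for a shader intermediate, not anything from the drivers.)

```python
import struct

def quantize(value, mantissa_bits):
    """Truncate a float's mantissa to `mantissa_bits` (crude precision sketch)."""
    bits = struct.unpack(">Q", struct.pack(">d", value))[0]
    drop = 52 - mantissa_bits          # a double carries a 52-bit mantissa
    bits &= ~((1 << drop) - 1)         # zero out the low mantissa bits
    return struct.unpack(">d", struct.pack(">Q", bits))[0]

x = 0.123456789
print(abs(x - quantize(x, 10)))   # FP16-like error per value
print(abs(x - quantize(x, 23)))   # FP32-like error, several orders smaller
```

A long shader chain repeats this rounding at every operation, which is where the visible banding and "irregularities" in forced-16-bit mode come from.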

Finally, that link's results are skewed: they were run on non-optimized drivers, using an ATI and Valve tech demo. Look at AT benchmarks and the numbers look a little more favorable; nowhere near decent DX9 performance, but still better than what you are seeing there.

-Kevin