nv Geforce FX 5950 Ultra vs ATi 9800XT

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I think the Geforce FX was way better even though it was way slower. It had a much better IQ feature set (better AA, superior filtering, and the w-buffer, which a lot of games were designed to use), and it ran a lot cooler (at least the Gainward 5900 Ultra I had in a custom system from the now-defunct Monarch did; I don't know how cool the 5950 Ultra ran).

Am I the only one who thinks that R300 sucked?

I had one in a custom Hypersonic PC, and the 9700 Pro sucked so badly that I sold the whole machine. I got the Monarch system mentioned above with a Geforce FX 5900 Ultra a year later, and it was a lot better than the R300 I'd had for 4 months, until, of course, the 50 series drivers came out.

The Geforce 6800GT came out at the same time as the 60 series drivers, in which you could finally disable the trilinear optimization that was forced throughout the 50 series drivers (it wasn't forced in the 40s, so I used those most of the time I had the Geforce FX, unless a game I played required the 50s). Since the 6800GT was the new generation and not expensive for what it was, I got that to replace the 5900 Ultra. Unfortunately, 2 years later the HSF was clogged with dust, the card overheated, and it couldn't be used without graphical artifacts.

The stupid thing about the 6800GT was that the RGBA16FP back buffer didn't work with MSAA, and it still didn't have any 32-bit Z-buffer formats. It also lacked the w-buffer and the 2x RGSS AA mode that the Geforce FX had. AF was useless on it because the shimmering was horrible with AF on.
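For anyone wondering why the w-buffer keeps coming up: a perspective z-buffer stores depth non-linearly, packing most of its precision right next to the camera, while a w-buffer stores eye-space depth linearly. A minimal sketch (the near/far plane values are illustrative, not from any particular game):

```python
# Sketch of why some games wanted a w-buffer. A perspective z-buffer
# stores a non-linear depth value concentrated near the camera; a
# w-buffer stores eye-space depth linearly. near/far are illustrative.

near, far = 1.0, 10_000.0

def zbuf(z):
    """Normalized non-linear depth, as a perspective z-buffer stores it."""
    return (far / (far - near)) * (1.0 - near / z)

def wbuf(z):
    """Normalized linear depth, as a w-buffer stores it."""
    return (z - near) / (far - near)

# Depth resolution across a 1-unit step far from the camera:
print(zbuf(9000) - zbuf(8999))  # tiny step: almost no z-buffer bits left out here
print(wbuf(9000) - wbuf(8999))  # constant step everywhere in the range
```

With a 24-bit integer z-buffer, that shrinking far-field step is what causes z-fighting on distant geometry; a w-buffer trades some near-field precision for an even distribution.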

So I actually think the Geforce FX was probably the best nvidia product lineup before the 8 series. Of course, if all I cared about was performance, then the FX 5900 Ultra would not have satisfied me. But since I'm an IQ freak, I think it was great for the most part.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Am I the only one who thinks that R300 sucked?

Yes, you are. The Geforce FX series was a disaster for Nvidia, unlike the 6x00, 7x00, and 8x00 follow-ups they released later.

But it's entirely a moot discussion at this point. A 50 dollar video card today will get you more performance than any R300 or Geforce FX part.
 

JBT

Lifer
Nov 28, 2001
12,095
1
81
Wasn't the 9700 Pro double the performance of any of its competition for over a year with AA enabled?
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I agree with your self-assessment as an IQ freak. The NVIDIA FX series probably gave you a beautiful Myst experience. ;p
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I had a Radeon 9800 Pro and a GeForce 5900 and both had their strengths.

The Radeon did have an impressive feature set, but its strengths were the quality of its polygon edges and its raw performance, specifically with anti-aliasing and filtering combined.

The GeForce 5900 also had an impressive feature set; I did like its hybrid mixed AA modes and its image sharpening feature.
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
Let's instead discuss the merits of 256MB of RAM vs 512MB. :D

I'll start:
It's a total waste, people. 512MB is unnecessary. Name ONE game that uses >545MB of RAM at >51.97fps!! I dare you!!

:D
:p
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Neither. You bought the 9700 Pro (or an unlocked 9500 Pro) the year earlier and were happy with performance seriously double what NV had.

Any real gamer at that time knew the R300 smoked the GeForce FX entirely. If you cared about IQ, the R300 could enable AA/AF and keep most of its performance, while the GeForce FX tanked.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Am I the only one who thinks that R300 sucked?

:rolleyes:

FX5x00 series DX9 performance = COMPLETE FAIL.

[Half-Life 2 benchmark charts: 5514.png, 5515.png]

Source

FX5x00 series 8x AF performance = Poor

[8x AF benchmark charts: ut3.gif, halo2.gif, tron3.gif, tr3.gif, tr1-3.gif, aq3.gif]

Source
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
FX5x00 series performance in next generation "Shader intensive" games = Poor

[Benchmark charts: tron_1600_candy.gif, dx1sf_1600_pure.gif, dx2danger_1600_candy.gif, dx2escape_1600_candy.gif, farcry_1280_pure.gif, pkiller_1600_candy.gif, traod_1280_candy.gif]


Nvidia's FX5x00 series was arguably the worst series the company ever produced, in terms of both image quality and performance vs. the competition at the time, because its performance was often not adequate to enable the best graphical features in games or AA/AF filters.
 
Last edited:
Feb 19, 2009
10,457
10
76
Wow those graphs are full of lolz. I never realized the FX5900 was THAT BAD.

I owned a 9700 Pro and played BF2 with it, then an X800 XT PE playing BF2142, and avoided anything NV until the 8800GT/S (which were awesome).
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Wow those graphs are full of lolz. I never realized the FX5900 was THAT BAD.

Neither did I. Good thing my entry into PC gaming was in 2005, not 2003 or whenever the 5 series came out. However, I would rather have had an FX5900 than the Intel Extreme Graphics 2 I had to suffer with in my laptop in 2005! ;)
 

fralexandr

Platinum Member
Apr 26, 2007
2,244
188
106
www.flickr.com
I thought the main reason the FX couldn't handle Half-Life 2 was a color depth issue. If you decreased the color depth from 32 to 24-bit or something like that, performance improved to acceptable levels?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I thought the main reason the FX couldn't handle Half-Life 2 was a color depth issue. If you decreased the color depth from 32 to 24-bit or something like that, performance improved to acceptable levels?

NV used all kinds of excuses back then, but the more DX9 games came out, the worse their performance looked. Good thing the 6800 series was a good one to save face.

"Once DirectX 9 is enabled, GeForce FX cards took a significant performance hit in our testing. On the high-end cards, GeForce FX 5950 Ultra performance dropped by a factor of two once the DirectX 9 path was enabled (versus RADEON 9800 XT’s 10-27%)." ~ FiringSquad

Half-Life 2 Video Stress Test
[benchmark chart: aa1280.gif]


Counter-Strike Source
[benchmark chart: cs1280aa.gif]


Source

Imagine what it must have been like at the time to get 2x the framerate in Counter-Strike in DX9 on the 9800 series vs. the 5900 series. Even under the DX8.1 codepath, the 9800XT was 26% faster than the 5950... that's a big deal, because the 9800 actually achieved >60 fps in a game that was as popular then as the BF3 series is now! ^_^

Next time you see anyone making a reference to AMD pulling a 9700 Pro/9800 series on Nvidia, just shake your head, because never again in the history of GPU generations did ATI (now AMD) smoke NV's best card by 50-100% in the next generation of games. I doubt this will ever happen again.

The FX5800/5900 series were FAR bigger failures than the 2900XT or 3870. The only saving grace for nV at the time was those cards' good performance in flight sims and OpenGL games (AMD's OpenGL performance didn't get good until the HD5800 series, at which point AMD actually surpassed NV).
 
Last edited:

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
The FX series was a joke. Performance was pathetic at best, and it's quite possibly the worst series of GPUs ever released.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
i thought the main reason the fx couldn't handle half life 2 was a color depth issue. if you decreased the color depth from 32 to 24bit or something like that, performance improved to acceptable levels?
It was partly because the Geforce FX had half the pixel shader units of the 9700 Pro, and partly due to what you pointed out: if you decreased the pixel shader precision from 32-bit to 16-bit, performance was okay. But at least the Geforce FX gave the option of 32-bit pixel shader precision, even if performance at 32-bit was awful. ATi only offered 24-bit pixel shader precision, so of course their performance was better.
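To make the FP16 vs FP32 tradeoff concrete, here's a minimal sketch using numpy's float16/float32 types (numpy has no FP24 equivalent for R300's fixed precision, and the texture size is just an illustrative number, not vendor data):

```python
import numpy as np

# Sketch of why pixel shader precision mattered: FP16 ("partial
# precision" on the FX) vs FP32 (full precision). FP24, R300's fixed
# precision, sits between the two and has no numpy equivalent.

# Gap between 1.0 and the next representable value:
print(np.finfo(np.float16).eps)  # 2**-10, ~0.001 (about 3 decimal digits)
print(np.finfo(np.float32).eps)  # 2**-23, ~1.2e-7 (about 7 decimal digits)

# Consequence for shader math: at FP16, a texture coordinate scaled to
# a large texture can no longer address individual texels.
coord = 1024.5
print(np.float32(coord))  # stays 1024.5, exact
print(np.float16(coord))  # rounds to 1024.0; the .5 texel offset is lost
```

Losing sub-texel offsets like that is exactly the kind of thing that showed up as banding and filtering artifacts when FX drivers silently dropped shaders to partial precision.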

ATi didn't use full trilinear either (they used 5-bit precision for the mip blend fraction; nvidia used the full 8-bit), and at that time their OpenGL drivers weren't that great.
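The 5-bit vs 8-bit point is about how finely the blend between two mip levels can be stepped. A rough sketch (the texel values and LOD fraction are made up for illustration; this follows the bit counts claimed above, not any driver's actual code):

```python
# Sketch of trilinear mip blending with a quantized LOD fraction.
# Bit counts (5 vs 8) follow the claim in the post; values are illustrative.

def quantize(frac, bits):
    """Snap a 0..1 blend fraction to the nearest n-bit step."""
    steps = (1 << bits) - 1
    return round(frac * steps) / steps

def trilinear(texel_lo, texel_hi, lod_frac, bits):
    """Blend samples from two mip levels using an n-bit LOD fraction."""
    f = quantize(lod_frac, bits)
    return texel_lo * (1 - f) + texel_hi * f

# Same texels, same LOD fraction, different fraction precision:
print(trilinear(0.2, 0.8, 0.3337, 8))  # fine-grained blend, ~256 steps
print(trilinear(0.2, 0.8, 0.3337, 5))  # only 32 steps between mip levels
```

With only 32 blend steps, the transition between mip levels moves in coarser jumps, which is why reviewers at the time could spot banding in mip transition zones.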
 

GotNoRice

Senior member
Aug 14, 2000
329
5
81
Neither card is supported by the latest drivers, but with the 9800XT you at least have the option of using the 10.2 legacy drivers plus the latest CAPs on top.