
GamePC: GeForce FX 5900 Ultra review

Not too bad; a step in the right direction, anyway. I'd like to see whether better drivers could make a big difference with this card.
 
I don't get it. How can the 5900 Ultra now be slower than the 9800 Pro in some benchmarks? The original reviews showed the 5900 Ultra to be far superior in speed...did ATI release a new driver?
 
Originally posted by: Malladine
I don't get it. How can the 5900 Ultra now be slower than the 9800 Pro in some benchmarks? The original reviews showed the 5900 Ultra to be far superior in speed...did ATI release a new driver?

There was a suggestion that NVIDIA cards perform much worse when non-standard, i.e. custom, timedemos are used (to make sure no one is optimizing for game benchmarks), and the 5900 performed much worse in custom timedemos than in the default ones.

Beyond3D discovered this problem first.

There is a thread about it in Video, in the NVIDIA 3DMark 2K1 thread.

And there are accusations of NVIDIA cheating in game benchmarks.

I don't know what to make of it unless we see more reviews to confirm this behavior,

but I see no reason not to trust Beyond3D's review.

 
As for cooling power, the eVGA / nVidia oversized heatsink system does a respectable job, as we noticed temperatures ranging from 59° to 65° Celsius throughout testing with high levels of anti-aliasing and anisotropic filtering enabled. The heatsink fan will spin itself down to low speed mode once the GPU temperature cools down to under 40° Celsius. We found that after a nice long round of gaming on the GeForceFX 5900 Ultra, the cooling system takes anywhere from 3 to 5 minutes to get the GPU temperature back into normal levels.

Is gettin' hot hot hot! I am surprised the 5900 Ultra loses to a lower-clocked card in most of the benchmarks. NVIDIA had better pull a rabbit out of the hat with the NV4x.
 
59 to 65 is fine for a graphics card these days, from what I've heard. They have higher tolerances than CPUs, and CPUs even run at those temps.
 
I'm rather hoping to see the 256-bit NV35 core (is that right? 5900 = NV35?) trickle down to the 5200/5600 level of cards. That would be interesting to see...
 
Yeah, but I saw benchmarks somewhere on here for Doom III comparing the two, and the 5900 f-ed the Radeon up the ass. I think it's just geared more towards next-gen gaming (D3, HL2, DX2).
 
Originally posted by: CastorTr0y
Yeah, but I saw benchmarks somewhere on here for Doom III comparing the two, and the 5900 f-ed the Radeon up the ass. I think it's just geared more towards next-gen gaming (D3, HL2, DX2).

That's because id Software reworked the Doom III demo to work best with the NV35 just for that press release.
 
Originally posted by: CastorTr0y
Yeah, but I saw benchmarks somewhere on here for Doom III comparing the two, and the 5900 f-ed the Radeon up the ass. I think it's just geared more towards next-gen gaming (D3, HL2, DX2).

You're probably right, but remember that nVidia specifically tweaks performance on what they think will be the biggest money-making games 🙂. Nice strategy... but I doubt those scores will be achieved in reality when the two cards are compared.
 
Originally posted by: CastorTr0y
Yeah, but I saw benchmarks somewhere on here for Doom III comparing the two, and the 5900 f-ed the Radeon up the ass. I think it's just geared more towards next-gen gaming (D3, HL2, DX2).

I simply find this hard to believe, since the DX9 features have time and time again proven to be MUCH slower on the FX cards than on the Radeon cards...especially the shaders. I don't see the FX cards keeping up with the Radeons in next-gen games nearly as well as they're keeping up now (which is all they seem to be doing...keeping up). Doom III may certainly be an exception to this, though...since NVIDIA is paying $5 million (at least, that's the last figure I heard) to advertise with it, and likely to develop with it.

It seems to me, however, that the new DX9 features are, in fact, the Radeon cards' strongest suit against the FX series.
 
Originally posted by: GTaudiophile
Originally posted by: CastorTr0y
Yeah, but I saw benchmarks somewhere on here for Doom III comparing the two, and the 5900 f-ed the Radeon up the ass. I think it's just geared more towards next-gen gaming (D3, HL2, DX2).

That's because id Software reworked the Doom III demo to work best with the NV35 just for that press release.


I don't know about that. The stencil shadow culling technique present in the NV35 core, coupled with the stencil and z-fill optimizations on the card, makes those Doom III benchmarks plausible*. I highly doubt Carmack would optimize this just for a press release. His track record is to make generally unbiased statements about video cards, not to lie about them for money.

*I write in OpenGL for a living, so I've done a fair amount of technical research into stencil shadows. If you want to argue about it, feel free 🙂


EDIT: The rest of AT's NV35 benchmarks are damn shady, though.
 