Originally posted by: nemesismk2
It looks like I made the right choice in getting a 6800 GT: it gives me ATI-beating Doom 3 performance, and when the new 70-series drivers from nVidia are released, ATI-beating Half-Life 2 performance as well!
Originally posted by: PrayForDeath
Originally posted by: nemesismk2
It looks like I made the right choice in getting a 6800 GT: it gives me ATI-beating Doom 3 performance, and when the new 70-series drivers from nVidia are released, ATI-beating Half-Life 2 performance as well!
How can you say that when they haven't posted X800 benchies yet?
Pretty accurate summary . . . :roll:
Originally posted by: Dman877
Summary: Using DX9 in HL2 over DX8.1 on an FX halves your framerate, while ATI cards take a 15-20% hit.
Once DirectX 9 is enabled, GeForce FX cards took a significant performance hit in our testing. For the GeForce FX 5700 Ultra and 5600 Ultra, we witnessed performance declines of up to 2.5 times versus the DirectX 8.1 path in a couple of cases with the video stress test. In comparison, the RADEON 9600 XT's worst-case scenario was a performance decline of 23% at 1600x1200 with 4xAA and 8xAF. On the high-end cards, GeForce FX 5950 Ultra performance dropped by a factor of two once the DirectX 9 path was enabled (versus the RADEON 9800 XT's 10-27%). Essentially, enabling the DX9 path with GeForce FX cards knocks your frame rate in half in Valve's video stress test; the performance drop-offs are sometimes even worse for GeForce FX in Counter-Strike: Source beta. Just take a look at the trilinear benchmarks on page 7. It isn't pretty for GeForce FX at 1024x768 and 1280x1024 with the DX9 path enabled.
Originally posted by: apoppin
Pretty accurate summary . . . :roll:
Originally posted by: Dman877
Summary: Using DX9 in HL2 over DX8.1 on an FX halves your framerate, while ATI cards take a 15-20% hit.
Once DirectX 9 is enabled, GeForce FX cards took a significant performance hit in our testing. For the GeForce FX 5700 Ultra and 5600 Ultra, we witnessed performance declines of up to 2.5 times versus the DirectX 8.1 path in a couple of cases with the video stress test. In comparison, the RADEON 9600 XT's worst-case scenario was a performance decline of 23% at 1600x1200 with 4xAA and 8xAF. On the high-end cards, GeForce FX 5950 Ultra performance dropped by a factor of two once the DirectX 9 path was enabled (versus the RADEON 9800 XT's 10-27%). Essentially, enabling the DX9 path with GeForce FX cards knocks your frame rate in half in Valve's video stress test; the performance drop-offs are sometimes even worse for GeForce FX in Counter-Strike: Source beta. Just take a look at the trilinear benchmarks on page 7. It isn't pretty for GeForce FX at 1024x768 and 1280x1024 with the DX9 path enabled.
LOL! :laugh: Yep, that's gotta hurt if you shelled out for a 5950. D3 isn't exactly spectacular on the FX series, and HL2 just blows with the DX9 path. The last gen of ATi cards has definitely proven to have a better shelf life, IMO.
Originally posted by: Blastman
Heh, a 9600XT is a lot faster than a 5950 in DX9.
Originally posted by: Blastman
Heh, a 9600XT is a lot faster than a 5950 in DX9.
Originally posted by: Bumrush99
Very nasty results for the FX line.
Pretty clear that you are much better off with a 9800 PRO than a 5950 Ultra.
If you're looking to upgrade, I still think the 6800GT is the best route, especially since most of them OC to Ultra speeds.
Originally posted by: Rollo
I found that article disappointing and pointless; it's Shader Day all over again. Here's why:
1. I don't want to see a 6800GT compared to a 9800XT. I want to see the 6800/6800GT/6800U compared to the X800Pro/X800XT PE. That's the relevant comparison; no one is buying 5950s anymore with 6800NUs selling for under $300.
2. We all knew the nV3X series took it on the chin on HL2's straight DX9 path; we've known it for a year, since the game didn't come out the first time. I think it has to do with 32-bit precision being forced by the DX9 path, and that's going to be slower than 24-bit. (correct me if I'm wrong)
It's nice that ATI's last gen runs DX9 so well, but I think the real comparison here is on current-gen hardware. You can barely buy an nV35 anymore.
Note to the last person in the world who hasn't seen this horse beaten to death and reduced to subatomic particles:
Don't buy an nV3X card to play HL2 in DX9.
(if they ever release it)
Originally posted by: Rollo
2. We all knew the nV3X series took it on the chin on HL2's straight DX9 path; we've known it for a year, since the game didn't come out the first time. I think it has to do with 32-bit precision being forced by the DX9 path, and that's going to be slower than 24-bit. (correct me if I'm wrong)
Originally posted by: jrphoenix
Pretty humbling to see the 9800 run just about as fast, give or take, as my GT (especially since my rig is very similar to theirs, except I have an extra gig of RAM). I am going to get an Ultra in October to replace my GT... but still, one can only imagine how much faster the X800s will be. Of course, maybe my 16 pipes will pull away from the last-gen card at 1600x1200 (that's the native resolution on my LCD and what I game at).
Pretty interesting read!
