Anyone know if there are NEWER DOOM III benches?

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
Just wondering if there are some newer benchmarks on DOOM ]|[ with newer drivers from ATI. From what I remember of the last major Doom benchmark, ATI's scores were kinda low due to drivers; I wonder if they're any better with the newer drivers? Heh, looking back at the old drivers, ATI still scores well compared to how Nvidia scored on HL2. I think I'm gonna be buying a 9800 Pro within the next couple months. Either that or wait it out, but this GF MX440's time is due.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I haven't seen much in months regarding Doom3 benchies. The last ones I saw had the 5900 Ultra beating the 9800 Pro pretty handily, something like 60 FPS vs. 35 FPS on the 9800 Pro.

 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
Yeah, there was a big difference, just like there is now with HL2. So like I've said before, it's almost as if they each picked a game and optimized their cards for it. Nvidia has a 20-30 fps lead on the ATI cards in the Doom 3 benches, but in HL2 it's the complete opposite.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Doom 3 isn't even set to launch until 2004. I wouldn't think performance of the D3 Alpha would be too indicative of final game performance with all goodies on. Just like HL2, take results with a grain of salt - there's still a ton of wiggle room until the final product is out.
 

Robor

Elite Member
Oct 9, 1999
16,979
0
76
Originally posted by: jiffylube1024
Doom 3 isn't even set to launch until 2004. I wouldn't think performance of the D3 Alpha would be too indicative of final game performance with all goodies on. Just like HL2, take results with a grain of salt - there's still a ton of wiggle room until the final product is out.
Plus there's a chance Doom3 could slip, and we could have a new generation of cards from both manufacturers between now and then. If the HL2 benchmarks are meaningless, the D3 benchmarks are beyond mention at this point.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Originally posted by: batmang
Yeah, there was a big difference, just like there is now with HL2. So like I've said before, it's almost as if they each picked a game and optimized their cards for it. Nvidia has a 20-30 fps lead on the ATI cards in the Doom 3 benches, but in HL2 it's the complete opposite.
If you read the "Carmack" thread in General Hardware and the links there, you'll see the nVidia lead is because Carmack's design for the D3 engine mostly uses the low-precision shaders that the nV cards are fast at; he didn't need the 24/32-bit shaders that HL2 uses, which cripple the nV cards. In other words, the D3 engine (while OpenGL) is more like a DX8 engine than a DX9 engine.

He did agree with Gabe/Valve, though, that future engines will need the high-precision shaders, and that nV cards are slow for DX9 games in general, not just HL2.
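To make the precision point concrete, here's a small Python sketch. FP16, the format the NV3x partial-precision path favors, has only 10 mantissa bits, so values that higher precisions keep distinct collapse together (FP32 stands in for ATi's FP24 here, since Python can't represent FP24 directly):

```python
import struct

def roundtrip_half(x: float) -> float:
    """Round a Python float through IEEE 754 half precision (FP16)."""
    return struct.unpack('e', struct.pack('e', x))[0]

def roundtrip_single(x: float) -> float:
    """Round a Python float through single precision (FP32)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Near 1.0, the smallest FP16 step is 2**-10 (10 mantissa bits).
print(roundtrip_half(1.0 + 2**-10))    # -> 1.0009765625 (representable, kept)
print(roundtrip_half(1.0 + 2**-11))    # -> 1.0 (below FP16 precision, rounded away)
print(roundtrip_single(1.0 + 2**-11))  # -> 1.00048828125 (FP32 keeps it)
```

In a shader this kind of rounding shows up as banding or lost detail when the math actually needs the extra bits, which is why HL2's image quality suffers at low precision while D3's, by Carmack's account, does not.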
 

Vadatajs

Diamond Member
Aug 28, 2001
3,475
0
0
Originally posted by: Genx87
I haven't seen much in months regarding Doom3 benchies. The last ones I saw had the 5900 Ultra beating the 9800 Pro pretty handily, something like 60 FPS vs. 35 FPS on the 9800 Pro.

It's a shame that the 5900 had to run in reduced precision to do it. The spread would be the other way if they both ran the same ARB2 path (which is the way it should be).
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Originally posted by: Vadatajs
Originally posted by: Genx87
I haven't seen much in months regarding Doom3 benchies. The last ones I saw had the 5900 Ultra beating the 9800 Pro pretty handily, something like 60 FPS vs. 35 FPS on the 9800 Pro.

It's a shame that the 5900 had to run in reduced precision to do it. The spread would be the other way if they both ran the same ARB2 path (which is the way it should be).
I like a good nV-bashing as much as the next guy, but my reading of the Carmack .plan files (old and new) is that the D3 engine does not need 24/32-bit shaders to run the way he intended it to. The HL2 engine's image quality is degraded with low-precision shaders; the D3 engine's is not. So nV owners just get faster performance, not badly reduced image quality like in HL2.
 

Richdog

Golden Member
Feb 10, 2003
1,658
0
0
Originally posted by: DaveSimmons
Originally posted by: Vadatajs
Originally posted by: Genx87
I haven't seen much in months regarding Doom3 benchies. The last ones I saw had the 5900 Ultra beating the 9800 Pro pretty handily, something like 60 FPS vs. 35 FPS on the 9800 Pro.

It's a shame that the 5900 had to run in reduced precision to do it. The spread would be the other way if they both ran the same ARB2 path (which is the way it should be).
I like a good nV-bashing as much as the next guy, but my reading of the Carmack .plan files (old and new) is that the D3 engine does not need 24/32-bit shaders to run the way he intended it to. The HL2 engine's image quality is degraded with low-precision shaders; the D3 engine's is not. So nV owners just get faster performance, not badly reduced image quality like in HL2.


But the point is, Doom III notwithstanding: why should a video card that costs $350-$400 and touts itself as the forefront of graphics technology need games optimized for it, instead of running acceptably with the 24/32-bit shaders that are the default features and specs of DX9? Who cares if Doom III doesn't need these more advanced shaders to function properly or look its best (which I have trouble believing in any case)? It's just one game, and the other 1,000 that do need them are going to look decidedly worse on an Nvidia card. This has all gone beyond simple ATI/Nvidia fanaticism, which would be understandable if both cards performed equally well and were stiff competition for each other performance-wise. But the fact is, running the games it was supposedly "designed and built for", i.e. in 24/32-bit and using full DX9 features, the FX series moves like an athlete with one leg. There is no more speculation where FX performance is concerned, and no more secrets; the facts have been laid bare for all to see, and as far as I'm concerned there can be NO excuses. :beer:
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Robor
Originally posted by: jiffylube1024
Doom 3 isn't even set to launch until 2004. I wouldn't think performance of the D3 Alpha would be too indicative of final game performance with all goodies on. Just like HL2, take results with a grain of salt - there's still a ton of wiggle room until the final product is out.
Plus there's a chance Doom3 could slip, and we could have a new generation of cards from both manufacturers between now and then. If the HL2 benchmarks are meaningless, the D3 benchmarks are beyond mention at this point.

I agree with you there - HL2 benchmarks should be closer to the real thing, seeing as it is supposed to be released within the next few months (or, if you believe Valve, September 30th). Doom3 is a ways off...
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
http://english.bonusweb.cz/interviews/carmackgfx.html


Hi John,

No doubt you heard about GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative for future DX9 games (including Doom III) or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?

Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

John Carmack
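The path names ARB2 and NV30 are real (they come up in Carmack's .plan files); the selection logic below is a simplified, hypothetical Python sketch of the "custom back end" idea he describes, not engine code. The generic ARB2 path runs everywhere at full precision, while the NV30 path trades precision for speed on GeForce FX hardware:

```python
# Hypothetical sketch: how an engine might pick a vendor-specific back end.
# Function name and matching logic are invented for illustration.
def pick_render_path(renderer: str, allow_vendor_paths: bool = True) -> str:
    renderer = renderer.lower()
    if allow_vendor_paths and "geforce fx" in renderer:
        return "NV30"   # partial-precision fragment programs, faster on NV3x
    if "radeon 9" in renderer or "geforce fx" in renderer:
        return "ARB2"   # standard full-precision ARB fragment programs
    return "NV20"       # fallback DX8-class path for older hardware

print(pick_render_path("GeForce FX 5900 Ultra"))         # -> NV30
print(pick_render_path("Radeon 9800 Pro"))               # -> ARB2
print(pick_render_path("GeForce FX 5900 Ultra", False))  # -> ARB2 (slower on FX)
```

The third call is exactly the comparison Carmack is making: force the FX onto the standard path "just like ATI" and it loses its speed advantage.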
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: batmang
yeah, there was a big difference just like there is now with HL2. So like ive stated before, its almost like they each picked a game and optimized their cards for each one. Nvidia has a 20-30 fps lead on the ATI cards in the Doom 3 benches, but in HL2 is the complete opposite.

That's basically the reason for the discrepancy in scores. nVidia added some specific hardware for Doom 3, and Carmack coded for it with his NV3x path. Valve coded "to" DX9, and the DX9 spec is based on ATi's DX9 hardware. Carmack also coded to ARB2, the OpenGL "equivalent" of DX9, and ATi runs that path faster too because, funnily enough, ARB2 is also geared toward hardware similar to ATi's R3x0 VPUs.

Anyway, I really didn't expect ATI to perform that much slower than the FX line in D3. I mean, the 5900 has an 8x0 architecture for D3's shadows (4x2 otherwise), and the 9800 is 8x1 all the time; you'd think they'd perform about the same. But remember, the FX line is still clocked a lot higher than ATi's DX9 line. So while they may have the same "fillrate," nV will be faster in games that are written to its hardware.
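As a back-of-the-envelope check on that fillrate point (the clocks are the commonly cited retail values, and the function is mine, for illustration):

```python
def fillrate_mpix(clock_mhz: int, pixels_per_clock: int) -> int:
    """Theoretical fillrate in megapixels/second: core clock x pixels per clock."""
    return clock_mhz * pixels_per_clock

# GeForce FX 5900 Ultra: ~450 MHz core; Radeon 9800 Pro: ~380 MHz core.
print(fillrate_mpix(450, 8))  # 5900 Ultra in its 8x0 z/stencil mode: 3600
print(fillrate_mpix(450, 4))  # 5900 Ultra drawing color (4x2 mode):  1800
print(fillrate_mpix(380, 8))  # 9800 Pro, 8x1 all the time:           3040
```

So even with the same pipe count in the stencil-shadow case, the clock gap alone puts the 5900 Ultra ahead on paper, before any path-specific optimization; real throughput of course depends on far more than this one number.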
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I mean, the 5900 has an 8x0 architecture for D3's shadows (4x2 otherwise), and the 9800 is 8x1 all the time; you'd think they'd perform about the same.

I see it the same way, but do you know if anyone has figured out whether the NV35 can work in 4x2+4x0 mode? As for the relative performance gap in Doom3, I imagine it isn't going to change considerably: although ATi can optimize their drivers, to the best of my knowledge Carmack still hadn't implemented US support in D3 when we saw the last round of benches.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I'd just like to see the HL2 results with the Detonator 5x's, since nVidia says that's the set they've been working on for HL2.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Originally posted by: Jeff7181
I'd just like to see the HL2 results with the Detonator 5x's, since nVidia says that's the set they've been working on for HL2.
Read the Aquamark article on the front page of Tom's Hardware and you'll see that their "work" has a curious "bug" that greatly increases speed while (surprise!) decreasing image quality (completely dropping parts of the scene).

Also read any of the Eidos/Tomb Raider and Carmack threads (here, General Hardware, other forums) to see that DX9 performance is just bad for nV cards, that it's not specific to HL2.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Jeff7181
I'd just like to see the HL2 results with the Detonator 5x's, since nVidia says that's the set they've been working on for HL2.

They will be extremely high since nVidia dropped the IQ so low on those drivers (removing specific effects) that it's pointless.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: BenSkywalker
I see it the same way, but do you know if anyone has figured out whether the NV35 can work in 4x2+4x0 mode? As for the relative performance gap in Doom3, I imagine it isn't going to change considerably: although ATi can optimize their drivers, to the best of my knowledge Carmack still hadn't implemented US support in D3 when we saw the last round of benches.

No clue, Ben. :) I wasn't even aware of a 4x2+4x0 mode. This B3D thread is interesting, though slightly over my head ATM: NV30,35 & R300/R350 Pixel Shader Pipes Compared (New inf
 

TVRFan

Junior Member
Sep 20, 2003
11
0
0
Remember, Doom 3 has the nVidia "The Way It's Meant To Be Played" logo and HL2 has the ATI logo on it. It's just like the GTA games' deal with Sony, but now graphics card makers are getting into the mix: some games will carry nVidia's logo and be tuned to run better on nVidia cards, while other games will carry the ATI logo and run better on ATI than on nVidia.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: TVRFan
Remember, Doom 3 has the nVidia "The Way It's Meant To Be Played" logo and HL2 has the ATI logo on it. It's just like the GTA games' deal with Sony, but now graphics card makers are getting into the mix: some games will carry nVidia's logo and be tuned to run better on nVidia cards, while other games will carry the ATI logo and run better on ATI than on nVidia.

Not really. TR: AoD has an nVidia logo, but it plays much better on ATi hardware. UT2K3 also has the TWIMTBP logo, and it plays about the same on both ATi and nVidia. Those logos are mainly marketing alliances with the producers and generally don't affect developers that much (other than maybe providing devs with free development hardware). I mean, I don't think a developer would purposefully make one brand slower than another, as it would hurt the developer's sales as much as it might increase the hardware maker's.