
Doom 3 Benchmarks at [H]

Someone said that the game as a whole would play better than this demo. That is both true and false. Although they did say this demo was especially hard on the graphics cards, it was not hard on the CPU. No AI, no physics, just a replay. He stated that too. CPU down = graphics down. So this might actually reflect true results once both are factored in.
 
Originally posted by: Safeway
Someone said that the game as a whole would play better than this demo. That is both true and false. Although they did say this demo was especially hard on the graphics cards, it was not hard on the CPU. No AI, no physics, just a replay. He stated that too. CPU down = graphics down. So this might actually reflect true results once both are factored in.

or at least until the cpu is no longer a bottleneck. 😉
 
I guess I'll just have to suffer with a lower level of graphics using my 9700. There is no way I'm buying a $500 video card to play a $59 game. I doubt I will be standing in one spot during the entire game to take in the beauty of my surroundings anyway :shocked:
 
I'm not sure this is correct but I'm sure you guys will let me know.

DOOM3 is based on OpenGL, an API where NVIDIA has completely ruled over ATI and every other card.

The next game that comes out that uses the D3D API will have ATI cards spanking NVIDIA, and vice versa.

I chose the NVIDIA BFG 6800GT because of NVIDIA's hardware encoder for digital video editing.

I can see it now, everyone is going to run out and buy a NVIDIA 6800GT and the next game that is benchmarked running a different API will have the ATI cards in the lead.

I miss the simple days of using a 32MB NVIDIA TNT with two 12MB Voodoo 2s in SLI pass-through for OpenGL and Glide.
 
ATI no longer spanks NVIDIA in D3D. The new GeForce 6 series actually wins a lot of the time.

As for that hardware encoder, no one really has an ETA on when the drivers that enable it will arrive.

Gosh, Voodoo 2s were the sh!t back then. I remember the special 3dfx Glide modes that really made the games look nice.

-Kevin
 
You miss the simple days, Curley? Excuse me, but how does having 3 graphics cards make life simpler? 😀
Note: Voodoo2 SLI is awesome :thumbsup:
 
Pretty amazing results for NVIDIA. I was expecting to see something like this, but the size of the difference still surprised me.
Looks like my 9800 Pro is doing just fine, as I expected. Can't say the same for the X800XT and Pro.
Well, I got a very nice deal for my 9800 Pro and I already ordered my 6800GT. 😀
It will be here August 3-5.
 
Originally posted by: Cat
I'm not denying that the driver is doing something; I'm just correcting what you posted. I'm more concerned about the spread of misinformation than who looks good or bad.

Well, what I was posting about the lighting and physics engine was based more on speculation and the little bit of information that has leaked out so far about the engine, not on fact. If anyone wants to post absolute fact on those issues, I'm all ears. 🙂 When the game is out, we'll all see, though.

Originally posted by: Cat
The Catalyst drivers are bigger than the Forceware drivers; make what you will of that.
Forceware 61.76: 12 MB
EDIT: International version is 20 or so MB.
Catalyst 4.8: 28 MB

FWIW, that's the "full" Catalyst release, including their media-player software and a ton of other stuff (inc. internationalization). The actual driver-only binaries are only around 6-7MB. So NV's drivers do weigh in at nearly twice the size of ATI's.
 
Originally posted by: Curley
DOOM3 is based on OPENGL which NVIDIA has completely ruled over ATI or any other card.
Speaking of which, that's a nice segue for me to post

this link

It's a benchmark of a professional 3D authoring application, using a bunch of "workstation-class" cards, both NV and ATI, with a 3DLabs Wildcat thrown in besides, along with Radeon 9700 Pro and 9800 Pro "gaming cards". The benchmark results were "interesting", to say the least.

Now, it may be possible that the benchmarking methodology was flawed, since this seemed to be the first benchmark done with that app by the site.

But to see a Quadro2 MXR (which, correct me if I'm wrong, is based on the same GPU as the GF2 MX), absolutely STOMP a Radeon 9800 Pro... well, that's something.

It also mentions drivers. The "professional" 3D cards were tested with a different driver set than the "gaming" cards.

So here's my question - would people see a much bigger performance increase in OpenGL-based games like Doom3 if they modded their Radeon cards into FireGL boards and used that driver set (intended for professional OpenGL applications) instead? Something to ponder. I know that people seem to think that "ATI sucks at OpenGL", but that's not entirely true. ATI bought out some other company, and one of their assets was their tweaked OpenGL driver set. However, that is still offered separately, AFAIK, and only for their "workstation-class" cards like the FireGL series, even though the GPU in question is basically the same as in the "gaming" Radeon cards. Whereas NV's driver set (I think) is "integrated": both their professional OpenGL and consumer DirectX drivers are all in one package. (Again, I think... hmm, that could explain why NV's driver set is so big compared to ATI's then. Maybe I solved that mystery. 🙂 )
 
The workstation drivers typically accelerate things you don't see in games: anti-aliased lines, wireframe, and two-sided/fixed-function lighting.

I haven't used our ancient Quadros in some time, but the NVidia website has a workstation category for their drivers. It looks like they're completely separate. I've noticed that the workstation drivers are typically far behind the gaming drivers, so I've never had a chance to benchmark the two.

You may be the most reasonable/personable poster in Video. 🙂
 
Originally posted by: Cat
The workstation drivers typically accelerate things you don't see in games: anti-aliased lines, wireframe, and two-sided/fixed-function lighting.

I haven't used our ancient Quadros in some time, but the NVidia website has a workstation category for their drivers. It looks like they're completely separate. I've noticed that the workstation drivers are typically far behind the gaming drivers, so I've never had a chance to benchmark the two.

You may be the most reasonable/personable poster in Video. 🙂
I've been enjoying both of your posts of late. I get an opportunity to actually learn something beyond who has the biggest self-esteem issues in the video forum, so thanks to you both :beer: You coders are still a weird bunch though 😛
 
Originally posted by: bcoupland
You miss the simple days, Curley? Excuse me, but how does having 3 graphics cards make life simpler? 😀
Note: Voodoo2 SLI is awesome :thumbsup:


I was joking.
 
Originally posted by: VirtualLarry
So here's my question - would people see a much bigger performance increase in OpenGL-based games like Doom3, if they modded their Radeon cards into FireGL boards, and used that driver set (intended for professional OpenGL applications) instead?

That's a very interesting question. In my experience, when I modded my R9700PRO into a FireGL card, QuakeIII actually ran *slower*. On top of that, many of the driver options found in the desktop drivers were missing. I don't think I had access to AF or any of the texture quality sliders.

I have yet to find a way to mod the 9700PRO so that I can use the latest FireGL driver set; I always get stuck using drivers that are about a year old, which makes them almost useless in gaming situations.

There is a chance that the FireGL drivers will make the ATi cards much faster, but there is an even greater chance that they will leave people unable to even play DoomIII. On top of all this, nVidia's workstation drivers are miles ahead of ATi's; a GF4 will easily beat a FireGL X1 in just about every workstation benchmark, sometimes by a huge margin.
 
Any idea when graphics cards will be available that can handle this game easily?

None exist yet, according to those benchmarks.

With past Carmack games... it's usually the generation of cards following a new engine release.
 
I don't know, I think all the cards 9800 Pro and up did pretty well. At 1024, they are all up over 40fps. The next generation is probably going to be souped-up cards improving performance by maybe 25-50%. I think it'll take some new architecture to pump out 60fps minimums with maximum everything (1600, 16xAF, 8xAA, high-quality textures).
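The 25-50% generational estimate above works out to roughly this (a minimal arithmetic sketch, assuming the ~40fps baseline from the 1024 benchmarks mentioned above; the numbers are illustrative, not measured results):

```python
# Illustrative projection only: apply the estimated 25-50% generational
# uplift to the ~40 fps baseline observed at 1024x768.
baseline_fps = 40.0
low_uplift, high_uplift = 0.25, 0.50

projected_low = baseline_fps * (1 + low_uplift)    # pessimistic case
projected_high = baseline_fps * (1 + high_uplift)  # optimistic case

print(f"Projected next-gen range: {projected_low:.0f}-{projected_high:.0f} fps")
```

Even the optimistic end of that range only reaches 60fps on average, which is the gist of the argument that 60fps *minimums* with everything maxed would take a new architecture rather than a souped-up refresh.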
 
Originally posted by: gururu
I don't know, I think all the cards 9800 Pro and up did pretty well. At 1024, they are all up over 40fps. The next generation is probably going to be souped-up cards improving performance by maybe 25-50%. I think it'll take some new architecture to pump out 60fps minimums with maximum everything (1600, 16xAF, 8xAA, high-quality textures).


I wouldn't call what we've seen so far happy news for most of us. Those fps were pretty low on the 9800XTs and 5950s. When you consider they might be lower still on 128MB 5900/9800s, or even 128MB 6800s, I'd say you may need a current-gen card to play this well.

As I said, it will be interesting to see if the 12 pipes and/or 128MB put the smack down on my 6800NU. Benches forthcoming.
 
Originally posted by: Ferocious
Any idea when graphics cards will be available that can handle this game easily?

None exist yet, according to those benchmarks.

With past Carmack games... it's usually the generation of cards following a new engine release.

I would say R500 and NV50 should completely manhandle DoomIII, but obviously that's just speculation on my part. I'm waiting for that generation and I feel the latest cards are insufficient for the next round of games. My next monitor is going to be a Dell 2001FP, so I need something that can do 1600x1200 in all the games I'm going to play. Even the 1280x1024 of my current screen would be a bit of a stretch for a 6800U in DoomIII, let alone HL2.
 
Originally posted by: Ferocious
Any idea when graphics cards will be available that can handle this game easily?

None exist yet, according to those benchmarks.

With past Carmack games... it's usually the generation of cards following a new engine release.
Depends on your def of "handle" 😛

I say the 9800XT and 5950U do a FINE job at 10x7 HiQ with no AA/AF, and a fair job (by extrapolation) at 10x7 HiQ with 2xAA/4xAF (barely acceptable with 4xAA/8xAF) . . . from HardOCP's benches . . . remember, the demo is generally MORE demanding than the game. 😉

If you want to max everything at 16x12 "now", you're gonna need two 6800Us in SLI, or wait till NV50/R500.

😉

I am quite happy at 10x7 . . . sure, I'd like 11x8 . . .

:roll:
 
in 1 demo that is extremely demanding. more so than the game.
Nobody cares whether something is running at 200 FPS or 300 FPS. The sections where performance drops are what people want to know about.

mr c clearly stated the game will play better than the demo runs.
Marginally, I'd expect.

prolly never dips below 30fps.
30 FPS is a slideshow.

So NV's drivers do weigh in at nearly twice the size of ATIs.
ATi's drivers don't include a control panel, though, unlike nVidia's. Add another 12 MB for the control panel.
 
Originally posted by: SickBeast
Originally posted by: Ferocious
Any idea when graphics cards will be available that can handle this game easily?

None exist yet, according to those benchmarks.

With past Carmack games... it's usually the generation of cards following a new engine release.

I would say R500 and NV50 should completely manhandle DoomIII, but obviously that's just speculation on my part. I'm waiting for that generation and I feel the latest cards are insufficient for the next round of games. My next monitor is going to be a Dell 2001FP, so I need something that can do 1600x1200 in all the games I'm going to play. Even the 1280x1024 of my current screen would be a bit of a stretch for a 6800U in DoomIII, let alone HL2.

That seems to be more or less the case.
If you want 16x12 with all the eye candy on, then you'll have to wait for NV50/R500.
On the one hand this is good, because we expect quality games, after all.
But then again, I wonder what's going on. If you can't max out a game that has been worked on for this long, then things can only get worse as time passes.
 
The engine, much like the Quake 3 engine, is designed to be useful for a long time into the future.

I don't know why everyone is surprised that you can't run it at max settings; in a year and a half you'll be playing games based on the same engine at double or triple what you're getting now... 6800s will be like GeForce 3s.
 