ID recommends GeForce FX for Doom III


merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: reever
Carmack says that there is no discernible quality difference between FP16, FP24 and FP32 for DooM3. You have to say Carmack is wrong, or say that ATi is faster running medium quality than nVidia is running high quality, to validate the above statement.

And do you know for a fact whether what Carmack said is right/wrong?

LOL. Best post in video forums EVAR.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
My understanding is the FX series was designed well for Doom3.
Stencil ops are apparently very big in Doom3, and the FX series runs circles around the R3.xx cards in this.

Also, there are some nice custom extensions the NV3.x cards can run that Carmack wanted. People say he wrote the backend for the NV3.x cards because he had to. My impression from reading places like Beyond3D's message board is he wrote the backend because the NV3.x provided everything he wanted.

We have known for some time the FX series would probably be faster than the R3.xx in Doom3. I have heard rumors ranging from a reversal of the HL2 benchies from Shader Day to a 9800 Pro barely beating a 5200 Ultra.

Only time will tell on this. If the NV40 turns out to be as powerful as the rumors say, it could be very skewed Nvidia's way.

And do you know for a fact whether what Carmack said is right/wrong?

ROTFLMAO! I guess we will just have to assume Carmack knows what he is talking about on this. After all, he is the one writing the game and has hands-on time with it every day.
 

ScrewFace

Banned
Sep 21, 2002
3,812
0
0
I think even my AiW Radeon 9700 Pro is gonna be a lot faster than that garbage GeForce FX 5200 in ANY game, including Doom III. C'mon now, maybe nVidia is a little faster in OpenGL, but the 9700/9800 will run Doom 3 with all the eye candy turned on just fine. :beer::)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Ben, I mean that ATi can process just as many colorless pixels (zixels) as nVidia per clock (eight in the high end, four in the midrange).

But they lack US, meaning they would have to do more stencil ops than nVidia to start with (why I stuck to the NV35); then they would have to deal with clock frequency gaps, and there is another factor also. While nVidia has their 4x0 pipes working on stencil ops, none of the pipes with shader units are tied up. For each pipe ATi is using for stencil ops, their shader performance drops by 12.5% (versus the baseline of 100%, of course). As a generalized example (obviously not quite this simple), if ATi is using half their pixel pipes for stencil ops, they then have their MTexel and MPixel rates cut in half, along with their shader performance cut in half. nV can dedicate four pixel pipes to stencil ops and take no hit for any of the previously mentioned.

nVidia should be faster in Doom3, and by a decent margin also. All of the talk of nVidia's 'big mistake' in the NV3x architecture should see a bit of re-examination if their parts perform as they should in D3 compared to the competition. If D3 is followed relatively closely by Quake4 (which we know has been in development for some time), D3-powered games could be neck and neck with PS2.0-heavy games by the end of the year ;)
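The pipe-allocation arithmetic in the post above can be sketched as a toy model. This is my own simplification with hypothetical pipe counts, not how either scheduler actually works:

```python
# Back-of-envelope model of the shared-pipe vs dedicated-unit argument.
# All numbers are illustrative, not real NV35/R3xx scheduling behavior.

def shader_throughput(total_pipes, stencil_pipes, dedicated_zixel_pipes=0):
    """Fraction of peak shader throughput left while stencil work runs.

    If stencil ops run on dedicated units (the NV35 claim above), shader
    pipes are untouched; if they share the pixel pipes (the R3xx claim),
    each pipe spent on stencil work costs 1/total_pipes of throughput.
    """
    # Stencil work beyond the dedicated units falls back on shader pipes.
    borrowed = max(0, stencil_pipes - dedicated_zixel_pipes)
    return (total_pipes - borrowed) / total_pipes

# Shared pipes: 8 pipes, 4 doing stencil -> half the shader rate left.
print(shader_throughput(8, 4))                            # 0.5
# Dedicated units absorb the same stencil load -> no shader hit at all.
print(shader_throughput(4, 4, dedicated_zixel_pipes=4))   # 1.0
# One shared pipe on stencil ops costs 1/8 = 12.5%, matching the post.
print(1 - shader_throughput(8, 1))                        # 0.125
```

The 12.5%-per-pipe figure in the post is just 1/8 of an eight-pipe part; the model reproduces it directly.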

And do you know for a fact whether what Carmack said is right/wrong?

*blink*
 

Alptraum

Golden Member
Sep 18, 2002
1,078
0
0
Originally posted by: ScrewFace
I think even my AiW Radeon 9700 Pro is gonna be a lot faster than that garbage GeForce FX 5200 in ANY game, including Doom III. C'mon now, maybe nVidia is a little faster in OpenGL, but the 9700/9800 will run Doom 3 with all the eye candy turned on just fine. :beer::)

Lmao, we don't know that. A 9800 Pro could run it great, or not so great. Same for any current nVidia card. It's all speculation until it comes out. Though from what I have seen/heard, nVidia will do a better job with the game than ATI. But until the game comes out it's all moot, and saying things like
The 9700/9800 will run Doom 3 with all eye candy turned on just fine.
is pretty pointless. You have no idea, and neither do I.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: keysplayr2003
Originally posted by: Insomniak
Originally posted by: Rollo
Id Software has always been fans of fat sacks of cash money.
Who isn't?

Exactly. This one has "marketing" written all over it, people. Don't get me wrong - all the benchies seem to show that GFFX cards ARE better for Doom 3, but this is still about financial gain, not educating the customer. You don't see "Nvidia Recommends ATI hardware to run the Dawn Demo" stickers around for some odd reason...

You are putting way too much effort and emotion into this. Put your picket signs away please, and whatever you do, don't go on a hunger strike.

What? Where exactly have you seen "effort and emotion?" As I recall, I was "stating the established facts". God knows, we can't have that.
:rolleyes:
 

g3pro

Senior member
Jan 15, 2004
404
0
0
I thought it was 'is, ea, id', as in the Latin. :D

(3rd person pronoun, for the unlearned :p)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Insomniak
Originally posted by: keysplayr2003
Originally posted by: Insomniak
Originally posted by: Rollo
Id Software has always been fans of fat sacks of cash money.
Who isn't?

Exactly. This one has "marketing" written all over it, people. Don't get me wrong - all the benchies seem to show that GFFX cards ARE better for Doom 3, but this is still about financial gain, not educating the customer. You don't see "Nvidia Recommends ATI hardware to run the Dawn Demo" stickers around for some odd reason...

You are putting way too much effort and emotion into this. Put your picket signs away please, and whatever you do, don't go on a hunger strike.

What? Where exactly have you seen "effort and emotion?" As I recall, I was "stating the established facts". God knows, we can't have that.
:rolleyes:

No such thing as "facts" anymore. All you're posting is someone else's perception of a fact. We can debate 'til the end of time, but how boring would that be?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
If nothing else, this thread was worth reading for Reever's question as to whether or not we can trust this "Carmack" fellow to be an accurate source of Doom3 information.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: BenSkywalker
The FX's had to have their own path written.

Not quite. nVidia gave Carmack all the explicit features and extensions he wanted for DooM3, and he used them. Carmack has been talking about the DooM3 engine since back in the GeForce1 days, and you can go check his quotes from then yourself. He wanted 64-bit color (AKA FP16), he wanted really fast register combiner ops, and he wanted loads of stencil op power: the FX architecture with nV's proprietary extensions.

When playing in the standard ARB path, the Radeons are much faster.

When playing the ARB2 path, the big difference for the nV parts in terms of performance is that they are running in FP32 vs FP16 (talking about the NV35 parts here). ATi's boards are running a 'lower quality' setting by using FP24. Carmack says that there is no discernible quality difference between FP16, FP24 and FP32 for DooM3. You have to say Carmack is wrong, or say that ATi is faster running medium quality than nVidia is running high quality, to validate the above statement.

The fact still is, using the default path, ATi is faster. In HL2 and Doom3, the FX's had to have extra code written for them to get playable frames. To me, that's sad.

"Lower quality" as you say is what DX9 calls for. FX's use FP16 all the time instead of FP32, just to get better frames. Yeah, let's tout FP32, then not use it. :rolleyes:

If your cards have to keep getting major help from the devs, then that's a sign it wasn't a very well designed product, to me. Look at Far Cry: the NV3x doing just PS 1.1, while the R3x does 1.1 and 2.0. And when you force 2.0 via 3rd party software, the FX's CRAWL.
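The FP16/FP24/FP32 argument running through this thread is easy to sanity-check numerically. A quick sketch with NumPy (illustrative values of my choosing, nothing taken from the game): FP16 keeps a 10-bit mantissa, FP32 a 23-bit one, and ATi's FP24 sits in between with 16 bits, while the final 8-bit-per-channel framebuffer only resolves 256 levels per channel.

```python
import numpy as np

# How much precision survives when a shader result finally lands in an
# 8-bit-per-channel framebuffer. FP16 = 10-bit mantissa, FP32 = 23-bit.
value = 0.123456789                 # arbitrary shader output in [0, 1]
as_fp32 = np.float32(value)
as_fp16 = np.float16(value)

print(float(as_fp32))               # very close to the original value
print(float(as_fp16))               # coarser in the low digits

# But both quantise to the same 8-bit framebuffer level:
def to_8bit(x):
    return int(round(float(x) * 255))

print(to_8bit(as_fp32), to_8bit(as_fp16))
```

A single rounding step like this rarely shows; where FP16 does bite is when error accumulates across many instructions in a long PS2.0 shader, which is consistent with both Carmack's "no discernible difference in DooM3" claim and the FP16 artifacts people report in shader-heavy titles.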
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Rollo
If nothing else, this thread was worth reading for Reever's question as to whether or not we can trust this "Carmack" fellow to be an accurate source of Doom3 information.

hahahaha :beer::D
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Ackmed
Originally posted by: BenSkywalker
The FX's had to have their own path written.

Not quite. nVidia gave Carmack all the explicit features and extensions he wanted for DooM3, and he used them. Carmack has been talking about the DooM3 engine since back in the GeForce1 days, and you can go check his quotes from then yourself. He wanted 64-bit color (AKA FP16), he wanted really fast register combiner ops, and he wanted loads of stencil op power: the FX architecture with nV's proprietary extensions.

When playing in the standard ARB path, the Radeons are much faster.

When playing the ARB2 path, the big difference for the nV parts in terms of performance is that they are running in FP32 vs FP16 (talking about the NV35 parts here). ATi's boards are running a 'lower quality' setting by using FP24. Carmack says that there is no discernible quality difference between FP16, FP24 and FP32 for DooM3. You have to say Carmack is wrong, or say that ATi is faster running medium quality than nVidia is running high quality, to validate the above statement.

The fact still is, using the default path, ATi is faster. In HL2 and Doom3, the FX's had to have extra code written for them to get playable frames. To me, that's sad.

"Lower quality" as you say is what DX9 calls for. FX's use FP16 all the time instead of FP32, just to get better frames. Yeah, let's tout FP32, then not use it. :rolleyes:

If your cards have to keep getting major help from the devs, then that's a sign it wasn't a very well designed product, to me. Look at Far Cry: the NV3x doing just PS 1.1, while the R3x does 1.1 and 2.0. And when you force 2.0 via 3rd party software, the FX's CRAWL.

Look at Horizons: 15fps at 800x600, all low settings, on a 9800XT. Pixel shader that :)
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Ackmed
Is that the best you can do?

Really, I could do better, but arguing the same 15 things over and over in a thread is getting pretty old for me.

Even if someone does win the "debate", if you will, it just starts again tomorrow with some other fanboi. So why bother? Exactly.
 

reever

Senior member
Oct 4, 2003
451
0
0
Originally posted by: Rollo
And do you know for a fact whether what Carmack said is right/wrong?

LOL Reever. We have to assume Carmack is "right" because it's his program. Unless you'd rather take a PR guy at ATI's word for it?

Ugh, just ignore that post. I thought you meant that Carmack also said that ATi is the one running medium quality and nVidia is running high quality.

Look at Horizons, 15fps at 800x600 all low settings on a 9800XT. Pixel shader that

I can make Ultima Online, another MMORPG, bring my computer to a crawl, and it's not even 3D. MMORPGs are hard to get static comparisons on who is faster.
 

fsstrike

Senior member
Feb 5, 2004
523
0
0
I saw some benchmarks of an NV3x vs a 3dfx Monster, and the Monster pwned the NV3x. I also heard that the TNT2 is made specifically for Far Cry.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
The fact still is, using the default path, ATi is faster. In HL2 and Doom3, the FX's had to have extra code written for them to get playable frames. To me, that's sad.

Do you understand why Carmack made a separate path for the NV3.x cards? It wasn't to get them "playable" frame rates. If he wanted "playable" frame rates, he would have used his own ARB2 path at a lower precision and sent the engine out with both cards basically running neck and neck.

Instead, the NV3.x GPU offered many custom extensions Carmack has wanted for quite some time. He is gladly using them, and if all goes well, the only cards stuck at merely "playable" frame rates won't be the NV3.x cards but the R3.xx cards.

"Lower quality" as you say is what DX9 calls for. FX's use FP16 all the time instead of FP32, just to get better frames. Yeah, let's tout FP32, then not use it. :rolleyes:

Could you really tell the difference without somebody spoonfeeding you? :rolleyes:

If your cards have to keep getting major help from the devs, then that's a sign it wasn't a very well designed product, to me. Look at Far Cry: the NV3x doing just PS 1.1, while the R3x does 1.1 and 2.0. And when you force 2.0 via 3rd party software, the FX's CRAWL.

So what you are telling us is the A64, P4, SSE2, SSE, MMX, 3DNow!, and SSE3 are all poor designs?

Ya, we understand now :rolleyes:

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
We have to assume Carmack is "right" because it's his program.
Exactly like Gabe, yet Gabe is slammed as being a PR monkey for ATi.

Interesting.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: BenSkywalker
But they lack US, meaning they would have to do more stencil ops than nVidia to start with (why I stuck to the NV35); then they would have to deal with clock frequency gaps, and there is another factor also. While nVidia has their 4x0 pipes working on stencil ops, none of the pipes with shader units are tied up. For each pipe ATi is using for stencil ops, their shader performance drops by 12.5% (versus the baseline of 100%, of course). As a generalized example (obviously not quite this simple), if ATi is using half their pixel pipes for stencil ops, they then have their MTexel and MPixel rates cut in half, along with their shader performance cut in half. nV can dedicate four pixel pipes to stencil ops and take no hit for any of the previously mentioned.
Now you're talking a bit over my head. You're saying nV can shade pixels at the same time that they're processing zixels? That seems pretty complex, perhaps too much so for a card made to play every game out there, not just D3. I'm also not sure how you can say ATi gets its fillrate cut in half, as it's actually working, not just sitting idle. As NV30/35 has only four pixel pipelines, I'm not sure how they can dedicate all four of them to zixels and simultaneously shade four color textured pixels. Maybe I need to reread that 3DC article. :)
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Pudgygiant
Isn't it "id", not "ID", as in "id, ego, and superego"? That's what I always thought.

Too much Freud can harm you... :beer:

Well, a friend of mine played the leaked beta version on his 9800 Pro, and he said everything was smooth as silk....

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
The fact still is, using the default path, ATi is faster. In HL2 and Doom3, the FX's had to have extra code written for them to get playable frames. To me, that's sad.
Ackmed, what are you going to do in a couple of months if nVidia has the best GPU again? (since you seem to need to boost your ego by putting them down)