3DCenter Doom 3 Nvidia Optimizations

jrphoenix

Golden Member
Feb 29, 2004
1,295
2
81
HardOCP had a link to this story, which I had read earlier and which has now been revised (read the bottom): Optimizations.

I think optimizations are good if image quality stays the same and performance goes up. Does anyone know of benchmarks run with these optimizations and ATI's optimizations turned off, to see how the hardware really performs?
 
M4H

Jan 31, 2002
40,819
2
0
I think I'm sick of people moaning about "optimizations" :p

Not a flame against you, just in general. Looks like it's time for me to buy a Matrox and flip both respective camps the bird.

- M4H
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I guess the only good news about this is that ATI users can expect good performance in any future games that use the Doom 3 engine. :beer:
 

MechxWarrior

Senior member
Mar 2, 2004
565
0
76
Originally posted by: ronnn
I guess the only good news about this is that ATI users can expect good performance in any future games that use the Doom 3 engine. :beer:

I already get good performance with my 9700 at 1024x768 at high quality... it almost never drops below 35-40 fps. I really don't know what all the fuss is about. People must really want that AA, although I rarely see aliasing in the game due to the darkness.

But yes, it is good. Competition is always good.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I see nothing wrong with optimizations provided there isn't a large degradation of IQ. If the screen is 95% accurate, chances are I won't see the 5% that is wrong. And if it yields me a 20-30% increase in performance, bring it on.

 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
imo too much is made of optimizations... would you prefer all your games to run "unoptimized"?

there have always been optimizations within drivers, and there always will be. one company will always try and outdo the other. in their defense, it's a bit difficult to have one particular piece of code run on all different combinations of hardware & software at 100% efficiency.

as long as it doesn't compromise image quality (or allows the user a choice to compromise IQ for performance) or claim to do something it doesn't, what does it matter? i'd certainly like all games to run at maximum efficiency on my system, regardless of whose video card i happen to have at the time.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I'm unhappy with application detection & shader substitution in general but unfortunately it looks like it's here to stay. The next best thing is to allow the user to disable it, which ATi have done.
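For anyone wondering what "application detection & shader substitution" actually means, here is a rough, hypothetical C++ sketch of the idea. The executable name, the hashing scheme, and the lookup table are all invented for illustration; this is not any vendor's actual driver code.

    // Hypothetical sketch of driver-side application detection and shader
    // substitution; invented for illustration only.
    #include <cstdint>
    #include <iostream>
    #include <string>
    #include <unordered_map>

    // Simple FNV-1a hash of the shader source text, standing in for whatever
    // fingerprint a real driver would use to recognize a known shader.
    static uint64_t fnv1a(const std::string& s) {
        uint64_t h = 1469598103934665603ull;
        for (unsigned char c : s) { h ^= c; h *= 1099511628211ull; }
        return h;
    }

    struct SubstitutionTable {
        std::string targetExe;                                   // e.g. "doom3.exe"
        std::unordered_map<uint64_t, std::string> replacements;  // shader hash -> hand-tuned shader
    };

    // Returns the shader the driver will actually compile: the original, or a
    // hand-tuned replacement if both the process name and the shader hash match.
    std::string selectShader(const SubstitutionTable& table,
                             const std::string& processName,
                             const std::string& originalSource) {
        if (processName != table.targetExe)
            return originalSource;                               // unknown app: leave it alone
        auto it = table.replacements.find(fnv1a(originalSource));
        return it != table.replacements.end() ? it->second : originalSource;
    }

    int main() {
        SubstitutionTable doom3{"doom3.exe", {}};
        std::string gameShader = "!!ARBfp1.0 ... original interaction shader ... END";
        doom3.replacements[fnv1a(gameShader)] = "!!ARBfp1.0 ... vendor-tuned version ... END";

        std::cout << selectShader(doom3, "doom3.exe", gameShader) << "\n";   // replacement used
        std::cout << selectShader(doom3, "quake3.exe", gameShader) << "\n";  // original left alone
    }

The point is that the swap happens entirely inside the driver, keyed off the game rather than anything the game asks for, which is exactly why a user-facing switch to disable it matters.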
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: BFG10K
I'm unhappy with application detection & shader substitution in general but unfortunately it looks like it's here to stay. The next best thing is to allow the user to disable it, which ATi have done.

see, that doesn't make sense to me. i don't think it's unfortunate at all.

while i'd certainly agree if such optimizations were detrimental to the image quality (and i agree the ability to enable/disable said optimizations is a good thing), if that were not the case, what difference does it make? who cares if there are specific codepaths, as long as the quality is what the developer envisioned and the performance is beneficial to the end user? isn't the entire point to get the most performance and best quality we possibly can with the equipment we have?

the only logical reason i could imagine why someone would be unhappy over this is when brand rivalry is more important than the performance we get. it shouldn't matter which brand is faster; what should matter is that we get the best out of whatever we have, regardless of brand.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The next best thing is to allow the user to disable it, which ATi have done.

How can I on the R9800? The horrific filtering in D3 has been getting on my nerves (bilinear for the first mip transition, always). Which driver revision do I need to disable their app detection there?
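For reference, the distinction being described maps onto two standard OpenGL minification filters. A minimal fragment, assuming a current GL context and a bound 2D texture (the function names are mine; the enums and glTexParameteri are standard GL):

    #include <GL/gl.h>

    // What the application's high-quality path requests: linear filtering within
    // a mip level AND linear blending between adjacent levels (full trilinear).
    void requestTrilinear() {
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }

    // What a reduced mode effectively delivers: linear within a level but a hard
    // snap to the nearest mip level, which shows up as a visible band at the
    // first mip transition.
    void bilinearMipmapped() {
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }

If the driver quietly delivers something closer to the second mode when the game asked for the first, that is the sort of filtering "optimization" being complained about.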
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
Originally posted by: CaiNaM
Originally posted by: BFG10K
I'm unhappy with application detection & shader substitution in general but unfortunately it looks like it's here to stay. The next best thing is to allow the user to disable it, which ATi have done.

see, that doesn't make sense to me. i don't think it's unfortunate at all.

while i'd certainly agree if such optimizations were detrimental to the image quality (and i agree the ability to enable/disable said optimizations is a good thing), if that were not the case, what difference does it make? who cares if there are specific codepaths, as long as the quality is what the developer envisioned and the performance is beneficial to the end user? isn't the entire point to get the most performance and best quality we possibly can with the equipment we have?

the only logical reason i could imagine why someone would be unhappy over this is when brand rivalry is more important than the performance we get. it shouldn't matter which brand is faster; what should matter is that we get the best out of whatever we have, regardless of brand.

Or how about what John Carmack said? If he makes a particular change to the game, it may "break" a card's performance in that game if the drivers are dependent on the old way of doing things from a previous patch.
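A hypothetical sketch of why a game patch can break that kind of optimization, assuming the driver keys its hand-tuned replacement off a fingerprint of the shipped shader (the names and the fingerprint scheme here are invented for illustration):

    #include <functional>
    #include <iostream>
    #include <string>
    #include <unordered_map>

    int main() {
        std::hash<std::string> fp;                        // stand-in fingerprint
        std::unordered_map<size_t, std::string> tuned;    // fingerprint -> hand-tuned shader

        std::string v10 = "interaction shader, v1.0";         // as shipped with the game
        tuned[fp(v10)] = "vendor-tuned interaction shader";   // keyed to v1.0 exactly

        std::string v11 = "interaction shader, v1.1";         // edited by a later patch

        auto lookup = [&](const std::string& s) {
            auto it = tuned.find(fp(s));
            return it != tuned.end() ? it->second : s;    // miss -> fall back to the original
        };

        std::cout << lookup(v10) << "\n";   // replacement applies
        std::cout << lookup(v11) << "\n";   // the patch broke the match; back to the slower original
    }

Nothing the game does can prevent or even detect that miss, so a perfectly innocent patch can make the "optimized" card suddenly look much slower.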
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
That is a big "if", and the IHVs work closely with people like John Carmack, so they know what he is doing and whether it breaks their drivers before a change comes out.

I mean, for the past 18 months or more we have had Nvidia doing this, and I can't think of a single game where this scenario played out.