No way this is 6x AA

Gstanfor

Banned
This *will* be my last reply to you *unless* you can *prove* point number 3 below conclusively.

1. CoD benchmark. I don't play multiplayer, only single-player (I don't play multiplayer anything). Fraps shows I average 42 fps very consistently. Worst dip was 35 fps. Second point: 33 fps is *not* a slideshow. Slideshows IMO are 15 fps and below.

2. Controlling shader substitution. It's a bit hard to turn off optimizations that don't even exist (except in ATi drivers).

3. You need to prove that this shader substitution is taking place (good luck with that; I'll be waiting with bated breath for your response). A hint for you: instruction reordering by the driver's HLSL compiler does NOT constitute shader substitution...
 
Originally posted by: Gstanfor
ATi wrote the shaders in HL2 for Valve, which neatly explains why HL2 won't run in DX9 mode on a GeForceFX ... unless you use this

It would appear that ATi also coded the AA settings for Valve as well.

One has to wonder how much else of Valve's game was rewritten by ATi (anyone who has played Vampire: Bloodlines knows that the Source engine initially had no problems at all using FX cards in DX9 mode).


I don't think you can say that first part. For starters, the FX series flat-out bombed in DX9; it simply couldn't hack it, and it can't handle much DX9 at all.

I think in the end, every time DX9 was presented to an FX card, it defaulted to DX8.1 no matter what.

Anand even did a juicy article (at least I think it was on AnandTech) right around when NV40 arrived, about why the NV3x couldn't do DX9. He explained it all. Dig it up; it's a good read.
 

Gstanfor

Banned
Rubbish.

The GeForceFX was quite capable of running the ATi Ruby demo (something the 9700/9600/9500 could not do without shader modifications) and 3DMark05, both definitely DX9.

It also plays Bloodlines, HL2 (with the mod I mentioned), Far Cry, etc. in DX9 mode; it just lacks high DX9 performance.
 

BFG10K

Lifer
Second point: 33 fps is *not* a slideshow.
Yes it is. Also, CoD runs far faster than either Unreal 2 or Chrome, so one can only imagine what sort of scores you're getting in those games.

Slideshows IMO are 15 fps and below.
Wow. Just...wow. It's amazing what people will say to back their vendor of choice. Amazing: now we don't even need TV/movie FPS!

It's a bit hard to turn off optimizations that don't even exist
:roll:

You need to prove that this shader substitution is taking place (good luck with that; I'll be waiting with bated breath for your response).
There are literally dozens of articles and pieces of information all over the web from developers, hardware websites and regular posters showing evidence of this going on. Of course you seem to be intent on rewriting history and wasting people's time in the process.

AnandTech:
This is quite a big deal in light of the fact that, just over a year ago, thousands of shaders were stored in the driver ready for replacement on demand in NV3x and even NV4x.

3DCenter:
Nvidia seems to have used optimizations in their drivers, similar to those which already helped to win Q3A benches for quite a long time now. The driver does intercept and replace CPU-demanding calls with less demanding ones. Special optimizations probably favor both the Q3A and Doom 3 engines. In addition, Nvidia could have shown off quite some creativity in terms of replacing shaders. The company seems to have found a solution that allows for replacement of shaders without affecting image quality.
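The mechanism that quote describes is essentially a lookup keyed on the incoming shader: the driver recognizes a specific application shader and hands the hardware a cheaper hand-written substitute instead. A toy sketch of the idea in Python (this is not actual driver code; the table and replacement names are made up, with M_Water.psh borrowed from the audit quoted next):

```python
# Toy illustration of "intercept and replace" - not actual driver code.
# The replacement table and substitute shader names are hypothetical.
KNOWN_REPLACEMENTS = {
    "M_Water.psh": "driver_water_fast.psh",  # cheaper hand-written substitute
}

def real_compile(source_name: str) -> str:
    """Stand-in for the driver's actual shader compiler."""
    return f"compiled({source_name})"

def intercepting_compile(source_name: str) -> str:
    """Swap in a known replacement before calling the real compiler."""
    return real_compile(KNOWN_REPLACEMENTS.get(source_name, source_name))

print(intercepting_compile("M_Water.psh"))  # compiled(driver_water_fast.psh)
print(intercepting_compile("M_Sky.psh"))    # compiled(M_Sky.psh) - untouched
```

Detection can key on the shader's file name, a hash of its bytecode, or the application executable; the effect is the same either way.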

3DMark Cheat Audit (just one of many instances):
In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this detection to artificially achieve a large performance boost - more than doubling the early frame rate on some systems. In our inspection we noticed a difference in the rendering when compared either to the DirectX reference rasterizer or to that of other hardware. It appears the water shader is being totally discarded and replaced with an alternative more efficient shader implemented in the drivers themselves. The drivers produce a similar looking rendering, but not an identical one.
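For what it's worth, the detection method the audit describes boils down to an image diff: render the same frame once with the driver under test and once with Microsoft's reference rasterizer, then look for pixels that differ. A minimal sketch of that comparison, assuming both frames have already been captured to the hypothetical files driver_frame.png and refrast_frame.png:

```python
# Rough sketch of the audit idea (not Futuremark's actual tooling):
# diff a driver-rendered frame against a reference-rasterizer frame.
# File names and the tolerance value are assumptions for illustration.
from PIL import Image, ImageChops

def max_channel_diff(path_a: str, path_b: str) -> int:
    """Largest per-channel difference between two captured frames."""
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    diff = ImageChops.difference(a, b)
    # getextrema() returns (min, max) per channel; take the overall max.
    return max(hi for _, hi in diff.getextrema())

TOLERANCE = 8  # tiny deviations are normal; large ones suggest a swapped shader

delta = max_channel_diff("driver_frame.png", "refrast_frame.png")
print("possible shader replacement" if delta > TOLERANCE else "matches reference")
```

A "similar looking, but not identical" rendering is exactly what this catches: a small but consistently nonzero diff wherever the replaced shader touches the frame.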

Do I need to prove the NV3x has poor shader performance as well? You seem to be having trouble accepting this too, and are attempting to rewrite history by telling us what blazing performance you were getting on your 5900, despite tens of thousands of benchmarks on the web proving you wrong. Likewise for your ridiculous "8xAA is viable" claims.
 

Gstanfor

Banned
Reduced to dredging up ancient history, are we, BFG10K? What does any of this have to do with recent drivers?

There have been no shader replacements since early on in the 6x.xx series drivers. The claim that Q3A uses shaders (as we know them in the D3D/OGL context) is particularly hilarious.

ATi's *current* drivers have shader replacement code in them. nVidia's don't and haven't since they left the 5x.xx series behind (and yes, GeForceFX cards work just fine with recent drivers).

In Australia, the PAL TV standard has a 25 fps frame rate. So long as the frame rate is smooth and above 30 fps, I don't have a problem. It's when the framerate dives dramatically below 30 fps and stays there a good percentage of the time that slideshows occur.
 

BFG10K

Lifer
The claim that Q3A uses shaders (as we know them in the D3D/OGL context) is particularly hilarious.
No such claim was ever made.

There have been no shader replacements since early on in the 6x.xx series drivers.
nVidia's don't and haven't since they left the 5x.xx series behind (and yes, GeForceFX cards work just fine with recent drivers).
Now it's your turn: show me evidence that proves this. Put up or stop spouting your opinion as fact.

In Australia, the PAL TV standard has a 25 fps frame rate. So long as the frame rate is smooth and above 30 fps, I don't have a problem.
(1) A 30 FPS average will not stay above 30 FPS, as that's not the definition of an average (a quick numeric sketch follows this list).
(2) TV has absolutely nothing to do with 3D games.
(3) This has what to do with your comment "slideshows IMO are 15 fps and below"?
(4) You aren't going to be getting anywhere near 30 FPS minimum in Chrome or Unreal 2 as CoD runs a lot faster than either game and your card isn't even close to managing it there.
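To put a number on point (1), here is a minimal sketch with made-up per-second FPS samples, showing why a 30 FPS average implies dips below 30 unless every single second is exactly 30:

```python
# Made-up per-second FPS samples for illustration only.
frames = [45, 38, 30, 22, 15]

average = sum(frames) / len(frames)  # 30.0
minimum = min(frames)                # 15

print(f"average: {average:.1f} FPS, minimum: {minimum} FPS")
# Unless every sample is exactly 30, some samples must sit below the
# mean to balance the ones above it, so the minimum falls below the average.
```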
 

Gstanfor

Banned
With regard to Q3A, your quote stated:

Nvidia seems to have used optimizations in their drivers, similar to those which already helped to win Q3A benches for quite a long time now. The driver does intercept and replace CPU-demanding calls with less demanding ones. Special optimizations probably favor both the Q3A and Doom 3 engines. In addition, Nvidia could have shown off quite some creativity in terms of replacing shaders. The company seems to have found a solution that allows for replacement of shaders without affecting image quality.

With regard to proving my claims about nVidia's drivers, I've previously told you about 3DAnalyze, not to mention that if nVidia were still replacing shaders, ATi PR would be trumpeting such optimizations all over the web. We would also have articles on it, and we don't.

As for "stopping" me from stating facts (not opinions), you have a snowflake's chance in hell of succeeding there. Dave Baumann hasn't stopped me, ATi employees haven't stopped me, fanATics haven't stopped me, and you sure as hell won't either.
 

BFG10K

Lifer
With regard to Q3A, your quote stated:
That quote is not implying shader substitution; it's simply highlighting that application-specific optimizations are present.

I've previously told you about 3DAnalyze,
Refresh my memory: what exactly did this achieve?

As for "stopping" me from stating facts (not opinions), you have a snowflake's chance in hell of succeeding there. Dave Baumann hasn't stopped me, ATi employees haven't stopped me, fanATics haven't stopped me, and you sure as hell won't either.
How lovely for you.

So to recap your current position:

  • No evidence to prove ATi wrote the HL2 shaders.
  • No response to your blatant self-contradiction on what counts as a playable framerate.
  • No evidence to demonstrate the viability of 8xAA in modern games.
  • No evidence to back your claims that nVidia has ceased all shader substitution.
If it makes you feel any better, you can continue to produce steaming piles as much as you like; I'll just be right here to shovel it away.
 

RampantAndroid

Diamond Member
And you have no evidence to prove they still do. No one has evidence... so all you do is call each other idiots...

Why is it that all the flamers go to the video forums????