
What in the world is up with Nvidia?!?!


Originally posted by: BenSkywalker
Gearbox made different rendering paths for each board- all of the boards run the paths equally well and all of them perform almost exactly the same no matter which camo setting you use.

I'm just saying that they are obviously using different paths so it is unreasonable to assume the benchmarks are comparable. Do you have benchmarks to substantiate your claim to the contrary?


Originally posted by: BenSkywalker
Interior levels in FC- that means 640x480, and the FPS still fall well under reasonable levels on my R9800Pro.



And it would on a 6800U-EE as well, many spots in Far Cry are very cpu limited.
 
Painkiller

The average level is 350,000 polygons, and also incorporates advanced vertex and pixel shaders, including water, glass, volumetric lights and fog, etc.

Painkiller

Adrian: Pure miscalculation. We forgot we always add more and more cool stuff. We cannot let it go. It's a bad habit, but the game will benefit.


GH: At the expense of the technological edge?

Adrian: On the contrary. We're adding shaders like there's no tomorrow.




 
As I already pointed out, the 4xAA/AF benches are not comparable with the lousy 4xAA the FX card is doing; besides, it's brilinear and new aggressive filtering...

The 2x benches show the 5950 in the lead over the R9700Pro where the AA is comparable. As far as texture filtering goes, the R9800Pro's is very poor at best to start with. Excessive aliasing due to bare-minimum interpolation and off angles is still an issue. I understand what you are saying about the NV3X's filtering, but it is comparing two different types of very poor filtering and focusing on the weaknesses of one or the other. Besides all of that, if this game were remotely as shader intensive as some are indicating, the R9700Pro should be killing the 5950 at all settings. You can find some benches where the 9600XT is besting the 5950 in shader-limited situations- DFBHD is not very shader intensive no matter what the developers try to imply.

I'm just saying that they are obviously using different paths so it is unreasonable to assume the benchmarks are comparable. Do you have benchmarks to substantiate your claim to the contrary?

Posted them a while ago, I'd have to rerun them at this point, although I honestly suggest people just use 3DAnalyze and see for themselves.

And it would on a 6800U-EE as well, many spots in Far Cry are very cpu limited.

I'm not talking about levels like the Volcano(when you are working your way up to it actually), just the interior levels in general. Dropping the shaders all the way down gives me very smooth framerates in these instances- something that the CPU is not responsible for. Actually, just checked the link you provided(the CB link) and they have some FC benches where the 9700Pro is just under 35FPS average running 1024x768 no AA/AF while the fastest parts are a bit under 90FPS.

Blastman-

Where are these shaders in Painkiller? I went out and bought the game because of how impressive people said it was, it looks dated at best.
 
My 5900XT, even overclocked, is showing its age. However, it plays my games just fine ATM, and I couldn't care less about the score I receive on a tech demo. By the time games are out that look like 3DMark 05, this card will be in my low end machine anyways.
 
Put it this way .... if your current selection of games (emphasis on the word current) does not include any pixel shader built into the game, then be happy with your 5900. I am, since I can't figure out how to play 3dMark on my PC .... WASD keys don't work.

Point being, this is a BENCHMARK, not a GAME. For what I am now playing, I enjoy my 5900. Next year, I won't be saying that when I'm playing newer games, but who cares? I also won't be using that 5900 in a year on my main rig. 3dMark is designed to sell video cards... why else do you see ATI and Nvidia supplying monetary 'gifts' to Futuremark??
 
Originally posted by: BenSkywalker
Complete BS given the benchmarks on the web.

Why not link some of these benchmarks showing incredible framerates on interior levels in FarCry.

I don't consider 46FPS average anything to write home about. And that is far from a worst-case scenario(that's also only 1024x768 no AA/AF- add AA and AF and the average framerate drops below 30). You can see some benches like over at FS where they are pushing triple digit averages of course- but that is on exterior levels and that is average, not minimum. Now, why don't you dig up some of these benches to back yourself up.

I believe research is an interior level.

You are correct, I should look more carefully. It is in fact 52FPS average in the bench they are using, still below a decent playable threshold(and certainly not worst case).

What exactly is a "decent playable threshold"?

Also, I posted this earlier in the thread: Do you think shaders will be important in the next 2 years?
 
I don't think Ben understands how average framerate can't be directly correlated with minimum framerate, or he would not make such claims as to a "decent playable threshold" from an averaged benchmark result. He also wouldn't be arguing with me about how the cpu limited nature of Far Cry can keep a card like the 9800pro from maintaining a decent playable threshold.
 
Originally posted by: BenSkywalker

Where are these shaders in Painkiller? I went out and bought the game because of how impressive people said it was, it looks dated at best.
They must be hidden in the secrets. 😉 😀

If you read that blurb on DF-BHD from NV, even the faces on the characters were rendered using pixel shaders --so shaders are being used in a lot more places than the usual suspects like water, pipes, metal ..etc.

There was a very interesting comment at the end of an ATI interview at Beyond3D...
beyond3d

Patric: I'd just like to add that shaders are not just for special effects like a water surface. This seems to be a common belief, since also games still use shaders mostly only for some special materials. I believe, and I'm sure the forward looking game developers agree, that shaders are here to replace the old fixed function vertex processing and multi-texturing. In the future all games will use shaders for all vertex and pixel processing. There may be fallbacks for pre-shader cards still doing multi-texturing etc, but the day will come when the first PC game is released that _requires_ powerful shader hardware and offers no fixed function fallback.


The canyon walls in 3DMark05 are rendered using shaders...
xbit

The rock surface of the canyon is one of the heavier materials in 3DMark05, filling up a PS2.0 shader almost to the last instruction when combined with the dynamic shadow rendering.

Shaders seem to be showing up for just about any "effect" -- even texturing effects on walls. I don't know how much FP (DX9) shaders are being used in some of the new games like Joint Operations and MOH-Pacific Assault (if at all) but I would think a lot of the new games are starting to use DX9-class shaders -- even if to a limited degree.

 
LT-

I believe research is an interior level.

It starts outdoors and then goes into the caves for a bit.

What exactly is a "decent playable threshold"?

No dips below 30FPS at any point- though for any shooter, no dips below 60FPS is much better.

Also, I posted this earlier in the thread: Do you think shaders will be important in the next 2 years?

They will be a factor but certainly not paramount. We are going to need to see new cores with significantly more shader power than anything out now before they become a real major factor.

Snowman-

I don't think Ben understands how average framerate can't be directly correlated with minimum framerate, or he would not make such claims as to a "decent playable threshold" from an averaged benchmark result.

Most of the time you can use an average to get at least a rough idea of a minimum, although not always. A sub-60FPS average is certainly not within a decent range for playability for a shooter.
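The average-versus-minimum point can be shown with a toy sketch (hypothetical frame-time numbers, not from any real bench): two runs with essentially identical average FPS can have very different minimums, which is why an average is only a rough proxy for the worst case.

```python
def fps_stats(frame_times_ms):
    """Return (average FPS over the run, minimum instantaneous FPS)."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    min_fps = 1000.0 / max(frame_times_ms)  # slowest frame sets the minimum
    return avg_fps, min_fps

# Run A: perfectly steady pacing, every frame takes 20 ms.
steady = [20.0] * 100
# Run B: mostly fast 15 ms frames with occasional 70 ms hitches, tuned so
# both runs render 100 frames in almost the same wall-clock time.
spiky = [15.0] * 91 + [70.0] * 9  # 1995 ms total vs 2000 ms

print(fps_stats(steady))  # 50 FPS average, 50 FPS minimum
print(fps_stats(spiky))   # ~50 FPS average, but a ~14 FPS minimum
```

Same average, wildly different minimum: the hitchy run would feel far worse in a shooter even though the averaged benchmark number is identical.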

He also wouldn't be arguing with me about how the cpu limited nature of Far Cry can keep a card like the 9800pro from maintaining a decent playable threshold.

Well, let's take a look at some of the minimum framerate numbers in the XBit benches- since the game is so incredibly CPU limited, of course all of the boards will have the same minimum framerate... 1024x768 no AA/AF, but there is a 12.9% rift there according to those benches at the most CPU dependent numbers they run. Now some games actually are CPU limited and will show minimum FPS running 1024x768 no AA/AF that are identical between an X800XT and a 5700Ultra.

He also wouldn't be arguing with me about how the cpu limited nature of Far Cry can keep a card like the 9800pro from maintaining a decent playable threshold.

If the processor were the issue, shutting off pixel shaders certainly wouldn't improve the situation significantly, which is the case I'm talking about. If you would like to start a thread about benchmarking trends and the correlation between average and minimum framerates, feel free- there is a correlative relation that has been established over the past eight years. It isn't always accurate(else it would be more than correlative of course) however.

Blastman-

If you read that blurb on DF-BHD from NV, even the faces on the characters were rendered using pixel shaders --so shaders are being used in a lot more places that the usual suspects like water, pipes, metal ..etc.

Depending on how you define pixel shaders, you can state that the hack job of basic texture filtering the current-gen parts are doing is shaders. Dot3 is an actual normal PS operation as of now, as are EMCM and EMBM. On a technical basis, even without taking liberties with the definition of words(which PR departments always do), you can state any number of DX7 games are using lots of pixel shader effects.
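For concreteness, here is a minimal sketch (plain Python, not any game's actual code) of what the Dot3 operation mentioned above computes: per-pixel diffuse lighting as a clamped dot product between a normal-map normal and the light direction- exactly the kind of fixed operation DX7-era hardware could do long before PS 2.0.

```python
def normalize(v):
    """Scale a 3-vector to unit length."""
    length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / length, v[1] / length, v[2] / length)

def dot3(normal, light):
    """Clamped N.L- the core of DX7 Dot3 bump mapping."""
    n, l = normalize(normal), normalize(light)
    d = n[0] * l[0] + n[1] * l[1] + n[2] * l[2]
    return max(0.0, d)  # texels facing away from the light get no light

# A texel whose normal points straight at the light is fully lit...
print(dot3((0, 0, 1), (0, 0, 1)))  # 1.0
# ...one perpendicular to it is unlit, with smooth falloff in between.
print(dot3((1, 0, 0), (0, 0, 1)))  # 0.0
```

On real hardware this ran per texel in the fixed-function texture combiners; the point of the sketch is just that it is a single multiply-add per pixel, not a programmable PS 2.0 program.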

There was a very interesting comment at the end of an ATI interview at Beyond3D...

.....I believe, and I'm sure the forward looking game developers agree, that shaders are here to replace the old fixed function vertex processing and multi-texturing.

The end point of pixel shaders is to replace textures(because they are superior and procedural), this is something we have known for years prior to there being shader capable dedicated hardware. The problem is all of the shader hardware is way too slow to start to pull it off. What we have now is simply laying the groundwork for what we need to get to.

The canyon walls in 3DMark05 are rendered using shaders...

Unfortunately this shows none of the major advantages of using shaders in place of textures.

Shaders seem to be showing up for just about any "effect" -- even texturing effects on walls.

"Shaders" are showing up- the problem is that the PR has gotten way the he!l ahead of the technology. You could argue that enabling AF on GLQuake makes that game shader bound on some of the latest parts as they use some shader hardware for AF functions- and it seems this is the type of BS we are hearing from a lot of developers trying to pimp their games. We have functions introduced in DX6 that are nigh entirely reliant on shader hardware now, and can accurately be called a shader effect, but it is very far removed from the PR hype about the PS 2.0 revolution we have been hearing for the last couple of years. Look at BHD as an example- the 5950U is outrunning the 9800XT in the most demanding situations and that is supposed to be an example of a shader game? Using the PS 2.0 guideline we know the 9800XT absolutely kills the 5950- we have seen benches where the lowly 9600XT is besting the 5950 in honestly PS 2.0-limited situations. The reality is that we have a slowly evolving process that started back with EMBM in the DX6 era, continued with Dot3 and EMCM in DX7, and continued with DX8-level shaders, which we are realistically just starting to see any sort of real market penetration of now. PS 2.0 being a major factor in terms of games is still a ways off, and that isn't a bad thing. DooM3 for the most part just showed us what could be done with DX7-era 'shader' hardware and it is extremely impressive. We haven't seen what can be done with DX8-level shaders yet; DX9 level is pushing into CGI territory once we have the hardware that can push it. That hardware is a long ways off yet.
 
LT-

I believe research is an interior level.

It starts outdoors and then goes into the caves for a bit.

What exactly is a "decent playable threshold"?

No dips below 30FPS at any point- though for any shooter, no dips below 60FPS is much better.

According to anandtech, the research demo is indoors. Not only does a 9800 XT provide your 30 fps notch/60 fps avg. at 640x480, it also just about does it at 1280x1024 (over 4x the pixel count).
 
Yep, I played Far Cry on a 9800xt at 1024x768 with 2xAF and all the graphics options cranked; it ran just fine other than the second half of "Dam" which is bound to put a hurt on any system.

Ben, a game can have many cpu limited situations which determine minimum framerate without being cpu limited across the board, and Far Cry is an excellent example of such a game.
 
Has nigh no difference on the benches in terms of the camo path.
It doesn't? So what is the score on nVidia cards when rendering the camo properly? Besides, you have no idea what other things are different in that path.

Interior levels in FC- that means 640x480, and the FPS still fall well under reasonable levels on my R9800Pro.
Assuming no CPU limitations that probably means 400x300 for the NV3x or the reduced precision 1.x path.

With the exception of D3 I would class them all as extremely to very light(FC being the only 'very light').
You are seriously mistaken if you think Far Cry is not a shader-heavy title. In fact if anything Doom3 should be really classed as light since shaders are really bolted on as an after-thought. In Far Cry there is very little on the screen that isn't touched by shaders. As for DEIW, it requires shaders to run (i.e. it won't even launch on non-shader hardware). And Thief 3 too for that matter.

Dropping the shaders all the way down gives me very smooth framerates in these instances- something that the CPU is not responsible for
That's interesting considering you class it as "very light" in terms of shader usage. So if it's "very light" there should be little/no difference in performance right?

Where are these shaders in Painkiller?
In Painkiller, dynamic per pixel lighting is used only in places where it adds to our own version/vision of how the real world should be presented onscreen. For example we always wanted soft shadows (both pre-rendered and real-time), in order to avoid the cg-ish plastic look of per pixel specular. Thus, per pixel lighting in our game is used only for things like dynamic lights (explosions, flashlight), player's weapon lighting, and a few other special pixel shader effects every now and then. It was not a question of technical limitation, just purely a design decision.

I went out and bought the game because of how impressive people said it was, it looks dated at best.
A shader isn't required to produce flashy effects before it's considered a shader; it could even be doing something as simple as multi-texturing. Also if you like we can discuss how some of Doom III's textures have an equal resolution to Quake 2 textures.
 
LT-

According to anandtech, the research demo is indoors.

Derek is wrong. Just start the game yourself.

Snowman-

Ben, a game can have many cpu limited situations which determine minimum framerate without being cpu limited across the board, and Far Cry is an excellent example of such a game.

Of course, simply point me to a bench of FC where the minimum framerate is the same for all the vid cards or at the very least the same for the faster boards. You are talking about extremely basic points; it's just the benches don't agree with you.

BFG-

It doesn't? So what is the score on nVidia cards when rendering the camo properly? Besides, you have no idea what other things are different in that path.

I wouldn't say nV is rendering the camo path properly when running it like ATi; it essentially is akin to the old Asus drivers that would let you see through walls(an in-game cheat of sorts). Cloaking devices are supposed to make you invisible, not make you stand out more. For the bench scores, check it yourself- it makes no real difference which path is forced in terms of performance(+/- 1-2FPS).

Assuming no CPU limitations that probably means 400x300 for the NV3x or the reduced precision 1.x path.

Or backing down the shaders that are there.

You are seriously mistaken if you think Far Cry is not a shader-heavy title.

No, I'm not. FC uses shaders for a very small portion of its on screen elements and those shaders that it uses outside of pipes and water are simplistic.

In fact if anything Doom3 should be really classed as light since shaders are really bolted on as an after-thought.

D3 uses shaders for almost every pixel in the entire game. I would say D3 is still light, as almost all of the shaders used are very simplistic, but it certainly uses shaders far more extensively than FC.

As for DEIW, it requires shaders to run (i.e. it won't even launch on non-shader hardware). And Thief 3 too for that matter.

Doesn't add up to a heavy shader load.

That's interesting considering you class it as "very light" in terms of shader usage. So if it's "very light" there should be little/no difference in performance right?

On any part with remotely decent shader performance that would be right. Unfortunately we don't have any of those yet.

A shader isn't required to produce flashy effects before it's considered a shader; it could even be doing something as simple as multi-texturing.

And that is the big problem with all of the PR hype around shaders. Anything that could be done by the Voodoo2 to me should never be considered a shader- but that doesn't stop people from trying.

Also if you like we can discuss how some of Doom III's textures have an equal resolution to Quake 2 textures.

As do some of HL2's, what does that have to do with shaders....?
 
Originally posted by: BenSkywalker
Snowman-

Ben, a game can have many cpu limited situations which determine minimum framerate without being cpu limited across the board, and Far Cry is an excellent example of such a game.

Of course, simply point me to a bench of FC where the minimum framerate is the same for all the vid cards or at the very least the same for the faster boards. You are talking about extremely basic points; it's just the benches don't agree with you.


It isn't the benches that don't agree with me:

Originally posted by: lordtyranus
I believe research is an interior level.

With the 1024x768 bench we see every card, performance-wise, between a 6800 ultra and a 9800pro all within ~5fps on the minimum framerate; take a slower cpu and you wind up with unplayable framerates even on the fastest cards.


Am I even going to see you admit you were wrong, Ben?
 
There is a ~12% variation between what you claim are CPU limited numbers with identical setups. There is rarely that level of variation between a P4 3GHz and a P4 3.4GHz. That bench is not processor limited. If the bench was in fact processor limited there wouldn't be that level of variance, period. If I show you benches where one vid card is pulling 55FPS and the other 61FPS, are you honestly going to expect anyone to believe that the bench isn't vid card limited? That is the same as what you are doing now.
 
Originally posted by: BenSkywalker
There is a ~12% variation between what you claim are CPU limited numbers with identical setups. There is rarely that level of variation between a P4 3GHz and a P4 3.4GHz. That bench is not processor limited. If the bench was in fact processor limited there wouldn't be that level of variance, period. If I show you benches where one vid card is pulling 55FPS and the other 61FPS, are you honestly going to expect anyone to believe that the bench isn't vid card limited? That is the same as what you are doing now.

In theory at least, a 3.4GHz processor is 13% faster than a 3GHz one.
 
In theory at least, a 3.4GHz processor is 13% faster than a 3GHz one.

Yes, and in a few cases you may be able to see that, but of course you said 'in theory' because we know that you very rarely actually see directly linear performance gains with clock speed.
 
Originally posted by: BenSkywalker
In theory at least, a 3.4GHz processor is 13% faster than a 3GHz one.

Yes, and in a few cases you may be able to see that, but of course you said 'in theory' because we know that you very rarely actually see directly linear performance gains with clock speed.

From what I understand, you are saying the bench is not processor limited due to a 12% difference? Since you agree that in a few cases the difference between the 2 processors reaches the theoretical 13% number, I don't think there is strong evidence that the bench is not processor limited. I'm not saying that it is, just that the evidence to the contrary isn't that concrete.

A test with an Athlon FX would help clear up this ambiguity.
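The arithmetic both sides are leaning on here is simple enough to sketch directly, using only numbers already quoted in the thread (the 3.4GHz vs 3GHz clocks, and Ben's 55 vs 61FPS example):

```python
def pct_faster(fast, slow):
    """How much faster `fast` is than `slow`, in percent."""
    return (fast / slow - 1.0) * 100.0

# A 3.4GHz P4 has ~13% more clock than a 3GHz P4...
print(round(pct_faster(3.4, 3.0), 1))    # 13.3

# ...while Ben's 55 vs 61 FPS example works out to a ~11% gap between
# cards on an identical CPU- close enough to the clock headroom that the
# thread can't settle it without a faster-CPU (Athlon FX) data point.
print(round(pct_faster(61.0, 55.0), 1))  # 10.9
```

The point of the sketch: the observed card-to-card gap and the theoretical CPU-to-CPU gap are nearly the same size, so neither side's number is decisive on its own.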
 
Originally posted by: BenSkywalker
There is a ~12% variation between what you claim are CPU limited numbers with identical setups. There is rarely that level of variation between a P4 3GHz and a P4 3.4GHz. That bench is not processor limited. If the bench was in fact processor limited there wouldn't be that level of variance, period. If I show you benches where one vid card is pulling 55FPS and the other 61FPS, are you honestly going to expect anyone to believe that the bench isn't vid card limited? That is the same as what you are doing now.

look Ben, there is a 1fps difference between a x800xt and a 9800xt. Now please either tell us what qualities of the cards could be responsible for the framerate being so similar, or suck it up and admit that the benchmark is showing a cpu limited situation. If you can bring yourself that far to come to terms with reality, there may yet be hope of you understanding how a slower cpu would bring those framerates down to unplayable levels regardless of the card used.
 
look Ben, there is a 1fps difference between a x800xt and a 9800xt. Now please either tell us what qualities of the cards could be responsible for the framerate being so similar, or suck it up and admit that the benchmark is showing a cpu limited situation.

If you can't figure it out by looking at the charts there really isn't much of a point talking to you. Look at all the scores; if you think it is processor limited then you should reevaluate your willingness to debate people about anything regarding this type of technology. A 9600XT outrunning an FX5950Ultra, both of which are being close to doubled by the more powerful boards... must be the processor.

Edit-

Some more benches from XBit showing the research level. In these benches the difference between the CPU limited X800Pro and 9800XT has grown to 13.3% running 1024x768 with no AA/AF. You know, cuz it's so CPU limited and all.
 
Originally posted by: BenSkywalker
If you can't figure it out by looking at the charts there really isn't much of a point talking to you. Look at all the scores; if you think it is processor limited then you should reevaluate your willingness to debate people about anything regarding this type of technology. A 9600XT outrunning an FX5950Ultra, both of which are being close to doubled by the more powerful boards... must be the processor.


No processor is going to make up for the FX series' downfalls in that benchmark, but that doesn't change the fact that the test is cpu limited across more capable videocards.



Originally posted by: BenSkywalker

Edit-

Some more benches from XBit showing the research level. In these benches the difference between the CPU limited X800Pro and 9800XT has grown to 13.3% running 1024x768 with no AA/AF. You know, cuz it's so CPU limited and all.

not sure what the difference is, as they seem to use the same test methods, but the scores there don't change the fact that the previous scores show a cpu limited situation.
 
Add another DX9 game to the not-so-hot FX list.

THG-Joint Operations

9800Pro - 54.2
5950U - 34.8
9600XT - 33.9
5900XT - 28.3
5700U - 23.1

Must be a "few" DX9 shaders in that game. But even a few DX9 shaders constitutes a "heavy" shader load for the FX cards. 😛
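A quick sketch makes the gap in those quoted THG Joint Operations figures explicit (numbers taken directly from the list above, normalized against the 5950U):

```python
# Joint Operations FPS as quoted from the THG bench above.
scores = {
    "9800Pro": 54.2,
    "5950U": 34.8,
    "9600XT": 33.9,
    "5900XT": 28.3,
    "5700U": 23.1,
}

baseline = scores["5950U"]
for card, fps in scores.items():
    # Express each card's result relative to NV's fastest FX part.
    print(f"{card}: {fps / baseline * 100:.0f}% of the 5950U")
```

The 9800Pro comes out roughly 56% ahead of the 5950U, and the mid-range 9600XT is effectively even with it- which is the "not-so-hot FX" point the post is making.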
 
Cloaking devices are supposed to make you invisible, not make you stand out more.
Not according to the game developer Gearbox; nVidia cards are not rendering the camo how it was meant to be rendered.

No, I'm not. FC uses shaders for a very small portion of its on screen elements and those shaders that it uses outside of pipes and water are simplistic.
If you think so then try running it on a DirectX 7 board and look at the difference, and then do the same for Doom III.

The bulk of D3's shaders are just doing simple multi-texturing; in terms of flashy effects you only get the odd haze or light effect here and there. Not so in FC, where the entire game is built around shaders.

D3 uses shaders for almost every pixel in the entire game.
Likewise FC except the difference is that FC does much more with them than simple multi-texturing. That has been proven given the crippling performance the NV3x displays when running the SM 2.0 path.

Doesn't add up to a heavy shader load.
The R3xx runs DEIW (and implicitly Thief 3) much faster than the NV3x does. That's yet another title to add to the growing list.

On any part with remotely decent shader performance that would be right. Unfortunately we don't have any of those yet.
Excuse me? Have you somehow missed the benchmark results where the NV3x is sometimes half the speed of the R3xx? This thread is filled with shader based titles that repeatedly demonstrate this fact so how long will you continue to deny the results?

And that is the big problem with all of the PR hype around shaders.
The benchmarks speak for themselves; nothing to do with hype.
 