Nvidia smoke rendering vs ATI in DX9?

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
First of all, I use both Nvidia and ATI and like them both. I have several 9800GTs, 8800GTs,
HD 4830s and even an HD 3870 running on different systems. I don't feel the need to upgrade anything yet, because at the resolutions I use, the games I play run without lag. Good enough for me :D

My question is about Nvidia's smoke rendering in WinXP (DX9). It looks more like layers of translucent material stacked together, which makes a pretty good resemblance to smoke or fog, whereas ATI's smoke or fog rendering looks more three-dimensional and much closer to the real thing.

Is this a DX9 limitation for Nvidia that looks better in DX10? Just wondering, because I don't have Vista loaded on any of my systems, but if it helps out the Nvidia smoke rendering I guess I can give it a try.

I am sure other people have noticed this because it is, well, really noticeable!

I am not really sold on Vista/DX10 being needed for great image quality. I also game on a PS3 hooked up to a 42" LCD TV and it looks great, and that's not using DX9 or DX10 at all.

Maybe I am missing out on DX10? I don't know for sure since I'm not using it. What do you guys think?
 
Jan 24, 2009
125
0
0
Hm, I personally haven't used an NV card in a couple years, so I can't really say anything in regards to smoke.

The only, for lack of a better term, actual physical difference I've noticed between ATI and NV was in KotOR the Sith troopers would be gold on an ATI card, as opposed to silver on an NV card. I was devastated when I found out they were canonically silver. I thought the gold looked so nice.
 

nenforcer

Golden Member
Aug 26, 2008
1,777
20
81
I bought my Vista machine 2 years ago with a GeForce 8800GTS 640MB expecting to be ready in time for any DX10 games.

Just last fall I upgraded to two 9800GTs in SLI. I played COD4 (which is DX9) and then the Lost Planet DX10 demo.

I have purchased BioShock, which I believe makes only very minor use of DX10 graphics.

As far as I know there are still no big DX10-only games that aren't also DX9. The Windows XP install base is still way too large.

Company of Heroes is the only other DX10 game I can think of, maybe Supreme Commander? Also Stalker and Stalker: Clear Sky.

No AAA titles yet, however.

Call of Duty: Modern Warfare 2 may be the first DX10 AAA title when it comes out this fall.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
There are various ways of rendering smoke, one of the oldest being to 'stack' layers of translucent polygons.
This has nothing to do with the brand of hardware, and not all that much with the API used either. It's about the decision the developer makes.
Smoke is not an 'effect' that you just 'turn on' in the hardware. Sorry, but I still can't understand why people think about hardware in such a naive way. They're processors. Do you also think that, e.g., browsing the internet is a 'feature' of your CPU, which you 'turn on'? And maybe that the internet looks better on an AMD CPU than on an Intel CPU? No, it's just a piece of code, which looks the way the developer intended it. The processor just executes the instructions and doesn't really know what a browser is, or what a smoke effect is, or whatever.
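
To illustrate the stacked-layer idea, here's a minimal sketch (plain C++, nothing vendor-specific; the layer count, alpha and colours are made-up numbers purely for illustration) of how a pile of translucent quads composites toward an opaque puff of smoke:

```cpp
// Hypothetical sketch: how a stack of translucent "smoke" layers composites.
// This is plain C++ doing the standard "over" blend on one pixel, not real
// rendering code; the layer count and alpha value are assumptions.
#include <cstdio>

int main() {
    const int   layers     = 8;     // number of stacked translucent quads (assumed)
    const float layerAlpha = 0.15f; // per-layer opacity (assumed)
    const float smokeGrey  = 0.8f;  // smoke colour, greyscale (assumed)
    float background = 0.2f;        // scene colour behind the smoke (assumed)

    // Back-to-front "over" blending: dst = src*a + dst*(1-a), repeated per layer.
    float dst = background;
    for (int i = 0; i < layers; ++i)
        dst = smokeGrey * layerAlpha + dst * (1.0f - layerAlpha);

    printf("final pixel value: %.3f\n", dst);
    // More layers push the result toward the smoke colour, which is why a
    // stack of cheap translucent quads can pass for a volume of smoke.
    return 0;
}
```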

I'm not sure what you're basing your conclusions on, but if you are comparing two different games, it says nothing about the hardware.
If you're comparing the same game, running with the same settings, then regardless of the brand it should look the same (with the exception of the Radeon 9x00 series, which had an off-by-one bug in the alphablending unit), unless the developer specifically chose to implement different techniques.

The DX SDK actually comes with an example of various types of smoke. I've run it on various hardware, including nVidia, ATi and Intel, and never saw any differences.

In short: you're not making sense.
 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
I'm comparing the same games on DX9 (WinXP). It's noticeable in GRAW 2, COD4, ArmA, GRID and a few others. 9600GT, 9600GSO, 8800GT and 9800GT for Nvidia; HD 4670, HD 4830 and HD 3870 for ATI. All at 1280 x 1024 with 4x AA and 16x AF.

It doesn't make sense to me either; that's why I posted. The smoke rendering looks different between the Nvidia and ATI cards. Testing several different computers with several different video cards proves to me that the smoke rendering is different; it's not a random thing. I've tried several different driver versions for both, and the difference is still there.

I was hoping for a response from someone that had both Nvidia and ATI running side by side in DX9 and noticed the same thing.

"In short: you're not making sense." Thank you! I'm sure you could have worded that differently, I don't see that comment as being helpful in any way.:frown:
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: SirPauly
Is it possible to take some screenshots to point out the differences?

Agreed. We really need a screenshot to see what you are referring to.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Scali
There are various ways of rendering smoke, one of the oldest being to 'stack' layers of translucent polygons.
This has nothing to do with the brand of hardware, and not all that much with the API used either. It's about the decision the developer makes.
Smoke is not an 'effect' that you just 'turn on' in the hardware. Sorry, but I still can't understand why people think about hardware in such a naive way. They're processors. Do you also think that, e.g., browsing the internet is a 'feature' of your CPU, which you 'turn on'? And maybe that the internet looks better on an AMD CPU than on an Intel CPU? No, it's just a piece of code, which looks the way the developer intended it. The processor just executes the instructions and doesn't really know what a browser is, or what a smoke effect is, or whatever.

I'm not sure what you're basing your conclusions on, but if you are comparing two different games, it says nothing about the hardware.
If you're comparing the same game, running with the same settings, then regardless of the brand it should look the same (with the exception of the Radeon 9x00 series, which had an off-by-one bug in the alphablending unit), unless the developer specifically chose to implement different techniques.

The DX SDK actually comes with an example of various types of smoke. I've run it on various hardware, including nVidia, ATi and Intel, and never saw any differences.

In short: you're not making sense.


Yeah, but can't the drivers change things? I can't get to this link from work, but if this is the link I'm thinking of, it clearly shows that things are indeed rendered differently on Radeon vs. GeForce video cards.

PCGamesHardware link. If I remember right, there are screenshots where the textures are more 'flat' on the Nvidia cards and there is more foliage on the Radeon cards, among other differences.
 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
Originally posted by: SlowSpyder
Yeah, but can't the drivers change things? I can't get to this link from work, but if this is the link I'm thinking of, it clearly shows that things are indeed rendered differently on Radeon vs. GeForce video cards.

PCGamesHardware link. If I remember right, there are screenshots where the textures are more 'flat' on the Nvidia cards and there is more foliage on the Radeon cards, among other differences.

While not showing any active smoke, you can still see differences in background haze, clouds, foliage, lighting and shadows in your link. It's part of the exact same thing that I see.

But when it's active smoke that you look at or move through, the differences are more than minor or subtle. It can actually affect gameplay, to the extent of being able to see or target an enemy, or view the background through the smoke or fog.

 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: SlowSpyder
Yeah, but can't the drivers change things? I can't get to this link from work, but if this is the link I'm thinking of, it clearly shows that things are indeed rendered differently on Radeon vs. GeForce video cards.

Yes, but generally they won't go as far as completely changing the rendering method.
What he describes in his opening post is that nVidia uses the ancient method of textured quads, where AMD uses a more correct volumetric approach. I've never seen such differences, and I doubt either nVidia or AMD would go that far in altering a game (the amount of effort doesn't justify the gains in performance, if any).

Even so, it says nothing about the hardware and their capabilities. It would just be an isolated case. Again, check the smoke example in the DX SDK. Both nVidia and AMD cards can handle all methods without problems.
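
For contrast with the stacked-quad sketch earlier, here's a minimal sketch of what the 'volumetric' approach boils down to: marching a ray through a density field and accumulating optical depth. Plain C++ again; the toy density function, step count and extinction value are assumptions for illustration, not any game's actual implementation:

```cpp
// Hypothetical sketch of volumetric smoke: ray-march a density field and
// accumulate optical depth, instead of stacking textured quads.
#include <cmath>
#include <cstdio>

// Toy density field: a soft spherical puff of smoke centred at the origin (assumed).
float density(float x, float y, float z) {
    float r2 = x * x + y * y + z * z;
    return std::exp(-4.0f * r2); // falls off smoothly with distance
}

int main() {
    const int   steps      = 64;    // samples along the ray (assumed)
    const float stepSize   = 0.05f; // world-space distance per step (assumed)
    const float extinction = 3.0f;  // how strongly the smoke absorbs light (assumed)

    // March a ray along +z through the puff, accumulating optical depth.
    float opticalDepth = 0.0f;
    for (int i = 0; i < steps; ++i) {
        float z = -1.6f + i * stepSize; // current ray position
        opticalDepth += density(0.0f, 0.0f, z) * extinction * stepSize;
    }

    // Beer-Lambert law: transmittance tells us how much background shows through.
    float transmittance = std::exp(-opticalDepth);
    printf("smoke opacity along this ray: %.3f\n", 1.0f - transmittance);
    return 0;
}
```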
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Yeah, there is definitely some difference between the way the two brands render 3D. Even with the same games and the same scenes, I see subtle differences in rendering all the time. And believe it or not, 2D desktops (not to mention hardware-assisted video playback) on AMD vs. NV look different as well. I first thought it was my monitor, but the experience was similar on a different panel as well. I couldn't understand why and still can't, especially when it comes to the 2D desktop. (I mean, it's supposedly digital..) There are probably ways to correct(?) it and make the two desktops look identical, but I haven't figured them out yet.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: SlowSpyder
Originally posted by: Scali
There are various ways of rendering smoke, one of the oldest being to 'stack' layers of translucent polygons.
This has nothing to do with the brand of hardware, and not all that much with the API used either. It's about the decision the developer makes.
Smoke is not an 'effect' that you just 'turn on' in the hardware. Sorry, but I still can't understand why people think about hardware in such a naive way. They're processors. Do you also think that, e.g., browsing the internet is a 'feature' of your CPU, which you 'turn on'? And maybe that the internet looks better on an AMD CPU than on an Intel CPU? No, it's just a piece of code, which looks the way the developer intended it. The processor just executes the instructions and doesn't really know what a browser is, or what a smoke effect is, or whatever.

I'm not sure what you're basing your conclusions on, but if you are comparing two different games, it says nothing about the hardware.
If you're comparing the same game, running with the same settings, then regardless of the brand it should look the same (with the exception of the Radeon 9x00 series, which had an off-by-one bug in the alphablending unit), unless the developer specifically chose to implement different techniques.

The DX SDK actually comes with an example of various types of smoke. I've run it on various hardware, including nVidia, ATi and Intel, and never saw any differences.

In short: you're not making sense.


Yeah, but can't the drivers change things? I can't get to this link from work, but if this is the link I'm thinking of, it clearly shows that things are indeed rendered differently on Radeon vs. GeForce video cards.

PCGamesHardware link. If I remember right, there are screenshots where the textures are more 'flat' on the Nvidia cards and there is more foliage on the Radeon cards, among other differences.

I agree with that site's findings on filtering in that link, by the way, which is why I have been an advocate of nVidia's finer filtering this generation.

 

wrangler

Senior member
Nov 13, 1999
539
0
71
Originally posted by: SirPauly


I agree with that site's findings on filtering in that link, by the way, which is why I have been an advocate of nVidia's finer filtering this generation.

Yep. I like my 4890, but it does do the flickering thing they're talking about. It can be irritating at times. I've always wanted to put into words what I was seeing, because it can be fleeting and the kind of thing that just barely catches your eye, but that guy hit it exactly.

Nothing that would keep me from considering the 5870 when it comes out, mind you... but I WILL wait to see what nVidia comes out with for this next round.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: lopri
Yeah, there is definitely some difference between the way the two brands render 3D. Even with the same games and the same scenes, I see subtle differences in rendering all the time. And believe it or not, 2D desktops (not to mention hardware-assisted video playback) on AMD vs. NV look different as well. I first thought it was my monitor, but the experience was similar on a different panel as well. I couldn't understand why and still can't, especially when it comes to the 2D desktop. (I mean, it's supposedly digital..) There are probably ways to correct(?) it and make the two desktops look identical, but I haven't figured them out yet.

Yes, but those things can easily be explained.
nVidia and AMD have different approaches to texture-filtering. nVidia has a more 'correct' way of anisotropic filtering than AMD does, which leads to 'sharper', more 'noisy' textures on AMD cards.

Video decoding acceleration is done by a custom driver, where each vendor can apply its own optimizations, cheats and quality improving post-processing filters. So it's even less of an apples-to-apples comparison than games (with DX9/10 and OpenGL there are at least SOME rules for correct rendering, with video encoding/decoding there are no rules whatsoever).

As for the general 2D look, that probably has to do with different gamma ramps applied by the hardware. You can probably tweak the control panel sliders for gamma, contrast, brightness etc. to get both to look the same, more or less, but the defaults will most likely be different.
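
As a rough sketch of that last point, a gamma ramp is just a 256-entry lookup table the display hardware applies to every output value. The two 'vendor default' gamma values below are made up purely for illustration, not anything a real driver ships:

```cpp
// Hypothetical sketch of why two cards' "identical" 2D desktops can look
// different: each driver loads its own gamma ramp (a 256-entry lookup table
// applied to every pixel on output). The gamma values are assumptions.
#include <cmath>
#include <cstdio>

// Build a simple power-curve gamma ramp, the way control-panel sliders do.
void buildRamp(unsigned char ramp[256], float gamma) {
    for (int i = 0; i < 256; ++i) {
        float v = std::pow(i / 255.0f, 1.0f / gamma);
        ramp[i] = static_cast<unsigned char>(v * 255.0f + 0.5f);
    }
}

int main() {
    unsigned char rampA[256], rampB[256];
    buildRamp(rampA, 1.0f);  // "vendor A" default: straight through (assumed)
    buildRamp(rampB, 1.1f);  // "vendor B" default: slightly brighter (assumed)

    // The same mid-grey desktop pixel comes out of the two ramps differently,
    // even though the digital framebuffer contents are identical.
    int pixel = 128;
    printf("pixel %d -> A: %d, B: %d\n", pixel, rampA[pixel], rampB[pixel]);
    return 0;
}
```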

But all these things aren't necessarily brand-related. Different generations of hardware or even different driver versions can also introduce changes here.