FS Splinter Cell Chaos Theory SM3 Article


kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Noob
Originally posted by: TheSnowman
Originally posted by: Noob
Too bad the game doesn't support SM 2.0b. We X800 users would have gotten its performance increases and IQ like SM 3.0 (HDR pending). It was probably a behind-the-scenes deal to promote the 6800s by not including support for SM 2.0b. Oh well, the game still looks good and is good. I'm on the 5th mission. Somewhat complicated storyline.

Actually there are Radeons that can do the HDR which is used in SC3

It depends on whether they wanted to implement it, though. That is what I meant. HL2 is gonna have HDR implemented soon.

D'oh, I can't believe I left out the "no" there. SM3 or not, the style of HDR used in SC3 simply won't work on any Radeon. HL2's HDR will be cool to see, but it is going to suffer from the limitations imposed by an integer framebuffer.
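
(For the curious, the capability at issue is blending into a floating point render target. Here is a minimal D3D9 sketch, assumed usage rather than the game's actual code, of how a title can check for the FP16 blending that SC3-style HDR leans on; NV40 exposes it, while the R3xx/R4xx Radeons don't:

    // Sketch: query whether the HAL device can blend into an FP16
    // (A16B16G16R16F) render target -- the capability SC3-style HDR needs.
    #include <d3d9.h>

    bool SupportsFp16Blending(IDirect3D9 *d3d, D3DFORMAT displayFormat)
    {
        HRESULT hr = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, displayFormat,
            D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
            D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);
        return SUCCEEDED(hr);   // fails on R3xx/R4xx, succeeds on NV40
    }

A game that gets a failure back here has to fall back to an integer framebuffer, with the range and precision limits that implies.)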
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
"Lowly SM1.1 and reduced IQ" seems to be stretching it, judging by the comparison pics FS put up. Note that the X800s appear to be about 10% faster than the 6800s, which is basically the difference b/w SM1.1 and SM3. Props to the 6800GT for tying the X800XL with what appears to be SM3's ever so slightly brighter lights and shinier reflections. The option for HDR is also nice, despite its performance hit.

You can't blame nV for paying devs to code for their cards, tho, and you can't blame devs for taking the money to add features that they otherwise wouldn't have deemed cost-effective. So, I agree that nV didn't cheat anyone out of anything.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Pete
"Lowly SM1.1 and reduced IQ" seems to be stretching it, judging by the comparison pics FS put up. Note that the X800s appear to be about 10% faster than the 6800s, which is basically the difference b/w SM1.1 and SM3. Props to the 6800GT for tying the X800XL with what appears to be SM3's ever so slightly brighter lights and shinier reflections. The option for HDR is also nice, despite its performance hit.

You can't blame nV for paying devs to code for their cards, tho, and you can't blame devs for taking the money to add features that they otherwise wouldn't have deemed cost-effective. So, I agree that nV didn't cheat anyone out of anything.

You're probably right about the "lowly SM1.1" but you know every other guy on this board said HL2 in DX8 was "totally inferior" to HL2 in DX9 back in the "Shady Days" wars when those "shinier pipes and water" were given as the main reason to buy an ATI card and never consider an FX5900.

Definitely couldn't resist flipping that around after getting it thrown at me about 1,239,745X. If it was a good enough argument for the 9800 vs the 5900, it must be good enough for the 6800 vs the X800, wouldn't you say?
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Rollo
nVidia didn't "cheat" anyone out of anything- the developer of the game opted to use the MS standard?

It's not their fault ATI hasn't put an SM3 card on the market yet, or that some people have opted not to buy SM3 cards.

In any case, I don't see how posting a link to a good article that examines the differences between SM1.1 and SM 3 in this popular new game is "patting nV on the back"?

You could as easily say I'm flaming nV; the benchmarks show the X800 cards winning, albeit at reduced IQ.

What Ubisoft did would be the equivalent of having Far Cry run in DX9 and DX7 modes only. Is it the company's fault that Nvidia did not follow DX9 standards? If the FX5900 and the 9800 Pro were the only cards out at the time, Nvidia owners would be SOL. Companies should try to make their games as compatible as possible; catering to the entire gaming community will result in more profits than catering to only one company that has less than a third of the market.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dguy6789
Companies should try to make their games as compatible as possible; catering to the entire gaming community will result in more profits than catering to only one company that has less than a third of the market.

Apparently Ubisoft disagrees with you. In any case, it can only be said they're catering to one company now; perhaps they're counting on the R520 line's SM3 support to get them ATI user sales.

In any case, like I said before, there have always been vendor specific implementations of features. No point bemoaning this fact.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Rollo
Originally posted by: Pete
"Lowly SM1.1 and reduced IQ" seems to be stretching it, judging by the comparison pics FS put up. Note that the X800s appear to be about 10% faster than the 6800s, which is basically the difference b/w SM1.1 and SM3. Props to the 6800GT for tying the X800XL with what appears to be SM3's ever so slightly brighter lights and shinier reflections. The option for HDR is also nice, despite its performance hit.

You can't blame nV for paying devs to code for their cards, tho, and you can't blame devs for taking the money to add features that they otherwise wouldn't have deemed cost-effective. So, I agree that nV didn't cheat anyone out of anything.

You're probably right about the "lowly SM1.1" but you know every other guy on this board said HL2 in DX8 was "totally inferior" to HL2 in DX9 back in the "Shady Days" wars when those "shinier pipes and water" were given as the main reason to buy an ATI card and never consider an FX5900.

Definitely couldn't resist flipping that around after getting it thrown at me about 1,239,745X. If it was a good enough argument for the 9800 vs the 5900, it must be good enough for the 6800 vs the X800, wouldn't you say?


We didn't know the FX would be set to DX8 by default on Shader Day. All we knew was that they told us the 9800 Pro would be much faster. And it is, when both are running DX9.

It's not even a parallel comparison that you are trying to use. The FX cards can be forced to run in DX9 mode in HL2 with a simple command line change to the shortcut. SM2.0 cannot be enabled at all, for any card, in Chaos Theory, which makes your comparison invalid.
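
(For reference, the shortcut change being described is Valve's -dxlevel launch option. Appended to the shortcut's target line, it forces a render path; for example:

    hl2.exe -dxlevel 90

forces the DX9 path on the FX cards.)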

 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
To me, it looks like there is nothing very surprising about these results.

1. The IQ difference is marginal, with the SM3 looking slightly brighter, but not significantly different at all (at least that is what I am seeing from the screenshots, but maybe the difference is more significant in actual gameplay).

2. SM3 provides a slight speed boost for the 6800 series

3. HDR effectively limits your resolution to 1024x768 with no AA

4. The X800 series is still faster than the 6800 series

So, I see this as a case where you really can't go wrong with either the X800XL or 6800GT. The 6800 gives you slightly lower performance with AA and AF, but also a slight IQ improvement with slightly better lighting. Also, you do have the option of using HDR if you want to run @ a lower resolution without AA.

The X800XL performs slightly better @ high resolutions with AA and AF, and is significantly less expensive (in its PCIe version), but doesn't have the same capabilities.

IMO, there is no clear winner, and IF the game used SM2, then NV would have fewer advantages to justify the higher price. I would say that this is pretty much a worst case scenario for ATI, and their product is not outclassed by much if at all.

Overall, it seems to be yet another indication that both cards are very good this time around.

-D'oh!
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: Regs
What I get from the interview with the Head Engineer who made the game is that they were planning to create an SM 2.0 path before its release. This article was dated Oct. 2004. Between then and now, they scratched that idea completely due to "technical difficulties", also stating they would have to rebuild the graphics engine to support both 2.0 and 3.0. CT is a game that was made to display the advantages of SM 3.0. It's perfect for it. CT is darker than Doom 3, and I don't see many people bitchin' about that, do you?

Now you make it out as though what has happened was all due to corporate favoritism. There was no deal made with Nvidia. SM3.0 is a standard, so the decision was an easy one made by Ubisoft. Did you really think Ubisoft was going to start CT from the ground up again, losing millions of dollars in development, when they knew ATi cards could play it perfectly well? I don't know what you're talking about Larry! It was a well calculated decision made by Ubisoft. You make it seem like CT can't run at all on ATi cards.

In other news, HL2 is getting an HDR patch for SM3.0 that is set to release any time now. Far Cry also has patch 1.3, which includes optimizations for SM 3.0. And hey, guess who made Far Cry? UBI f'ing soft!

This is all FUD. It seems like everybody loses their common sense when there is a hardware war raging on the frontlines of the great American marketing machine. If the hardware that they made could not influence the game developers or the market, then they obviously wouldn't have much of a market at all.

To build an empire you must have influence, management, and leadership. Let ATi and Nvidia compete with each other instead of crying "cheating" or "foul play". What they are doing is perfectly within the bounds of fair business practice. There is nothing "sinister" about it. When I owned a 9800 from ATI I was hearing it from Nvidia owners all too often, and it made my stomach cringe.


Nice.

I agree.
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: AnnoyedGrunt
To me, it looks like there is nothing very surprising about these results.

1. The IQ difference is marginal, with the SM3 looking slightly brighter, but not significantly different at all (at least that is what I am seeing from the screenshots, but maybe the difference is more significant in actual gameplay).

2. SM3 provides a slight speed boost for the 6800 series

3. HDR effectively limits your resolution to 1024x768 with no AA

4. The X800 series is still faster than the 6800 series

So, I see this as a case where you really can't go wrong with either the X800XL or 6800GT. The 6800 gives you slightly lower performance with AA and AF, but also a slight IQ improvement with slightly better lighting. Also, you do have the option of using HDR if you want to run @ a lower resolution without AA.

The X800XL performs slightly better @ high resolutions with AA and AF, and is significantly less expensive (in its PCIe version), but doesn't have the same capabilities.

IMO, there is no clear winner, and IF the game used SM2, then NV would have fewer advantages to justify the higher price. I would say that this is pretty much a worst case scenario for ATI, and their product is not outclassed by much if at all.

Overall, it seems to be yet another indication that both cards are very good this time around.

-D'oh!

False.

I'm playing right now, SM3 with all the bells and whistles, 1280x1024 with 8xAF.

AA isn't really needed since it's either too dark to see jaggies at that res, or I've got the night vision on, which blurs lines anyway.

And since I turned FRAPS off during play, it's very smooth.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: otispunkmeyer

False.

I'm playing right now, SM3 with all the bells and whistles, 1280x1024 with 8xAF.

AA isn't really needed since it's either too dark to see jaggies at that res, or I've got the night vision on, which blurs lines anyway.

And since I turned FRAPS off during play, it's very smooth.

Not another dark Nvidia-supported game. ;) Why not more games like Far Cry? Nice tropical islands make me feel good during a Canadian winter and spring.

 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: ronnn
Originally posted by: otispunkmeyer

False.

I'm playing right now, SM3 with all the bells and whistles, 1280x1024 with 8xAF.

AA isn't really needed since it's either too dark to see jaggies at that res, or I've got the night vision on, which blurs lines anyway.

And since I turned FRAPS off during play, it's very smooth.

Not another dark Nvidia-supported game. ;) Why not more games like Far Cry? Nice tropical islands make me feel good during a Canadian winter and spring.


I've always found the SC series to be curiously dark; in every level, even in lit buildings, everything's just unrealistically dark. But snapping necks, kidney jabs, and gut stabbings make up for it.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Regs
What I get from the interview with the Head Engineer who made the game is that they were planning to create an SM 2.0 path before its release. This article was dated Oct. 2004. Between then and now, they scratched that idea completely due to "technical difficulties", also stating they would have to rebuild the graphics engine to support both 2.0 and 3.0. CT is a game that was made to display the advantages of SM 3.0. It's perfect for it. CT is darker than Doom 3, and I don't see many people bitchin' about that, do you?

Now you make it out as though what has happened was all due to corporate favoritism. There was no deal made with Nvidia. SM3.0 is a standard, so the decision was an easy one made by Ubisoft. Did you really think Ubisoft was going to start CT from the ground up again, losing millions of dollars in development, when they knew ATi cards could play it perfectly well? I don't know what you're talking about Larry! It was a well calculated decision made by Ubisoft. You make it seem like CT can't run at all on ATi cards.

In other news, HL2 is getting an HDR patch for SM3.0 that is set to release any time now. Far Cry also has patch 1.3, which includes optimizations for SM 3.0. And hey, guess who made Far Cry? UBI f'ing soft!

This is all FUD. It seems like everybody loses their common sense when there is a hardware war raging on the frontlines of the great American marketing machine. If the hardware that they made could not influence the game developers or the market, then they obviously wouldn't have much of a market at all.

To build an empire you must have influence, management, and leadership. Let ATi and Nvidia compete with each other instead of crying "cheating" or "foul play". What they are doing is perfectly within the bounds of fair business practice. There is nothing "sinister" about it. When I owned a 9800 from ATI I was hearing it from Nvidia owners all too often, and it made my stomach cringe.

Oh, I apparently missed this post last night. That is just funny: "made to display the advantages of SM 3.0. It's perfect for it", eh? I suppose you liked Nvidia's PS3.0 Far Cry comparison shots as well, eh? That is just absurd; you can't do a good job of showing the advantages of something by deliberately crippling the competition. When that is done it is very much foul play. If you can't see that in this example, then can you at least see it in the case against Tonya Harding?
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
So you would say, Snowman, that if you change HL2's DLLs through a mod to detect a 6000 series GeForce as a 9800 Pro, and it fixes all IQ problems and increases performance, that it was completely accidental?

Also, coincidentally, ATI's logo is on every single CD and the box.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Acanthus
So you would say, Snowman, that if you change HL2's DLLs through a mod to detect a 6000 series GeForce as a 9800 Pro, and it fixes all IQ problems and increases performance, that it was completely accidental?

Also, coincidentally, ATI's logo is on every single CD and the box.


If correct, that would be wrong and would justify Nvidia games crippling Ati cards. :)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Yeah, if correct, that is. However, Acanthus, please show me this mod so I can check it out on my 6800GT; or were you just talking theoretically?

 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Just wanted to say, I don't know why you guys can't see this on SM1.1, but there are tons of banding effects in the lighting reflections.

It looks plain ugly; even when you're totally engrossed in the game, you'll just see this ugly banding effect.

Funny thing is, they didn't need to code SM1.1 extra for the PC; it is what's being used in the Xbox version. So when porting the game to PC, all they had to do was SM3. They could have spent more time on SM2 at least, instead of giving up in the blink of an eye at some technical difficulties.

Though maybe it's actually in the game, but they are sorting the problems out now, and are gonna release a patch with the fix and activate it for us?

Would anyone be able to find any info on SM2 in the game files?

What I also find freaky is that in the config files HDR, Soft Shadows and Parallax Mapping are all enabled even though I have an X850. Well, they are all set to 1, unless they take 0 as enabled instead?
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
This is the program that can force emulation of a Radeon 9800 or X800 in Half-Life. It causes the game to falsely identify graphics cards as different cards and run as if they were those cards.

It's supposed to make a large difference in HL2 for the FX and 6000 series if you choose Radeon 9800 or X800.
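
(In rough terms, tools like this sit between the game and Direct3D and lie about the adapter. An illustrative wrapper follows, not 3DAnalyze's actual source; the 9800 Pro device ID shown is an assumption, though 0x1002 really is ATI's PCI vendor ID:

    // Hypothetical sketch of device ID spoofing via a D3D9 wrapper.
    #include <d3d9.h>

    HRESULT SpoofedGetAdapterIdentifier(IDirect3D9 *real, UINT adapter,
                                        DWORD flags,
                                        D3DADAPTER_IDENTIFIER9 *id)
    {
        HRESULT hr = real->GetAdapterIdentifier(adapter, flags, id);
        if (SUCCEEDED(hr)) {
            id->VendorId = 0x1002;   // ATI's PCI vendor ID
            id->DeviceId = 0x4E48;   // Radeon 9800 Pro (assumed value)
        }
        return hr;                   // the game now "sees" a 9800 Pro
    }

Any code path the game selects by vendor/device ID then follows the spoofed card rather than the real one.)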
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Rollo
You're probably right about the "lowly SM1.1" but you know every other guy on this board said HL2 in DX8 was "totally inferior" to HL2 in DX9 back in the "Shady Days" wars when those "shinier pipes and water" were given as the main reason to buy an ATI card and never consider an FX5900.

Definitely couldn't resist flipping that around after getting it thrown at me about 1,239,745X. If it was a good enough argument for the 9800 vs the 5900, it must be good enough for the 6800 vs the X800, wouldn't you say?
Heh, I was probably one of those people, and was convinced of it right up till that guy posted a way to get DX9-quality water in DX8 mode with GeForce (4 and FX) cards with a minimal performance hit.

Now, now, Rollo, there's no need to be childish about it. ;)

If SM1 mode does indeed show banding, then that's something FS should have pointed out, and a strike against Ubisoft for not providing an SM2 path. Banding sucks, as the Voodoo's "22-bit" mode showed.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Yeah Pete, that is exactly my point. Sure, maybe they didn't have the time or whatever to code parallax mapping and soft shadows into a PS2.0 path; that is fair, a bit lazy I'd say, but fair all the same. However, they would barely have to even lift a finger to allow all DX9 hardware to run with floating point precision in the shaders, thereby eliminating that nasty banding for just about anyone who bought a video card in the past 2 years.
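
(To make the precision point concrete, here is a hypothetical HLSL fragment, nothing from the game's code: compiled for a ps_2_0 target, the math runs at float precision, at least 24 bits per component, while a ps_1_1 path has to approximate the same result in low-precision fixed point, which is exactly where gradient banding comes from:

    // Hypothetical lighting snippet -- smooth under ps_2_0 float math,
    // banded when approximated in ps_1_1 fixed point.
    float4 Light(float3 n : TEXCOORD0, float3 l : TEXCOORD1,
                 float3 h : TEXCOORD2) : COLOR
    {
        float diff = saturate(dot(normalize(n), normalize(l)));
        float spec = pow(saturate(dot(normalize(n), normalize(h))), 32.0f);
        return float4(diff.xxx + spec.xxx, 1.0f);
    }
    // fxc /T ps_2_0 light.hlsl  -> float precision throughout
    // (a ps_1_1 target can't express this directly; the 1.1 path must be
    // rebuilt from fixed-point texture and arithmetic stages)

The point being made above is that the ps_2_0 version costs the developer almost nothing once the math is already written.)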

Originally posted by: Acanthus
This is the program that can force emulation of a Radeon 9800 or X800 in Half-Life. It causes the game to falsely identify graphics cards as different cards and run as if they were those cards.

It's supposed to make a large difference in HL2 for the FX and 6000 series if you choose Radeon 9800 or X800.

Yeah, I know 3DAnalyze; but best I can tell, that "supposed to" is nothing but a silly rumor. I just ran the video stress test normally and then with the 9800 vendor and device IDs, and found only a .06 fps difference, favoring running it normally.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
They didn't include 2.0 or 2.0b because that TAKES EXTRA TIME TO CODE. SM3 made it easier to code some of these things (advanced pixel shading and the like), and so coding for anything else would have been work, which equals more dev time, which adds to the expense. That said, I am sure that NVIDIA paid some to make sure that it ran well on NVIDIA hardware; why do you think Ubisoft has the cool "NVIDIA" thing in their opening credits? I am sure that the 1.1 functionality was already written into the engine, Epic made it easy to use like that, but notice UT2k4 didn't use anything above 2.0, so that functionality had to be added by Ubisoft. Really this is a moot point, because if this game is still being played in a year or two, all cards being sold will have this functionality.

What I find really cool is the performance increase that SM3 brings to the table for NVIDIA.
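
(The usual explanation for that speedup is ps_3_0's real dynamic branching: pixels a light can't reach skip the lighting math entirely, instead of computing it or multi-passing it anyway. A hypothetical sketch, not Ubisoft's shader:

    // Hypothetical ps_3_0-style loop: out-of-range pixels take the
    // branch and skip the expensive lighting work.
    #define NUM_LIGHTS 4
    float3 lightPos[NUM_LIGHTS];
    float  lightRadiusSq[NUM_LIGHTS];
    float3 lightColor[NUM_LIGHTS];

    float4 Shade(float3 pos : TEXCOORD0, float3 n : TEXCOORD1) : COLOR
    {
        float3 c = 0;
        for (int i = 0; i < NUM_LIGHTS; ++i)
        {
            float3 toLight = lightPos[i] - pos;
            float  distSq  = dot(toLight, toLight);
            if (distSq < lightRadiusSq[i])   // dynamic branch (SM3 only)
            {
                float atten = 1.0f / (1.0f + distSq);
                c += atten * saturate(dot(normalize(n), normalize(toLight)))
                           * lightColor[i];
            }
        }
        return float4(c, 1.0f);
    }

Earlier shader models would have to unroll this and evaluate every light for every pixel, or render one pass per light.)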

Cool it everyone! :)
Nat
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: blckgrffn
They didn't include 2.0 or 2.0b because that TAKES EXTRA TIME TO CODE.


Do I need to put the post above yours in caps to have it make sense to you?
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: TheSnowman
Yeah Pete, that is exactly my point. Sure, maybe they didn't have the time or whatever to code parallax mapping and soft shadows into a PS2.0 path; that is fair, a bit lazy I'd say, but fair all the same. However, they would barely have to even lift a finger to allow all DX9 hardware to run with floating point precision in the shaders, thereby eliminating that nasty banding for just about anyone who bought a video card in the past 2 years.

Originally posted by: Acanthus
This is the program that can force emulation of a Radeon 9800 or X800 in Half-Life. It causes the game to falsely identify graphics cards as different cards and run as if they were those cards.

It's supposed to make a large difference in HL2 for the FX and 6000 series if you choose Radeon 9800 or X800.

Yeah, I know 3DAnalyze; but best I can tell, that "supposed to" is nothing but a silly rumor. I just ran the video stress test normally and then with the 9800 vendor and device IDs, and found only a .06 fps difference, favoring running it normally.


I get my 6800GT tomorrow, ill post results later.

I found out about this shortly after my 6600 died.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
What's the big deal if it only supports PS 1.1 or 3? If this game was originally developed for the Xbox, then it makes sense to add PS3 code to the engine, as it already had 1.1 from the Xbox version. Adding PS2 code would be nice, but if it costs too much, then I can understand why they didn't add it. And as far as I can tell, the IQ benefits of PS3 are only marginal, not something I would really care about, especially if the gameplay was that good. On the other hand, if the gameplay is crap, then no eye candy in the world is gonna make me buy the game.

Also, IMO, HDR is not worth giving up AA; there's just no excuse to buy a modern $300+ video card and still have to deal with jaggies. If I had a choice between HDR and AA, I'd take the AA without a second thought.
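
(The reason the choice is forced at all: that generation of hardware can't multisample a floating point surface, and a D3D9 app can confirm this up front. A minimal sketch, assumed usage rather than any game's actual code:

    // Sketch: ask whether 4x MSAA works on an FP16 render target.
    // On NV40-era cards this fails, so HDR and AA are mutually exclusive.
    #include <d3d9.h>

    bool SupportsMsaaOnFp16(IDirect3D9 *d3d)
    {
        DWORD quality = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
            FALSE /* fullscreen */, D3DMULTISAMPLE_4_SAMPLES, &quality);
        return SUCCEEDED(hr);
    }

When this check fails, the game has to gray out AA the moment HDR is switched on.)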
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: munky
Adding PS2 code would be nice, but if it costs too much, then I can understand why they didn't add it.

But since it doesn't "cost too much" by any means to simply allow the standard shaders to run with the floating point precision offered by all DX9 hardware, can you see why it is disappointing that they did not provide an SM2 path to do that?
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Can anybody answer this?
How can SM1.1 be slower than SM3.0? I mean, I would have understood it if the game was coded for SM2.0, but how can poor SM1.1 be slower in performance than SM3.0?