Splinter Cell: Chaos Theory patch

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
Ubisoft released a new patch for Splinter Cell: Chaos Theory; in this update they included support for Shader Model 2.0, which was previously unsupported. This allows ATI cards and older NVIDIA cards to have the full graphical glory the 6x00 series and 7800 GTX have had since their release.


Please don't flame this thread; I just thought everyone would like to know.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Nice to see Ubi finally came to their senses.

Splinter Cell Chaos Theory v1.04 Patch Contents

Bug Fixes

* Fixed divergence after restarting/quick loading for users using non-default control settings.

General Improvements

* ATI Shader Model 2.0 support has been implemented.

Highlights of patch 1.03 include various fixes and minor enhancements.

Bug Fixes

* Partially corrected skinning crashes when AlienGUIse is present.
* Fixed divergences caused by mouse interactions after quickloading.
* Fixed network NAT traversal issues through certain routers.
* Intel IG fixes (handling when software vertex processing is not supported; mouse de-lag disabled if event queries are not supported).
* Fixed graphical problems resulting in one of the players losing his HUD, being unable to use night vision, and getting messed-up thermal vision. This happened for users playing against another player under particular hardware configurations.
* Fixed a crash when the menu owner gets deleted before the disconnection is processed.
* Fixed a disconnection if the client was in the in-game chat while the host was restarting the game.
* The voice chat port is now properly punched through for connections through a router.
* [Versus] Corrected the Uzi problem with the ATI Radeon 9700 in Versus.
* [Versus] The host no longer sees where bombs are placed on the radar. He has to search for them, just like clients.
* [Versus] In the Aquarius map, a Mercenary grabbed by a Spy doesn't get stuck in the crunching box; he dies with the Spy.
* [Versus] The tutorial mode appears clearly for everyone in the lobby.
* [Versus] Corrected a camera bug that always displayed its detection information.

General Improvements

* Added a specific message when the Coop player is disconnected because the partner left.
* Connection to ubi.com through an HTTP proxy is now properly supported. If you are using an HTTP proxy, you must edit the SplinterCell3.ini file, usually located (depending on your system) in:
o C:\Documents and Settings\All Users\Application Data\Ubisoft\Tom Clancy's Splinter Cell Chaos Theory\SplinterCell3.ini
o Then, add the following lines to your SplinterCell3.ini under the [UBI.COM] section with your web proxy settings:
o [UBI.COM]
ProxyAddress=
ProxyPort=
o Note: By default, this folder is hidden by Windows. To browse it, you must do the following in Explorer:
o Select Tools->Folder options->View and check the "Show hidden files and folders" radio button.
* [Versus] A new message from the host if the clients leave the game via the "Quit game" button.

v1.02 Bug Fixes

* A *LOT* of disconnections have been corrected in Coop mode.
* Maximum username length was inconsistent with the one used on the ubi.com web site.
* Fixed session connection issues when connecting to an online game.
* Fixed being unable to join a game (and the user having to hard-quit the game) after joining a game that he was banned from.
* In-game chat should now be replicated properly.
* Maximum timeout has been slightly increased to help low-end machines connected to high-end ones.
* Corrected a crash when saving in a Coop game using the wrong game-saving functionality. Note: In order to save, you must open the Pause menu and save from there.
* Fixed some issues with events not happening the same way on the two computers, resulting in a disconnection.
* Fixed some issues with password-protected online games.

General Improvements

* All Seeing Eye support has been implemented.

Originally posted by: crazySOB297
This allows ATI cards to have the full graphical glory the 6x00 series and 7800 GTX have had since their release.

Why just ATi cards? There are plenty of capable NV card users who got shafted too, such as all the FX cards from previous years. All those users got shafted by Ubi's laziness as well. Now they too can play with better graphics.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Unless FX cards can't do it for some reason? Rumor is ATi did the code work and gave it to Ubi. If FX users can't, then they got shafted, again. The readme does say, "ATI Shader Model 2.0 support has been implemented."
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,673
583
126
Originally posted by: crazySOB297
Ubisoft released a new patch for Splinter Cell: Chaos Theory; in this update they included support for Shader Model 2.0, which was previously unsupported. This allows ATI cards and older NVIDIA cards to have the full graphical glory the 6x00 series and 7800 GTX have had since their release.


Please don't flame this thread; I just thought everyone would like to know.


If SM 2.0 were the same as 3.0, there wouldn't be a 3.0. Therefore it probably won't be the full* graphical glory, but pretty close anyway.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Does this mean it will have soft shadows using sm2, and all the sm3 fanboys will have one less argument for their cause?
 

imported_tss4

Golden Member
Jun 30, 2004
1,607
0
0
Originally posted by: munky
Does this mean it will have soft shadows using sm2, and all the sm3 fanboys will have one less argument for their cause?

Why are you anti-sm3? My understanding was you could do just about everything in sm2 that you could in sm3, but sm3 was a little faster.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: tss4
Originally posted by: munky
Does this mean it will have soft shadows using sm2, and all the sm3 fanboys will have one less argument for their cause?

Why are you anti-sm3? My understanding was you could do just about everything in sm2 that you could in sm3, but sm3 was a little faster.

You're right, and I'm not against sm3. BUT, you forgot how for the past year certain people claimed sm3 was the major reason why the gf6 cards were so much better and more future-proof than the x800 series cards, while only a few games utilized sm3, this being one of them. Now that you can get the same eye candy in sm2, there's one less game that requires sm3 to look good. And when the final version of FEAR comes out, we'll see if any of the gf6 cards are fast enough to run the sm3-exclusive eye candy, for all their future-proofness.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Speaking of FEAR, there is a demo coming out this Friday. GameSpot exclusive, I think, though.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: munky
Originally posted by: tss4
Originally posted by: munky
Does this mean it will have soft shadows using sm2, and all the sm3 fanboys will have one less argument for their cause?

Why are you anti-sm3? My understanding was you could do just about everything in sm2 that you could in sm3, but sm3 was a little faster.

You're right, and I'm not against sm3. BUT, you forgot how for the past year certain people claimed sm3 was the major reason why the gf6 cards were so much better and more future-proof than the x800 series cards, while only a few games utilized sm3, this being one of them. Now that you can get the same eye candy in sm2, there's one less game that requires sm3 to look good. And when the final version of FEAR comes out, we'll see if any of the gf6 cards are fast enough to run the sm3-exclusive eye candy, for all their future-proofness.

You are an idiot. Obviously you haven't read anything about SM3, have you? SM3 offers loads of graphical improvements over SM2, but since you don't seem to know anything about it, I'll just leave you to go find out on your own. Or are you too much of an SM2 fan to look at it?

FEAR... um, excuse me, but benchmarks for it were on very high settings, the game was in beta, and perhaps it is coded poorly.

If a 6 series card isn't good enough for next gen, then the x800 series sure as hell isn't; in fact, they'd be worse off.

Just thought I'd like you to know. If the 6 series goes down, your x800 line does too. So why not get SM3 if there's even a chance we'll be able to play next-gen games at SM3?

And last time I checked, the UE3 developers claimed that a 6600GT would run the engine fine. Maybe not with eye candy, but I'm pretty sure they picked that card (they could have said an x800 will run it) because of its SM3 capabilities.

You're right though, there are not many SM3 games out. That is what next gen is for. So don't hold the lack of SM3 games against SM3. I'll say this: no games... I repeat... NO games out now that use SM3 are using any of its real eye-candy features, save a few that are used with SM3 in SC:CT... oh, but there's much more it can do. Example? The UE3 engine? Almost any next-gen game? They used SM3 to get that graphics quality. I don't think it'd be as nice with SM2.

And isn't it funny that every new engine had to be built off of 6800GTs in SLI??? Just for the SM3 support.

It seems it will be used a lot soon. So I'm not disappointed with getting my 6600GT. It's guaranteed to perform well on UE3 games (and I'd assume it would run other engines well too) and has SM3... I'd say I'm more ready for next gen than x800 users are.

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: hans030390
Obviously you haven't read anything about SM3, have you? SM3 offers loads of graphical improvements over SM2...
I have read quite a bit about SM3 and I even have an SM3 card, but I have no idea where you come up with the "loads of graphical improvements over SM2" comment. Could you provide some examples?

 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
When they say ATi Shader Model 2.0, they probably mean SM2.0b.

Well, I'll leave all the guys here so that you can argue while I see what the graphics look like with SM2.0b turned on with my X850XT-PE!!!
 

aggressor

Platinum Member
Oct 10, 1999
2,079
0
76
Originally posted by: munky
Does this mean it will have soft shadows using sm2, and all the sm3 fanboys will have one less argument for their cause?

On my X800XL I have HDR Rendering/Tone Mapping, Parallax Mapping, and High Quality Soft Shadows as options.

Sucks that you can't use AA and the shader options, though :(
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
It's just the "HDR" that you can't use with AA, but it's not even HDR like on a Geforce 6600 and up. So I think it is just the game forceing it off by habbit, and hopefully there is a way around that. The sudo-HDR effect is well done but I wouldn't give up AA for it.

Also, I can't seem to find where the tone mapping option does anything in SM2 (I honestly doubt it does do anything), and I recall the parrallax mapping looks better on the SM3 path. Soft shadows on the other hand are spot on, and most importantly is upgrade to floating point shaders as those fixed point ones on the SM1.1 path could look really nasty at times.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: hans030390

It seems it will be used a lot soon. So I'm not disappointed with getting my 6600GT. It's guaranteed to perform well on UE3 games (and I'd assume it would run other engines well too) and has SM3... I'd say I'm more ready for next gen than x800 users are.

Where is the link that proves the 6600GT will perform well on U3, let alone games based on the engine? I have serious doubts the card will be able to run it well at all. It may not matter if it's as craptastic as U2. But then again... I don't consider frames in the teens "well".
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Ackmed
Originally posted by: hans030390

It seems it will be used a lot soon. So I'm not disappointed with getting my 6600GT. It's guaranteed to perform well on UE3 games (and I'd assume it would run other engines well too) and has SM3... I'd say I'm more ready for next gen than x800 users are.

Where is the link that proves the 6600GT will perform well on U3, let alone games based on the engine? I have serious doubts the card will be able to run it well at all. It may not matter if it's as craptastic as U2. But then again... I don't consider frames in the teens "well".

There isn't one. There was a comment from Epic saying that UE3 would be scalable enough to run on something like a 6600GT (they may have even said that was their target 'low-end' card for the engine; I'm not sure). There's been nothing said yet about the next Unreal game and its requirements; the only thing we've seen so far is a few clips of Unreal Engine 3.

However, the uber-shiny SM3-enabled demos they were running were getting frames in the teens -- on a 6800U SLI setup. You're not gonna have graphics like *that* on a 6600GT, SM3.0 support or not. There were some later screenshots released of development work on the next UT game -- and it looked around HL2 quality, but with parallax mapping (and some lighting glitches, though it was supposedly a very early build).
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: TheSnowman
It's just the "HDR" that you can't use with AA, but it's not even HDR like on a Geforce 6600 and up. So I think it is just the game forceing it off by habbit, and hopefully there is a way around that. The sudo-HDR effect is well done but I wouldn't give up AA for it.

Also, I can't seem to find where the tone mapping option does anything in SM2 (I honestly doubt it does do anything), and I recall the parrallax mapping looks better on the SM3 path. Soft shadows on the other hand are spot on, and most importantly is upgrade to floating point shaders as those fixed point ones on the SM1.1 path could look really nasty at times.


Actually the HDR looks exactly like the pics I've seen from the 6800 series.

The parallax mapping looks exactly like the pics I've seen from the 6800.

If you wanna see what parallax mapping does, in the first level go to the slope with the dripping water after you open the door where you hear Morgenholt die. Look at the wall on the left as you go down; if you creep up alongside it and look upwards towards the light, with the wall on the right-hand side of your screen, you can see what it does. Also look at the brickwork as you walk towards Morgenholt's body; it looks like it's coming out at you but isn't at the same time, which is what parallax mapping does.
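If you're wondering how that illusion works under the hood, the core trick is just nudging the texture lookup along your view direction by an amount read from a height map. Here's a rough Python sketch of that basic parallax offset; the function name, the scale value, and the sample numbers are my own assumptions for illustration, not anything taken from the game's actual shaders:

```python
# Minimal sketch of basic parallax mapping: shift the texture coordinates
# along the view direction by an amount taken from a height map, so flat
# brickwork appears to bulge toward the viewer. Illustrative values only.

def parallax_offset_uv(uv, view_dir_tangent, height, scale=0.04):
    """Return texture coordinates offset by the sampled surface height.

    uv               -- (u, v) texture coordinates of the pixel
    view_dir_tangent -- normalized view direction in tangent space (x, y, z)
    height           -- height-map sample at uv, in [0, 1]
    scale            -- strength of the depth illusion (assumed value)
    """
    vx, vy, vz = view_dir_tangent
    # Shallow viewing angles (small vz) produce a larger shift, which is why
    # the effect is most obvious when you look along the wall toward a light.
    offset_u = (vx / vz) * height * scale
    offset_v = (vy / vz) * height * scale
    return (uv[0] + offset_u, uv[1] + offset_v)

# Example: a grazing view across a tall bump shifts the lookup noticeably.
print(parallax_offset_uv((0.25, 0.50), (0.8, 0.1, 0.3), height=0.9))
```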

Here's an explanation of what tone mapping does.

Tone mapping is the process of converting the tonal values of an image from a high range to a lower one. For instance, an HDR image with a dynamic range of 100,000:1 will be converted to an 8-bit image (whose range is just 255:1).

So if I am correct, if you don't have tone mapping on when you have HDR, you won't actually see any difference from no HDR, since tone mapping is what translates the HDR values into something your monitor can actually reproduce.
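To make that concrete, here's a tiny Python sketch of a Reinhard-style tone mapping operator. It's only meant to show the idea of squeezing a huge dynamic range into 0-255; the operator choice, the exposure value, and the function names are my own assumptions, not the shader the game actually uses:

```python
# Minimal sketch of HDR -> LDR tone mapping (Reinhard-style curve).
# Illustrative only: SC:CT's real shader isn't public, so the operator,
# exposure value, and names here are assumptions.

def tonemap_reinhard(hdr_rgb, exposure=1.0):
    """Map linear HDR color values (which can exceed 1.0) into [0, 1]."""
    ldr = []
    for channel in hdr_rgb:
        c = channel * exposure   # scale scene radiance by exposure
        c = c / (1.0 + c)        # Reinhard curve: compresses bright highlights
        ldr.append(c)
    return ldr

def to_8bit(ldr_rgb):
    """Quantize the [0, 1] result to the 0-255 range a monitor can show."""
    return [round(max(0.0, min(1.0, c)) * 255) for c in ldr_rgb]

# A very bright HDR sample still lands inside the displayable 0-255 range:
print(to_8bit(tonemap_reinhard([40.0, 10.0, 2.5])))   # -> [249, 232, 182]
```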

Soft shadows are spot on, though.

Have a looky here to compare pics. I have to say mine all look like that!

http://www.beyond3d.com/misc/benchguide/index.php?p=sc3

Funny thing is, I haven't seen any performance drop either????
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: hans030390
Originally posted by: munky
Originally posted by: tss4
Originally posted by: munky
Does this mean it will have soft shadows using sm2, and all the sm3 fanboys will have one less argument for their cause?

Why are you anti-sm3? My understanding was you could do just about everything in sm2 that you could in sm3, but sm3 was a little faster.

You're right, and I'm not against sm3. BUT, you forgot how for the past year certain people claimed sm3 was the major reason why the gf6 cards were so much better and more future-proof than the x800 series cards, while only a few games utilized sm3, this being one of them. Now that you can get the same eye candy in sm2, there's one less game that requires sm3 to look good. And when the final version of FEAR comes out, we'll see if any of the gf6 cards are fast enough to run the sm3-exclusive eye candy, for all their future-proofness.

You are an idiot. Obviously you haven't read anything about SM3, have you? SM3 offers loads of graphical improvements over SM2, but since you don't seem to know anything about it, I'll just leave you to go find out on your own. Or are you too much of an SM2 fan to look at it?

FEAR... um, excuse me, but benchmarks for it were on very high settings, the game was in beta, and perhaps it is coded poorly.

If a 6 series card isn't good enough for next gen, then the x800 series sure as hell isn't; in fact, they'd be worse off.

Just thought I'd like you to know. If the 6 series goes down, your x800 line does too. So why not get SM3 if there's even a chance we'll be able to play next-gen games at SM3?

And last time I checked, the UE3 developers claimed that a 6600GT would run the engine fine. Maybe not with eye candy, but I'm pretty sure they picked that card (they could have said an x800 will run it) because of its SM3 capabilities.

You're right though, there are not many SM3 games out. That is what next gen is for. So don't hold the lack of SM3 games against SM3. I'll say this: no games... I repeat... NO games out now that use SM3 are using any of its real eye-candy features, save a few that are used with SM3 in SC:CT... oh, but there's much more it can do. Example? The UE3 engine? Almost any next-gen game? They used SM3 to get that graphics quality. I don't think it'd be as nice with SM2.

And isn't it funny that every new engine had to be built off of 6800GTs in SLI??? Just for the SM3 support.

It seems it will be used a lot soon. So I'm not disappointed with getting my 6600GT. It's guaranteed to perform well on UE3 games (and I'd assume it would run other engines well too) and has SM3... I'd say I'm more ready for next gen than x800 users are.

Calling me an idiot doesn't make you any less stupid, hans. My x800 is faster than your 6600gt, with or without sm3, so until a game actually requires sm3 to run at all, the x800 will beat the 6600gt almost always. Any pixel shaders are only used for added eye candy, and unless you have a card fast enough to run the game with the eye candy enabled (aka high settings), sm3 will not offer you anything. So, if you end up running FEAR on low settings, chances are you'll have all the pixel shaders disabled, and what good is sm3 gonna do for you then? Absolutely nothing.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Matthias99
Originally posted by: Ackmed
Originally posted by: hans030390

It seems it will be used a lot soon. So I'm not disappointed with getting my 6600GT. It's guaranteed to perform well on UE3 games (and I'd assume it would run other engines well too) and has SM3... I'd say I'm more ready for next gen than x800 users are.

Where is the link that proves the 6600GT will perform well on U3, let alone games based on the engine? I have serious doubts the card will be able to run it well at all. It may not matter if it's as craptastic as U2. But then again... I don't consider frames in the teens "well".

There isn't one. There was a comment from Epic saying that UE3 would be scalable enough to run on something like a 6600GT (they may have even said that was their target 'low-end' card for the engine; I'm not sure). There's been nothing said yet about the next Unreal game and its requirements; the only thing we've seen so far is a few clips of Unreal Engine 3.

However, the uber-shiny SM3-enabled demos they were running were getting frames in the teens -- on a 6800U SLI setup. You're not gonna have graphics like *that* on a 6600GT, SM3.0 support or not. There were some later screenshots released of development work on the next UT game -- and it looked around HL2 quality, but with parallax mapping (and some lighting glitches, though it was supposedly a very early build).


If there isn't one, then he shouldn't say there is proof of it. I really hope he doesn't think his 6600GT will play U3 "well". Or perhaps his "well" is a slide show.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Drayvn
Actually the HDR looks exactly like the pics I've seen from the 6800 series.

The parallax mapping looks exactly like the pics I've seen from the 6800.

You are right about the parallax mapping; it just looked more impressive to me back when I first saw it on my Geforce, but it looks the same on the Radeon now. As for the HDR, you are partially right. I'll let some screenshots explain what I mean:

Geforce: No HDR | HDR | HDR + Tone

Radeon: No HDR | HDR | HDR + Tone

Note that HDR looks the same with or without the tone mapping option on the Radeon, unlike the Geforce where there is a very obvious difference between the two.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Let me check it out quickly...

Hmm, this is weird.

For some reason my stuff looks like the 6800 pics. HDR + Tone mapping

But when I get screenies of them, they look like the Radeon pics you've got. HDR + Tone mapping

You're right about the tone mapping thing for ATi; on and off, it doesn't make a difference. But for me, the difference from no HDR to HDR on is like the difference the Geforce shows between HDR and HDR + Tone.

Maybe that says something about how ATi did it?