Splinter Cell: Chaos Theory patch


hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: munky
Originally posted by: hans030390
Originally posted by: munky
Originally posted by: tss4
Originally posted by: munky
Does this mean it will have soft shadows using sm2, and all the sm3 fanboys will have one less argument for their cause?

Why are you anti-SM3? My understanding was that you could do just about everything in SM2 that you could in SM3, but SM3 was a little faster.

You're right, and I'm not against SM3. BUT, you forgot how for the past year certain people claimed SM3 as the major reason why the gf6 cards were so much better and more future-proof than the x800 series cards, while only a few games utilized SM3, this being one of them. Now that you can get the same eye candy in SM2, there's one less game that requires SM3 to look good. And when the final version of FEAR comes out, we'll see if any of the gf6 cards are fast enough to run the SM3-exclusive eye candy, for all their future-proofness.

You are an idiot. Obviously you haven't read anything about SM3, have you? SM3 offers loads of graphical improvements over SM2, but since you don't seem to know anything about it, I'll just leave you to go find out on your own. Or are you too much of an SM2 fan to look at it?

FEAR...um, excuse me, but the benchmarks for it were on very high settings, the game was in beta, and perhaps it is coded poorly.

If a 6 series card isn't good enough for next gen, then the x800 series sure as hell isn't; in fact, it'd be worse off.

Just thought I'd let you know: if the 6 series goes down, your x800 line does too. So why not get SM3 if there's even a chance we'll be able to play next-gen games with SM3?

And last time I checked, the UE3 developers claimed that a 6600gt would run the engine fine. Maybe not with eye candy, but I'm pretty sure they picked that card (they could have said an x800 will run it) because of its SM3 capabilities.

You're right, though, that there are not many SM3 games out. That is what next gen is for. So don't hold the lack of SM3 games against SM3. I'll say this: no games...I repeat...NO games out now that use SM3 are using any of its real eye candy features, save a few that are used with SM3 in SC:CT...oh, but there's much more it can do. Example? The UE3 engine? Almost any next-gen game? They used SM3 to get that graphics quality. I don't think it'd be as nice with SM2.

And isn't it funny that every new engine had to be built on 6800gt's in SLI? Just for the SM3 support.

It seems it will be used a lot soon. So I'm not disappointed with getting my 6600gt. Guaranteed to perform well on UE3 games (and I'd assume it would run other engines well too), and it has SM3...I'd say I'm more ready for next gen than x800 users are.

Calling me an idiot doesn't make you any less stupid, hans. My x800 is faster than your 6600gt, with or without SM3, so until a game actually requires SM3 to run at all, the x800 will beat the 6600gt almost always. Any pixel shaders are only used for added eye candy, and unless you have a card fast enough to run the game with the eye candy enabled (aka high settings), SM3 will not offer you anything. So, if you end up running FEAR on low settings, chances are you'll have all the pixel shaders disabled, and what good is SM3 gonna do for you then? Absolutely nothing.

If I do have to run it on low settings (which, according to the recommended requirements that were released, I won't), at least I'll be able to run it with SM3 anyway, still giving me a better performance boost than if I had to run it with SM2. It is nice to have a bonus. Also, in those FEAR benchies, I remember that running it in SM3 gave it a nice performance boost.

So perhaps I'll be able to run my card at the same settings as you because SM3 will give me a performance boost.

http://www.xbitlabs.com/articles/video/display/msi-6800gt_12.html

A 6800gt beating an x850!? Wow, SM3 must be giving a good boost! Too bad they don't offer the SM2 benchmarks for it, or different IQ settings. But the game does look very playable at 10x7 and even 12x7. Last time I checked, 30fps was quite good. Sorry if you prefer 100 billion fps.

Of course, adding AA and AF kills performance, but it's interesting how SM3 can still let a 6800gt (not OCed) keep up with or beat an x850.

This is what next gen will be like. My 6600gt will probably outperform your x800 with SM3, assuming I set the graphics quality to roughly the same. Hm. I'd say that's pretty good considering in SM2 games your card beats mine. I was really planning on my card being used for future games, even if it is on lowest settings.

This is a great article on SM3. If you read through it, good for you.

http://www.bit-tech.net/hardware/2005/07/25/guide_to_shaders/1.html

What does it say at the end? "there's no sense in buying an ati card at this point..." or something like that.

Clearly choosing a 6600gt for SM3 wasn't a stupid move. Sure, an x800 is a great card for SM2 games, and I am not saying that it will die with next gen. I'm not saying my card will be an uber card either.

Here's a nice list of things SM3 has that SM2 doesn't (and I know I'm missing some things):
1. Looping, with an executed instruction count of up to 65,535, compared to SM2's limit of around 512 at most. This just allows much more complex pixel shader work.
2. Displacement mapping (look it up, because I won't explain it to you; it's one highlight of SM3).
3. (Sorry, I don't know what these are, but I'll look them up after this.) Vertex texture fetch, vertex stream frequency division, and dynamic flow control.
4. Required fp32 shader precision (as opposed to SM2's minimum of fp24).
5. Looping and branching in shaders.
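
The "dynamic flow control" item in the list above is easiest to see with a toy example. This is a rough CPU-side sketch in Python, not real shader code; the function names and the 0.5 threshold are made up for illustration. The idea: SM2 hardware generally has to evaluate both sides of a per-pixel condition and select one result (predication), while SM3 can genuinely branch and skip the work.

```python
# Toy "software shader" contrasting SM2-style predication with
# SM3-style dynamic branching. All names/values here are illustrative.

calls = {"expensive": 0}

def expensive(p):
    """Stand-in for a costly shading path (e.g. many lighting samples)."""
    calls["expensive"] += 1
    return p * p

def cheap(p):
    """Stand-in for a trivial shading path."""
    return p

def shade_sm2_style(pixels):
    """No per-pixel dynamic branching: evaluate BOTH paths for every
    pixel, then select one result (predication)."""
    out = []
    for p in pixels:
        hi = expensive(p)   # always paid, even when the result is discarded
        lo = cheap(p)
        out.append(hi if p > 0.5 else lo)
    return out

def shade_sm3_style(pixels):
    """Real per-pixel branching: the expensive path only runs where needed."""
    return [expensive(p) if p > 0.5 else cheap(p) for p in pixels]

pixels = [0.1, 0.2, 0.9, 0.4, 0.8]

shade_sm2_style(pixels)
n_sm2 = calls["expensive"]          # expensive() ran for all 5 pixels

calls["expensive"] = 0
shade_sm3_style(pixels)
n_sm3 = calls["expensive"]          # expensive() ran only for the 2 bright pixels
```

Both versions produce the same image; the SM3-style version just does less work when most pixels take the cheap path, which is one reason dynamic flow control matters for complex shaders.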

Plus other stuff that I can't remember. But you see, all of that makes next-gen games look so good. Sure, maybe the UE3 engine ran at 15fps on 6800gts in SLI, but first off, that was a very early alpha version running there. And that was at the highest IQ. There will be plenty of optimization anyway. If you've seen the PS3 demos, I-8, Warhawk, and that boxing one, those were all running in real time on alpha PS3 kits. Sure, it's a console, but consoles are pretty much PCs now, just with slightly different hardware and no OS. The early kit had 6800gts in SLI and an underclocked Cell CPU. Anandtech had an article saying that the Cell and Xbox 360 CPUs weren't all-powerful like they claim. And last time I checked, I-8, Warhawk, and the boxing demo ran nicely for being in development. Plus, for PC games, you always have IQ options.

So it's not extremely important to me if I can run next gen with all the eye candy, but I'll be glad knowing I get a good performance boost just using SM3. Plus, say I run my 6600gt against an x800 in next gen. If I get the same performance when I use SM3 as an x800 does while using SM2, SM3 will likely look better.

Still, I could be completely wrong about all of this. I doubt that, though. Anyway, I'm done arguing for now because really none of us can make a true conclusion until next gen really hits us.

 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
You people have been moaning about SM3 for so long now and barely any games support it.

I don't see too many SM3 games out right now.

I would never use soft shadows; they don't allow me to use AF.
I would never use HDR; the 6 series takes a pretty big performance hit from it AND it doesn't allow AA.


So all I have to say to you "Nvidia's features pwn ATI" guys, enjoy no AA/AF, and performance drops.

Oh, and keep hoping for more SM3 games.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
So is this game any good? I had been avoiding it because of poor ATi performance. May have to buy it now.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: hans030390
[long quote snipped — the full post appears above]

Ok, then we'll just have to wait for FEAR or some demo to be released, and put my x800 against your 6600gt. The thing I hope you realize is that if you disable the eye candy and use lower settings, then the game won't be running SM anything. Not SM3, not SM2, not SM1. Just plain old T&L, no shaders. Well, actually, it might fall back to SM1.4, like BF2; that's why SM1.4 is the minimum requirement for that game. But the point is, you need to have high quality settings enabled in order to use SM3 or SM2. Pixel shaders are what's used most often for high quality visual effects.
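
The fallback behavior described above can be sketched as a tiny settings-selection routine. This is a hypothetical illustration, not how any particular game actually does it; the quality tiers and version numbers are assumptions based on the post (low settings dropping to an SM1.x path, as with BF2's SM1.4 minimum).

```python
# Hypothetical sketch: a renderer picks a shader path from both the
# card's capabilities and the chosen quality preset. At low settings
# the high-quality SM2/SM3 effects are off, so the game falls back to
# its simplest shipped path regardless of what the hardware supports.

def pick_shader_path(hw_max_sm, quality):
    """hw_max_sm: highest shader model the card exposes (e.g. 3.0, 2.0, 1.4).
    quality: 'low', 'medium', or 'high' (illustrative tiers)."""
    wanted = {"low": 1.4, "medium": 2.0, "high": 3.0}[quality]
    # Can't run a path the hardware doesn't support, and won't run a
    # fancier path than the preset asks for.
    return min(hw_max_sm, wanted)

# A 6600gt-class card (SM3) on low settings still ends up on the SM1.4 path:
path_low = pick_shader_path(3.0, "low")      # 1.4
# An x800-class card (SM2) maxes out at SM2 even on high settings:
path_x800 = pick_shader_path(2.0, "high")    # 2.0
```

The point Munky is making falls out of the `min()`: SM3 support only buys you anything once the quality preset actually requests SM2/SM3-level effects.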
 

dornick

Senior member
Jan 30, 2005
751
0
0
Originally posted by: TheSnowman
Originally posted by: Drayvn
Actually, the HDR looks exactly like the pics I've seen from the 6800 series.

The parallax mapping looks exactly like the pics I've seen from the 6800.

You are right about the parallax mapping, it just looked more impressive to me back when I first saw it on my Geforce; but it looks the same on the Radeon now. As for the HDR, you are partially right, I'll let some screenshots explain what I mean:

Geforce: No HDR | HDR | HDR + Tone

Radeon: No HDR | HDR | HDR + Tone

Note that HDR looks the same with or without the tone mapping option on the Radeon, unlike the Geforce where there is a very obvious difference between the two.

Call me blind, but I see no drastic differences in any of those screenshots. The only thing different to me is a bit more light reflecting off the rocks with HDR.

This is why I can't get worked up over SM3. I honestly don't see any difference in the images. Does anyone have a link to screenshots where there is a clear difference?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
The tone mapping adds some brightness from the HDR to the rest of the view when it is enabled on the Geforce, but you really have to see it in motion to appreciate it. One interesting thing I found is that when I use 3DAnalyze to force a Radeon device ID on my Geforce and get the SM2 path, tone mapping looks the same as with SM3.
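
For anyone wondering what tone mapping actually does to the HDR output being compared here, a minimal sketch: HDR lighting produces brightness values above 1.0 that a monitor can't display. Without tone mapping they just clamp to white; a tone-mapping operator compresses them smoothly instead. The Reinhard-style curve below is one common choice for illustration; which operator SC:CT actually uses is not stated anywhere in this thread.

```python
# Why tone mapping changes the look of HDR: values above 1.0 either
# clip to white or get compressed into the displayable [0, 1) range.

def clamp_ldr(x):
    """No tone mapping: anything over 1.0 blows out to pure white."""
    return min(x, 1.0)

def reinhard(x):
    """Simple Reinhard-style operator: x / (1 + x) maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

hdr_values = [0.2, 0.8, 1.5, 4.0, 10.0]   # made-up scene luminances

clamped = [clamp_ldr(v) for v in hdr_values]
# -> the three brightest samples all collapse to the same 1.0 white

mapped = [round(reinhard(v), 3) for v in hdr_values]
# -> the bright samples stay distinct shades, preserving highlight detail
```

That preserved highlight detail (plus the overall brightness shift) is the kind of difference the Geforce screenshots with "HDR + Tone" show versus plain "HDR".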
 

Powermoloch

Lifer
Jul 5, 2005
10,084
4
76
This thread is funny; there are no games out yet that really take advantage of SM3. And who would think to turn all the eye candy on and sacrifice FPS and playability? Ridiculous (except on high-end cards like the 7800GTX and whatnot).
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: BouZouki
[quote snipped — the same post appears in full above]

When did Soft Shadows disable AF?

SC:CT on SM2 still lets me use 16x AF....

 

ironique

Senior member
May 16, 2002
498
0
76
SM2.0 Support, finally! Now that they've decided to stop telling us "the way it's meant to be played", it's time to "get in the game"!
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Hey, can anyone with an x800xl or similar card tell me what kind of performance hit you get when enabling HDR using SM2? Also, anyone with a 6800gt: what kind of performance hit do you get? I'd like to see which method runs more efficiently, the one using FP buffers or the one using pixel shaders.
 

Randum

Platinum Member
Jan 28, 2004
2,473
0
76
A 6600 outperforming an x800 is ridiculous. I'm not saying the 9800 will crush a 6800, lol (everyone knows 9 is higher than 6, right?).

Anyhow, at the end of the day everyone prefers different gaming settings. I know personally I'm pretty anal about a nice balance between details and FPS; if FPS is too low...I get a new card! Simple as that. Nvidia or ATI, whichever. I have been on an ATI kick lately, but the 7800 looks pretty good!

So, Chaos Theory is a good game? Worth getting?