FS Splinter Cell Chaos Theory SM3 Article

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Too bad the game doesn't support SM 2.0b. We X800 users would have gotten its performance increases and the same IQ as SM 3.0 (HDR aside). Leaving out SM 2.0b was probably a behind-the-scenes deal to promote the 6800s. Oh well, the game still looks good and plays well. I'm on the 5th mission. Somewhat complicated storyline.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Damn Rollo, why do you feel compelled to pat Nvidia on the back for one-upping their competition by worming developers into cheating end users out of features to make themselves look better, instead of simply playing to their own strengths? At least FS didn't let it slide without mention:

Unfortunately, Ubisoft has decided not to provide a 2.0 or 2.0b shader mode for RADEON 9500 (or greater) and X800 users.
---
In any case, it's very disappointing to see a developer provide so many features for one particular brand and not another. Eye candy features such as parallax mapping could have been thrown in for ATI users if a 2.0 mode would have been provided by Ubisoft.

Clearly shader model 3.0 is the way of the future, we don't dispute that. But with the large amount of 2.0-capable hardware out there, it seems strange to skip this mode entirely. Fortunately the 1.1 mode still looks quite good, as we've shown you in our screenshots, but it's never a good thing for gamers when one group of users receives preferential treatment ahead of others. CryTek set a beautiful example of how it should be done, providing additional features for both ATI and NVIDIA card owners in follow-up patches to Far Cry. Hopefully Ubi will get on the boat and do the same.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
nVidia didn't "cheat" anyone out of anything- the developer of the game opted to use the MS standard?

It's not their fault ATI hasn't put an SM3 card on the market yet, or that some people have opted not to buy SM3 cards.

In any case, I don't see how posting a link to a good article that examines the differences between SM1.1 and SM3 in this popular new game is "patting nV on the back".

You could just as easily say I'm flaming nV; the benchmarks show the X800 cards winning, albeit at reduced IQ.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Noob
Too bad the game doesn't support SM 2.0b. We X800 users would have gotten its performance increases and the same IQ as SM 3.0 (HDR aside). Leaving out SM 2.0b was probably a behind-the-scenes deal to promote the 6800s. Oh well, the game still looks good and plays well. I'm on the 5th mission. Somewhat complicated storyline.

Actually, there are *no* Radeons that can do the HDR used in SC3; like Far Cry's HDR, it requires a floating-point framebuffer. That is also the reason HDR doesn't work with AA in either of those games: the GeForce 6 cards' floating-point framebuffer abilities do not support AA. You are right that SM3 isn't necessary for it, though.
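For anyone curious, here is a minimal sketch of how a D3D9 engine asks the card those two questions before turning FP16 HDR on. The function name and structure are hypothetical, not SC3's actual code, but the two API calls are the real ones involved:

#include <d3d9.h>

bool CanDoFp16Hdr(IDirect3D9* d3d, bool* aaPossible)
{
    // Can the adapter render to an FP16 surface with post-shader blending?
    // The R3xx/R4xx Radeons fail this check; the GeForce 6 passes it.
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);
    if (FAILED(hr))
        return false;

    // Can that FP16 surface also be multisampled? On the GeForce 6 it
    // can't, which is why HDR and AA are mutually exclusive in FC and SC3.
    hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE, D3DMULTISAMPLE_4_SAMPLES, NULL);
    *aaPossible = SUCCEEDED(hr);
    return true;
}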

Oh and I'm on the 5th mission too. :)
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
ATi will be doing HDR in the near future with HL2 (sorry, can't find the article). Wow, let's all pat Nvidia on the back because they can do SM3 in this game and ATi can't...
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Rollo
nVidia didn't "cheat" anyone out of anything- the developer of the game opted to use the MS standard?

Nvidia uses their "developer relations" to weasel game makers into unaturally showing Nvidia cards in a favorable light or hideing their faults all the time. From EA games spurt not alowing Radeons to run 1600x1200 to Edios pulling the benchmarking ablities of Tomb Raider with a patch becuase the FX cards sucked with PS2.0. Considering UBI had previously mentioned SM2 support for Splinter Cell 3, it only goes to figure that Nvidia convinced them to leave it out. And MS "standard" applies to lower shader models as well.

Originally posted by: Rollo
It's not their fault ATI hasn't put an SM3 card on the market yet, or that some people have opted not to buy SM3 cards.

And it wasn't ATI's fault that Nvidia passed up PS1.4 all the way through the GeForce 4 cards, but you didn't see ATI buying off developers to make their games straight PS1.4 or fixed-function.

Originally posted by: Rollo
In any case, I don't see how posting a link to a good article that examines the differences between SM1.1 and SM3 in this popular new game is "patting nV on the back".

I'm sure you don't.

Originally posted by: Rollo
You could just as easily say I'm flaming nV; the benchmarks show the X800 cards winning, albeit at reduced IQ.

I didn't notice any terms like "lowly" directed at Nvidia.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Sylvanas
ATi will be doing HDR in the near future with HL2...

Yeah, they will, and Nvidia as well. But the HDR technique that will be applied to HL2 is more limited in functionality than what is used in SC3 and FC.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Snowman:

It's not Ubi's responsibility to retro-code for ATI's way-behind-the-times hardware. It's up to ATI to keep up with the current standard. SM3 has been the standard for about a year now.

People who chose X800 cards for the last year knew there would be games with nV only features they would not be able to use, and made that choice.

I honestly don't see what you're upset about here, this is like the owner of a 2WD truck complaining it's not 4WD.

All you have to do is look at the NV40 testimonial page on nVidia's website; just about every major developer out there is on it, stating how the NV40 gave them the features they needed for their current work. I think it's a little unrealistic to expect them not to use HDR, soft shadows, etc.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Rollo
Snowman:

It's not Ubi's responsibility to retro-code for ATI's way-behind-the-times hardware. It's up to ATI to keep up with the current standard. SM3 has been the standard for about a year now.

And 1.1 isn't "retro coding"? Once again, you don't make any sense. There are other standards; if they can code for 1.1, they can surely code for 2.0/2.0b. There are FAR more cards on the market (NV and ATi owners) that support 2.0/2.0b than 3.0. It only makes sense to support them.
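For what it's worth, once the shaders are written in HLSL, adding a 2.0/2.0b path can be as simple as recompiling the same source against a different target profile, assuming the shaders fit within SM2's instruction limits. A hypothetical sketch (not Ubisoft's code; the entry-point name is made up, but D3DXCompileShader and the profile strings are the real D3DX pieces):

#include <d3dx9.h>

// Compile one HLSL pixel shader entry point against a given profile:
// "ps_1_1", "ps_2_0", "ps_2_b" or "ps_3_0". Returns NULL on failure so
// the engine can fall back to a lower path.
LPD3DXBUFFER CompilePixelShader(const char* src, UINT len, const char* profile)
{
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;
    HRESULT hr = D3DXCompileShader(src, len, NULL, NULL, "PixelMain",
                                   profile, 0, &code, &errors, NULL);
    if (errors) errors->Release();
    return SUCCEEDED(hr) ? code : NULL;
}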

ATi's cards are not the only ones which don't do 3.0; all of NV's cards before the 6-series lack it as well. What does that say? To me it says that they don't care about their customers.

Sadly, as Snowman has pointed out, this isn't the first time it's happened. The Tomb Raider patch was really funny, and a pathetic attempt by NV to hide the poor performance of its hardware.

What's going to be funny to me is that, in a few months, you're going to have to come up with another argument besides saying ATi uses three-year-old tech.

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Rollo
I think it's a little unrealistic to expect them not to use HDR, soft shadows, etc..

But who is expecting that? I know I like seeing all the new features of my 6800gt put to good use.
 
Jun 14, 2003
10,442
0
0
Originally posted by: TheSnowman
Originally posted by: Rollo
nVidia didn't "cheat" anyone out of anything- the developer of the game opted to use the MS standard?

Nvidia uses their "developer relations" to weasel game makers into unaturally showing Nvidia cards in a favorable light or hideing their faults all the time. From EA games spurt not alowing Radeons to run 1600x1200 to Edios pulling the benchmarking ablities of Tomb Raider with a patch becuase the FX cards sucked with PS2.0. Considering UBI had previously mentioned SM2 support for Splinter Cell 3, it only goes to figure that Nvidia convinced them to leave it out. And MS "standard" applies to lower shader models as well.


And I like the way that, even though Nvidia has got their mitts on developers' decisions, ATi still does better in some games carrying the fabled TWIMTBP moniker.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
What is there to like? In the case of Splinter Cell 3, everyone with a DX9 Radeon or a GeForce FX could have had all the extra image quality that the GeForce 6 gets aside from HDR; but most likely the new Radeons would top the GeForce 6s in SM2-vs-SM3 benchmarks of Splinter Cell, like they do in Far Cry, so instead we wind up with no one getting to use SM2. I can't see anything to like about that, regardless of the fact that I own an NV40.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Rollo has lots of money, it seems.
He laughs at anyone not able to buy the latest and greatest hardware.
For us "lowly" people with perfectly able SM2 hardware (I'm talking Radeon 9800 type cards here, and, if they were any good with SM2, 5x00 from nVidia), we shouldn't be allowed to use out cards at improved IQ in Splinter Cell. Apparently. Because they are ancient, and like Rollo, we should spend every penny we can buying new cards.

Now, in an earlier thread, when someone said SLI should support Windows 2000, Rollo made a similar complaint; back then he had no problem using the Steam survey to prove his point that XP was the main OS of choice.
In that same survey, a perfectly capable SM2 card, the Radeon 9800, comes out as the most popular graphics card.
So why didn't Ubi put in SM2 support for Splinter Cell?
So yes, many users WERE cheated out of something, if you look at the number of SM2-capable cards compared to the number of SM3 cards out there.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
ATi's cards are not the only ones which don't do 3.0; all of NV's cards before the 6-series lack it as well. What does that say? To me it says that they don't care about their customers.

Actually, "lowly" PS 1.1 plays well for the FX as well. Nvidia couldn't ask for better shader support in game titles than 1.1 and 3.0, respectively, for their product line. I'm quite sure Nvidia would like to see SM2.0x die.
 
Jun 14, 2003
10,442
0
0
Originally posted by: rbV5
ATi's cards are not the only ones which don't do 3.0; all of NV's cards before the 6-series lack it as well. What does that say? To me it says that they don't care about their customers.

Actually, "lowly" PS 1.1 plays well for the FX as well. Nvidia couldn't ask for better shader support in game titles than 1.1 and 3.0, respectively, for their product line. I'm quite sure Nvidia would like to see SM2.0x die.


Probably. If they did have a 2.0 path, I can imagine a lot of uneducated FX users trying to run it and being horribly disappointed... which would then cue mass emails to Ubi and NV asking WTF is going on, etc.

It seems simpler, for NV at least, to have just 1.1 and 3.0.

Their latest cards can just about handle 3.0 decently fast, and that leaves the FX series to enjoy 1.1 with good performance too, totally forgetting about SM2.0 because they know it would be a headache for them.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
The GPU/3D game industry is getting more and more like politics as time goes by...

 
Jun 14, 2003
10,442
0
0
Has anyone else with the game been able to replicate the results FS got?

Now, I know I'm using the 76.41s, but at 12x10 with 2xAA/4xAF on the SM3 path, with HDR, soft shadows and parallax mapping turned OFF, I manage a 38fps average. My GT is clocked to Ultra speeds, so that result is 3fps shy of a real Ultra, though in their tests they used 4xAA and 8xAF.

I may have to switch back to the 71.84s.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Sigh.

Complaining about this is like complaining that HL2 runs better on ATI cards because of Valve's relationship with ATI.

There always have been, and probably always will be, differences in games and the hardware that runs them.

3dfx had Glide, and for a long while if you didn't have a 3dfx card, you didn't play the games nearly as well.

S3 had MeTaL, which offered additional functionality.

ATI had TruForm.

The fact of the matter is that there will likely always be vendor-specific features, because companies strive for differentiation. It's up to you as a consumer to choose which company supports the features you feel will be important.

I have been telling people for a year that the nV40 cards are the way to go for this generation, because otherwise there would be trade-offs in image quality, or in just the ability to see what some new features look like. If you didn't listen, bought an ATI card, and posted in reply, "No way Rollo! NO vendor would ever alienate all the ATI owners!", just try to fire up some HDR in Far Cry or Splinter Cell, SM3 in Splinter Cell, or soft shadows in Riddick or Splinter Cell, and think about your answer again.

I tried to tell you, and still try to tell you, that the X800 series is a bad buy in 2004/2005, because all you get for your choice is marginally faster framerates in some games.

Ackmed:
What's going to be funny to me is that, in a few months, you're going to have to come up with another argument besides saying ATi uses three-year-old tech.
I truly hope the R520 is available in 3 months, so why the heck would I argue against it?
I am all for the advancement of the industry, and for choice, and don't really care who produces it. Optimally, both companies will offer similar functionality, as competition benefits us all as consumers.
I don't think there is anyone here who can say they've used/supported both companies more equitably than me. If you need me to reel off how I've bought and used ATI's flagship card in every generation for the last decade, I'd be happy to.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Lonyo:
Rollo has lots of money, it seems.
But I could always use more; feel free to PayPal it to the address in my profile.

I'm a middle class guy, by no means "wealthy".
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: rbV5
ATi's cards are not the only ones which don't do 3.0; all of NV's cards before the 6-series lack it as well. What does that say? To me it says that they don't care about their customers.

Actually, "lowly" PS 1.1 plays well for the FX as well. Nvidia couldn't ask for better shader support in game titles than 1.1 and 3.0, respectively, for their product line. I'm quite sure Nvidia would like to see SM2.0x die.


I agree with this; it's something I'd thought about as a reason why they didn't add it. I didn't want to say it, because I knew a few would go off on a rant about me saying it. The FX cards suck at PS2.0, but they still should have added it.

Saying an X800-series card is a bad buy for 2004/2005 is just silly. Buying an X800XL for under $300 when it's virtually the same speed as a 6800GT is not a bad buy, considering how much cheaper it is. How many titles has SM3.0 really affected, and what percentage of overall games is that? How many were done on purpose?

Please don't try to act as if you don't favor NV, Rollo; it's laughable.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,045
32,547
146
I have read members here state that SM3 is easier to code for; could that have played a major part in the decision?
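The usual argument is that ps_3_0 supports dynamic flow control, so one "uber" shader can loop over a light count set at runtime, whereas ps_2_0 must unroll its loops at compile time, forcing a separate precompiled permutation per light count. A hypothetical sketch of the difference, assuming D3DX for compilation (the shader itself is made up):

#include <d3dx9.h>

// With NUM_LIGHTS defined, the loop bound is a compile-time constant the
// compiler can unroll for ps_2_0; without it, the bound comes from an int
// uniform, which only ps_3_0 can loop on at runtime.
static const char kShader[] =
    "float4 g_LightColor[8];\n"
    "#ifdef NUM_LIGHTS\n"
    "static const int g_NumLights = NUM_LIGHTS;\n"
    "#else\n"
    "int g_NumLights;\n"
    "#endif\n"
    "float4 PixelMain() : COLOR {\n"
    "    float4 c = 0;\n"
    "    for (int i = 0; i < g_NumLights; ++i)\n"
    "        c += g_LightColor[i];\n"
    "    return c;\n"
    "}\n";

void CompileBothPaths()
{
    LPD3DXBUFFER code = NULL;

    // SM3 path: one compile; the light count changes at runtime via a constant.
    D3DXCompileShader(kShader, sizeof(kShader) - 1, NULL, NULL,
                      "PixelMain", "ps_3_0", 0, &code, NULL, NULL);
    if (code) { code->Release(); code = NULL; }

    // SM2 path: every light count becomes its own shader variant,
    // selected with a preprocessor define and compiled separately.
    for (int n = 1; n <= 8; ++n) {
        char num[2] = { char('0' + n), '\0' };
        D3DXMACRO defines[] = { { "NUM_LIGHTS", num }, { NULL, NULL } };
        D3DXCompileShader(kShader, sizeof(kShader) - 1, defines, NULL,
                          "PixelMain", "ps_2_0", 0, &code, NULL, NULL);
        if (code) { code->Release(); code = NULL; }
    }
}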
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ackmed
Originally posted by: rbV5
ATi's cards are not the only ones which don't do 3.0; all of NV's cards before the 6-series lack it as well. What does that say? To me it says that they don't care about their customers.

Actually, "lowly" PS 1.1 plays well for the FX as well. Nvidia couldn't ask for better shader support in game titles than 1.1 and 3.0, respectively, for their product line. I'm quite sure Nvidia would like to see SM2.0x die.


I agree with this; it's something I'd thought about as a reason why they didn't add it. I didn't want to say it, because I knew a few would go off on a rant about me saying it. The FX cards suck at PS2.0, but they still should have added it.

Saying an X800-series card is a bad buy for 2004/2005 is just silly. Buying an X800XL for under $300 when it's virtually the same speed as a 6800GT is not a bad buy, considering how much cheaper it is. How many titles has SM3.0 really affected, and what percentage of overall games is that? How many were done on purpose?

Please don't try to act as if you don't favor NV, Rollo; it's laughable.

And I feel that it is you who is laughable, giving anyone a hard time over whether he or she favors one or the other. Everyone, yes even you Ackmed, is allowed to have a preference, and is also allowed to change that preference on a whim, as things always tend to change from generation to generation.

I rarely, if ever, see you give even a self-proclaimed ATI fan a hard time. Why is that?

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I never said I didn't favor ATi; I do. I have given an "ATI fan" a hard time before as well, when they posted idiotic comments. Too hard a time once, and I almost got myself banned.

I didn't turn this into an ATi-vs-NV thing. I turned my focus on devs, and their bad decisions that screw over their customers.

I have yet to hear one good reason for them to choose 1.1 and 3.0, but no 2.0. Do you have one? Rollo's reason is that they shouldn't have to "retro code" for 2.0, yet 1.1 is far more retro than 2.0, so that makes zero sense. Neither does screwing over the larger user base, since far more people have 2.0 cards than 3.0 cards, for both NV and ATi.

It doesn't really matter to me; it's not my kind of game. I try to like it, I just can't. The only slow-ish shooter I like is MGS. It's just sad to see devs doing this, and it's probably what's going to happen more and more.