http://www.firingsquad.com/hardware/splinter_cell_chaos_theory_1/default.asp
A good article: the X800s are faster, but they're running at lowly SM1.1 with reduced IQ.
Unfortunately, Ubisoft has decided not to provide a 2.0 or 2.0b shader mode for RADEON 9500 (or greater) and X800 users.
---
In any case, it's very disappointing to see a developer provide so many features for one particular brand and not another. Eye candy such as parallax mapping could have been thrown in for ATI users if Ubisoft had provided a 2.0 mode.
Clearly shader model 3.0 is the way of the future; we don't dispute that. But with the large amount of 2.0-capable hardware out there, it seems strange to skip this mode entirely. Fortunately the 1.1 mode still looks quite good, as we've shown you in our screenshots, but it's never a good thing for gamers when one group of users receives preferential treatment ahead of others. CryTek set a beautiful example of how it should be done, providing additional features for both ATI and NVIDIA card owners in follow-up patches to Far Cry. Hopefully Ubi will get on board and do the same.
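For readers unfamiliar with how these modes get chosen: a Direct3D 9 engine typically queries the card's capabilities once at startup and routes rendering through whichever shader path the developer actually authored. Below is a minimal sketch of such a fallback chain; the ShaderPath enum and SelectShaderPath function are hypothetical illustrations, not Ubisoft's code.

```cpp
// Minimal sketch of D3D9 shader-path selection. ShaderPath and
// SelectShaderPath() are hypothetical, not code from Chaos Theory.
#include <d3d9.h>

enum ShaderPath { PATH_SM11, PATH_SM20, PATH_SM30 };

ShaderPath SelectShaderPath(IDirect3DDevice9 *device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    // Each branch corresponds to a separately authored set of
    // shaders; a path that was never written can't be selected.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return PATH_SM30;   // GeForce 6800 series and later
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_SM20;   // RADEON 9500+/X800: the path Ubi skipped
    return PATH_SM11;       // everything else lands here
}
```

With no SM2.0 shaders authored, the middle branch simply doesn't exist, which is why X800 owners drop all the way to 1.1 regardless of what their hardware supports.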
Originally posted by: Noob
Too bad the game doesn't support SM 2.0b. We X800 users would have gotten its performance increases and IQ, same as SM 3.0 (HDR pending). Not including SM 2.0b support was probably a behind-the-scenes deal to promote 6800s. Oh well, the game still looks good and plays well. I'm on the 5th mission. Somewhat complicated storyline.
Originally posted by: Rollo
nVidia didn't "cheat" anyone out of anything - the developer of the game opted to use the MS standard?
Originally posted by: Rollo
It's not their fault ATI hasn't put an SM3 card on the market yet, or that some people have opted not to buy SM3 cards.
Originally posted by: Rollo
In any case, I don't see how posting a link to a good article that examines the differences between SM1.1 and SM3 in this popular new game is "patting nV on the back"?
Originally posted by: Rollo
You could as easily say I'm flaming nV; the benchmarks show the X800 cards winning, albeit at reduced IQ.
Originally posted by: Sylvanas
ATi will be doing HDR in the near future with HL2...
Originally posted by: Rollo
Snowman:
It's not Ubi's responsibility to retro-code for ATI's way-behind-the-times hardware. It's up to ATI to keep up with the current standard. SM3 has been the standard for about a year now?
Originally posted by: Rollo
I think it's a little unrealistic to expect them not to use HDR, soft shadows, etc.
Originally posted by: TheSnowman
Originally posted by: Rollo
nVidia didn't "cheat" anyone out of anything - the developer of the game opted to use the MS standard?
Nvidia uses their "developer relations" to weasel game makers into unnaturally showing Nvidia cards in a favorable light, or hiding their faults, all the time. From EA Sports not allowing Radeons to run 1600x1200, to Eidos pulling the benchmarking abilities of Tomb Raider with a patch because the FX cards sucked with PS2.0. Considering Ubi had previously mentioned SM2 support for Splinter Cell 3, it only figures that Nvidia convinced them to leave it out. And the MS "standard" applies to lower shader models as well.
ATi's cards are not the only ones which don't do 3.0. All NV's cards before the 6x series don't have it either. What does that say? To me it says that they don't care about their customers.
Originally posted by: rbV5
ATi's cards are not the only ones which don't do 3.0. All NV's cards before the 6x series don't have it either. What does that say? To me it says that they don't care about their customers.
Actually "lowly" 1.1 PS plays well for FX as well. They couldn't ask for better shader support than 1.1 and 3.0 respectively for their product line in game titles. I'm quite sure Nvidia would like to see SM2.0x die.
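rbV5's point lines up with how shaders are built: the same HLSL source is compiled once per target profile, and NV3x (GeForce FX) only fell apart on the ps_2_0 profile. Here's a hedged sketch using the legacy D3DX compiler; the file name "water.fx" and entry point "WaterPS" are made up for illustration.

```cpp
// Sketch: compiling one HLSL entry point against different pixel
// shader profiles. "water.fx" and "WaterPS" are hypothetical names.
#include <d3dx9.h>

LPD3DXBUFFER CompileForProfile(LPCSTR profile)
{
    LPD3DXBUFFER code   = NULL;
    LPD3DXBUFFER errors = NULL;

    // Same source, different target: "ps_1_1" ran fine on GeForce FX,
    // while "ps_2_0" is where NV3x performance collapsed.
    HRESULT hr = D3DXCompileShaderFromFileA(
        "water.fx",   // HLSL source file (hypothetical)
        NULL, NULL,   // no #defines, no custom include handler
        "WaterPS",    // entry point (hypothetical)
        profile,      // e.g. "ps_1_1", "ps_2_0", "ps_2_b", "ps_3_0"
        0,            // compile flags
        &code, &errors, NULL);

    if (FAILED(hr) && errors)
        OutputDebugStringA((LPCSTR)errors->GetBufferPointer());
    if (errors)
        errors->Release();
    return code;  // caller feeds this to CreatePixelShader()
}
```

Shipping a 1.1 path and a 3.0 path while skipping the 2.0 targets means the FX line never has to run the profile it was weakest at, which is rbV5's point exactly.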
I truly hope the R520 is available in 3 months, and why the heck would I argue against it?
What's going to be funny to me is that you're going to have to find another argument in a few months, once you can't say ATi uses 3-year-old tech.
But I could always use more - feel free to PayPal it to the address in my profile.
Rollo has lots of money, it seems.
Originally posted by: Ackmed
Originally posted by: rbV5
ATi's cards are not the only ones which don't do 3.0. All NV's cards before the 6x series don't have it either. What does that say? To me it says that they don't care about their customers.
Actually "lowly" 1.1 PS plays well for FX as well. They couldn't ask for better shader support than 1.1 and 3.0 respectively for their product line in game titles. I'm quite sure Nvidia would like to see SM2.0x die.
I agree with this, and it's something I'd wondered about as to why they didn't add it. I didn't want to say it, because I knew a few would go off on a rant about me saying it. FX cards suck at PS2.0, but they still should have added it.
Saying an X800-series card is a bad buy for 2004/2005 is just silly. Buying an X800XL for under $300 when it's virtually the same speed as a 6800GT is not a bad buy, considering how much cheaper it is. How many titles has SM3.0 really affected, as a percentage of overall games? And how many of those were done on purpose?
Please don't try to act as if you don't favor NV, Rollo; it's laughable.