
FS Splinter Cell Chaos Theory SM3 Article

Technically, the FX series used SM2.0 correctly (more so on the higher-end cards, though)

Do you know why? OK, Half-Life 2 runs in DX8.1 with an FX card, right? If you enable DX9, you get artifacts and bad IQ, right? Well, there is this program that changes the config files (it does it for you) to tell HL2 it is running on an ATI 9800, and to run it in DX9 (SM2).
Do you know what happens? HL2 magically runs WELL in DX9/SM2 with an FX card and there are NO graphical problems...

So don't knock the FX series and their "bad" SM2 support... I think there is something else behind it (unless you can clearly explain why the above happened? Oh, and it happened to a lot of people)

As for SC:CT, it is strange that the game runs WORSE in SM1.1 on the 6800... SM3 is just more efficient, but you can tell that when you add the real eye candy that it supports, SM3 becomes a little too much.
 
Bit of info... it seems SC:CT doesn't like FRAPS. I ran the timedemo; FRAPS told me 20 FPS (and it looked it) while SC's FPS counter was telling me 40-odd.

The results came back as an average of about 22. I turned FRAPS off, ran it again, and my average is nearly 50 FPS (this is with SM3 enabled but HDR, parallax mapping, and soft shadows turned off), and gameplay is super smooth now at 12x10, 8xAF, SM3 + all the trimmings (everything on).
 
Originally posted by: Ackmed
I never said I didn't favor ATi; I do. I have given an "ATI fan" a hard time before as well when posting idiotic comments. Too much once, and I almost got myself banned.

I didn't turn this into an ATi vs. NV thing. I turned my focus on devs, and their bad decisions that screw over their customers.

I have yet to hear one good reason for them to choose 1.1 and 3.0, but no 2.0. You have one? Rollo's reason is they shouldn't have to "retro code" for 2.0, yet 1.1 is far more retro than 2.0, so that makes zero sense. Neither does screwing over a larger user base, since far more people have 2.0 cards than 3.0 cards, for NV and ATi.

It doesn't really matter to me; it's not my kind of game. I try to like it, just can't. The only slowish shooter I like is MGS. It's just sad to see devs doing this, and it's probably going to happen more and more.

OK, well how about this for a possible explanation: devs found it easier by far to code for 1.1 and 3.0 combined over 2.0? Could be. Maybe 2.0 really is a PITA to code for. Hence the slow, stagnant pace of emerging 2.0 titles since 2.0 came to be. And they took the easy way out? I think this is a likely candidate.

 
Originally posted by: hans030390
So don't knock the FX series and their "bad" SM2 support... I think there is something else behind it (unless you can clearly explain why the above happened? Oh, and it happened to a lot of people)

Show me where what you stated happened, and I might be able to explain why. But from all I know, forcing DX9 on an FX card makes framerates tank.
 
Originally posted by: Ackmed
I never said I didn't favor ATi; I do. I have given an "ATI fan" a hard time before as well when posting idiotic comments. Too much once, and I almost got myself banned.

I didn't turn this into an ATi vs. NV thing. I turned my focus on devs, and their bad decisions that screw over their customers.

I have yet to hear one good reason for them to choose 1.1 and 3.0, but no 2.0. You have one? Rollo's reason is they shouldn't have to "retro code" for 2.0, yet 1.1 is far more retro than 2.0, so that makes zero sense. Neither does screwing over a larger user base, since far more people have 2.0 cards than 3.0 cards, for NV and ATi.

It doesn't really matter to me; it's not my kind of game. I try to like it, just can't. The only slowish shooter I like is MGS. It's just sad to see devs doing this, and it's probably going to happen more and more.

They didn't want to code three paths for the game?

There's the "up to date" feature set, and "other".

"Other" is GeForce 3 and forward, all PS1.1; they probably didn't want the additional cost and dev time of cranking out 2.0 or 2.0b.
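The split described above — a PS1.1 "other" path plus an SM3.0 path, with nothing in between — boils down to a capabilities check at startup. A minimal sketch of that logic (hypothetical Python; the function and path names are illustrative, not Ubisoft's actual code, and the version tuples stand in for what a D3D9 caps query would report):

```python
# Hypothetical sketch of render-path selection by pixel shader version.
# The (major, minor) tuples mimic a D3D9 caps query; names are illustrative.

def pick_shader_path(ps_version):
    """Return the render path for a (major, minor) pixel shader version."""
    if ps_version >= (3, 0):
        return "sm3"    # full feature set: HDR, soft shadows, parallax mapping
    if ps_version >= (1, 1):
        return "ps11"   # legacy fallback: GeForce 3 and forward
    return "fixed"      # no programmable shaders at all

# An SM2.0 card (e.g. a Radeon 9800 reporting 2.0) falls through to the
# PS1.1 path, losing the SM2-level eye candy it could otherwise run:
assert pick_shader_path((2, 0)) == "ps11"
assert pick_shader_path((3, 0)) == "sm3"
```

Note how the engine only special-cases the versions it actually shipped shader code for; everything else lands on the lowest path it can run.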
 
Originally posted by: keysplayr2003
OK, well how about this for a possible explanation: devs found it easier by far to code for 1.1 and 3.0 combined over 2.0? Could be. Maybe 2.0 really is a PITA to code for. Hence the slow, stagnant pace of emerging 2.0 titles since 2.0 came to be. And they took the easy way out? I think this is a likely candidate.
They could "take the easy way out" and still give SM2 users improved image quality simply by letting such cards run with floating-point precision on the shaders. No extra coding needed there; just let the cards do what they can. However, I think the obvious explanation for why they didn't is the same explanation for why the game is plastered with random product ads: $.
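The floating-point point can be shown numerically: fixed-point pipelines store intermediate shader results in 8 bits per channel, which crushes nearby values together, while floating point keeps them distinct. An illustrative sketch (plain Python, not shader code):

```python
# Illustrative only: 8-bit fixed-point quantization vs. floating point.
# Values represent color/luminance components in [0, 1].

def quantize8(x):
    """Snap a [0, 1] value to the nearest of 256 representable levels."""
    return round(x * 255) / 255

a, b = 0.5000, 0.5010                  # two nearby intermediate results
assert quantize8(a) == quantize8(b)    # 8-bit storage merges them (banding)
assert a != b                          # floating point keeps them distinct
```

Stack a few such rounding steps across a multi-pass effect and the banding becomes visible, which is why simply letting SM2 cards evaluate their shaders at floating-point precision improves IQ with no new shader code.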
 
Snowman - The article you paraphrased itself stated, "Clearly shader model 3.0 is the way of the future, we don't dispute that" - so why would you accuse Ubisoft (one of the pioneers in game development) of unfair business practices? Maybe they are just trying to get the competition to wake up?

This is exactly what happened when it was the FXes vs. the 9700s, when the FX's shader performance wasn't worth a damn. Nvidia had to go out of their way to make driver revision after driver revision to help performance in a few games.

It's just how it is. ATi's SM3 video cards are just around the corner anyway. Well, so they say.

 
Hmm, nice review. I still don't see why people say the 6800GT is faster than the X800XL when the XL is either winning by 1-2 FPS or neck and neck with it.
 
Originally posted by: Regs
Snowman - The article you paraphrased itself stated, "Clearly shader model 3.0 is the way of the future, we don't dispute that" - so why would you accuse Ubisoft (one of the pioneers in game development) of unfair business practices? Maybe they are just trying to get the competition to wake up?

This is exactly what happened when it was the FXes vs. the 9700s, when the FX's shader performance wasn't worth a damn. Nvidia had to go out of their way to make driver revision after driver revision to help performance in a few games.

You seem to be confused if you think driver revisions can make SM2 hardware run SM3.

Originally posted by: Regs
It's just how it is. ATi's SM3 video cards are just around the corner anyway. Well, so they say.

That doesn't help all the SM2 cards out there though.
 
Hmm, nice review. I still don't see why people say the 6800GT is faster than the X800XL when the XL is either winning by 1-2 FPS or neck and neck with it.

Yes, in inferior quality mode, it is faster.

 
...which clearly leaves the 6800GT as the better choice for those looking to offset an inferiority complex. If Ubi had released an SM2 path which provided much of the same image quality enhancements as the SM3 path while retaining the relative performance between ATI and Nvidia cards, it wouldn't be nearly as satisfying to such people.
 
Originally posted by: Regs
Snowman - The article you paraphrased itself stated, "Clearly shader model 3.0 is the way of the future, we don't dispute that" - so why would you accuse Ubisoft (one of the pioneers in game development) of unfair business practices? Maybe they are just trying to get the competition to wake up?
You mean, "give the competition" (other game software houses) "a leg up" (on them). If they chose to not support a large portion of the current higher-end gaming customer base, their competitors will, and those competitors will then likely sell more games, at least while there is a market for games written for SM2.0-capable hardware.

If you are suggesting that a game software house coding against the SM3.0 APIs is somehow "competition" for a certain video card company that sells only pre-SM3.0 hardware at this point, then you are implicitly admitting that Ubisoft is clearly working for NV.
 
Originally posted by: TheSnowman
You seem to be confused if you think driver revisions can make SM2 hardware run SM3.
I wonder just what is possible, in terms of potential enhancements to MS's DirectX HLSL compiler, to "run" SM3.0 code on SM2.0x hardware by unrolling loops and branches. Granted, I can't imagine that it would run at comparable speed, and it would likely require multiple times the offscreen video memory buffer space. It could produce similar visual results, but it is still best if the game designers themselves special-case the codepaths.
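The unrolling idea above can be sketched with a toy transform: expand a bounded loop body into straight-line instructions, which is roughly what an HLSL compiler must do when the target profile lacks real flow control. (Illustrative Python; the instruction strings are made-up shader assembly, not actual ps_2_0 output.)

```python
# Toy illustration of loop unrolling: repeat an abstract instruction
# sequence trip_count times, substituting the loop index, the way a
# compiler targeting a profile without dynamic flow control must.

def unroll(body_ops, trip_count):
    """Expand a loop body (list of instruction templates) into flat code."""
    return [op.format(i=i) for i in range(trip_count) for op in body_ops]

# A 4-tap blur loop becomes 8 straight-line instructions:
loop_body = ["texld r0, t{i}", "add r1, r1, r0"]
flat = unroll(loop_body, 4)
assert len(flat) == 8
assert flat[0] == "texld r0, t0"
```

The catch, as noted above, is that the flat code can blow past an SM2.0 profile's instruction limits, at which point the work has to spill into extra passes and offscreen buffers.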
 
TheSnowman, I'm sorry, I can't find you the link; it was on some forums... I'll keep looking for it and post it when I find it.
 
Big win for Nvidia here. Will be interesting to see if Ubisoft releases patches to improve performance for SM 2.0 cards - or if 6800 series cards produce enough sales. :beer:
 
Originally posted by: VirtualLarry
Originally posted by: TheSnowman
You seem to be confused if you think driver revisions can make SM2 hardware run SM3.
I wonder just what is possible, in terms of potential enhancements to MS's DirectX HLSL compiler, to "run" SM3.0 code on SM2.0x hardware by unrolling loops and branches. Granted, I can't imagine that it would run at comparable speed, and it would likely require multiple times the offscreen video memory buffer space. It could produce similar visual results, but it is still best if the game designers themselves special-case the codepaths.

Well technically you could just do it all in software on the CPU, but MS requires full hardware support of a shader model in DirectX.
 
Ackmed:
Saying an X800 series card is a bad buy for 2004/2005 is just silly. Buying an X800XL for under $300 when it's virtually the same speed as a 6800GT is not a bad buy, considering how much cheaper it is. How many titles has SM3.0 really affected - what percentage of overall games? How many were done on purpose?
Well, when you consider things like no 16x12 in EA games, no soft shadows in Riddick or SC:CT, no SM3 in SC:CT, no HDR in Far Cry, and who knows what else upcoming, the $300 X800XL looks like less and less of a deal over the $375 6800GT (which can also be SLI'd to last even longer).

Please don't try to act as if you don't favor NV, Rollo; it's laughable.
Yep, I favor nVidia these days and make no secret of it. When the R520 comes out, I may favor ATI again, or I may not if I deem nVidia's card a better deal. I'm wacky that way!
😉
 
Originally posted by: TheSnowman

You seem to be confused if you think driver revisions can make SM2 hardware run SM3.

I didn't imply that. Nvidia's driver rehashes did not solve shader performance; they just deflected attention from the performance hit.
 
Originally posted by: Rollo
http://www.firingsquad.com/hardware/splinter_cell_chaos_theory_1/default.asp

A good article; the X800s are faster, but running at lowly SM1.1 and reduced IQ.


Am I missing something here? The IQ looks very similar. The only advantage to having HDR is the lights seem brighter.

This SM 3.0 and HDR stuff is being overblown.

the $300 X800XL looks like less and less of a deal over the $375 6800GT

If you're going to post the lowest price available for the 6800GT, you should also post the lowest price for an X800XL, which is $284, lest you look like a fanboy. 🙂 Also, without the X800XL breathing down its neck, the 6800GT PCI Express would still be $470.
 
From what I get from the interview with the head engineer who made the game, they were planning to create an SM 2.0 path before its release. That article was dated Oct. 2004. Between then and now, they scrapped that idea completely due to "technical difficulties," also stating they would have to rebuild the graphics engine to support both 2.0 and 3.0. CT is a game that was made to display the advantages of SM 3.0. It's perfect for it. CT is darker than Doom 3, and I don't see many people bitching about that, do you?

Now you take what has happened and chalk it all up to corporate favoritism. There was no deal made with Nvidia. SM3.0 is a standard, so the decision was an easy one for Ubisoft to make. Did you really think Ubisoft was going to start CT from the ground up again, losing millions of dollars in development, when they knew ATi cards could play it perfectly well? I don't know what you're talking about, Larry! It was a well-calculated decision made by Ubisoft. You make it seem like CT can't run at all on ATi cards.

In other news, HL2 is getting an HDR patch for SM3.0 that is set to release any time now. Far Cry also has patch 1.3, which includes optimizations for SM 3.0. And hey, guess who made Far Cry? UBI f'ing soft!

This is all FUD. It seems like everybody loses their common sense when there is a hardware war raging on the frontlines of the great American marketing machine. If the hardware they made couldn't influence game developers or the market, then obviously they wouldn't have much of a market at all.

To build an empire you must have influence, management, and leadership. Let ATi and Nvidia compete with each other instead of crying "cheating" or "foul play." What they are doing is perfectly within the bounds of fair business practice. There is nothing "sinister" about it. When I owned a 9800 from ATI, I was hearing it from Nvidia owners all too often, and it made my stomach cringe.
 
Originally posted by: TheSnowman
Originally posted by: Noob
Too bad the game doesn't support SM 2.0b. Us X800 users would have gotten its performance increases and IQ, same as SM 3.0 (HDR pending). It was probably a behind-the-scenes deal to promote 6800s by not including support for SM 2.0b. Oh well, the game still looks good and is good. I'm on the 5th mission. Somewhat complicated storyline.

Actually, there are Radeons that can do the HDR which is used in SC3.

It depends on whether they wanted to implement it, though. That is what I meant. HL2 is going to have HDR implemented soon.

 
Originally posted by: Regs
Between then and now, they scrapped that idea completely due to "technical difficulties," also stating they would have to rebuild the graphics engine to support both 2.0 and 3.0.

Does "technical difficulties" = "paid by Nvidia"? 😉 I don't believe they would have had to rebuild the graphics engine. That's BS.

In other news, HL2 is getting an HDR patch for SM3.0 that is set to release any time now. Far Cry also has patch 1.3, which includes optimizations for SM 3.0. And hey, guess who made Far Cry? UBI f'ing soft!

HDR does not equal SM3.0. Show me anything where Valve ever mentions SM3.0 for their new HDR level.

UBI f'ing soft didn't make Far Cry. Crytek did. Ubisoft published it. Hmmm... Crytek also did The Project demo, which *gasp* has HDR support for ATI cards.
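The point that HDR does not equal SM3.0 is easy to see in the math: a basic tone-mapping step such as Reinhard's global operator uses only a multiply, an add, and a divide, all of which SM2.0 hardware can evaluate. An illustrative sketch (plain Python, not shader code):

```python
# Reinhard global tone mapping: compress HDR luminance into [0, 1).
# Only basic arithmetic is required - nothing SM3.0-specific.

def reinhard(lum):
    """Map an HDR luminance value lum >= 0 into the displayable range [0, 1)."""
    return lum / (1.0 + lum)

assert reinhard(0.0) == 0.0
assert abs(reinhard(1.0) - 0.5) < 1e-12
assert reinhard(100.0) < 1.0   # very bright values still fit on screen
```

The hard part of HDR on older cards is render-target format and blending support, not the shader math itself, which is why HDR and the shader model are separate questions.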
 
Originally posted by: dfedders
Originally posted by: Regs
Between then and now, they scrapped that idea completely due to "technical difficulties," also stating they would have to rebuild the graphics engine to support both 2.0 and 3.0.

Does "technical difficulties" = "paid by Nvidia"? 😉 I don't believe they would have had to rebuild the graphics engine. That's BS.

In other news, HL2 is getting an HDR patch for SM3.0 that is set to release any time now. Far Cry also has patch 1.3, which includes optimizations for SM 3.0. And hey, guess who made Far Cry? UBI f'ing soft!

HDR does not equal SM3.0. Show me anything where Valve ever mentions SM3.0 for their new HDR level.

UBI f'ing soft didn't make Far Cry. Crytek did. Ubisoft published it.


Then I guess Ubisoft stayed out of Crytek's way? Give me a break.
 