Splinter Cell: CT SM3 and SM1.1 ***UPDATE WITH PICS***

otispunkmeyer
Pics are in a later post, just scroll down.

To move on from that thread I started earlier: I've just been playing around with the new Splinter Cell. It came out in the UK today, so I picked it up.

So far I like it.

But these are the things bugging me:

SM3 isn't exactly rapid: 20-30 fps (unless there are no lights on at all).
But surprisingly, switching to SM1.1 gives zero performance boost.

In SM1.1 I still get 20-30 fps in the same areas, so I'm sticking with SM3 for now; might as well have the better quality for the same price.

Another thing: I'm using 4xAF at the moment, but 8xAF seems to do nothing to FPS.

AA can't be selected at all when you use SM3... why? I'm sure you could in the demo, and Bit-tech said the best settings for a 6800GT were 10x7, 2xAA, 8xAF with SM3.

In SM1.1 mode, AA and AF don't seem to have much impact on FPS either.

Hopefully this is a driver bug.
 

VIAN

Diamond Member
Aug 22, 2003
Maybe you're CPU limited. Your gaming rig link doesn't work, BTW.

When I played the demo, I got twice the framerate by switching to SM1.1.

Or it could be bugs as well. Not cool.
 

mwmorph

Diamond Member
Dec 27, 2004
No, not a CPU limit, since he's got a 2.2GHz A64 Newcastle with 1GB of RAM. Have you tried updating all drivers and BIOSes?
 

Icopoli

Senior member
Jan 6, 2005
I never played the first two, but this one is awesome. I turned everything up, 6xAA/16xAF @ 1600x1200, high textures, shadows, etc., and still managed 40 fps. The game looks absolutely incredible. Looking forward to the full version.
 
otispunkmeyer
OK, here are some shots with the FPS counter from FRAPS.

Settings: SM3 with everything on, plus hardware shadow maps, whatever the hell that does (it doesn't seem to do anything, and neither does turning things like parallax mapping off), 1280x1024 @ 75Hz, 0xAA (it won't let you use AA with SM3) and 4xAF.

Screenshots:
caching SM3 shaders
creates SM3
creates SM1
deck SM1
deck SM3
house SM1
house SM3
wall SM1
wall SM3

Now, as you can see, the FPS doesn't differ all that much most of the time; in some cases SM3 is actually more fluid. But whatever is happening, dropping down to SM1 doesn't really give a boost.

And just for info:

My rig: A64 3200+ Newcastle, 1GB of RAM, Asus K8N-E Deluxe (nForce3 250Gb), Leadtek 6800GT now @ 402MHz/1100MHz, Audigy 2.

Having played a bit more, SM3 isn't half bad actually; 20-30 fps is OK for this game since it's so slow-paced. SM1 yields nothing but lower IQ, and even then it looks brilliant.

Still, why can't I use SM3 and AA? It needs it.

And it's pretty cool: I haven't had to do a mission where I'm not allowed to kill, and if you screw up too many times with alarms, the guy on the radio says screw it, Fisher, just get on with it, get your objectives and get the hell out, so you don't fail because of a daft alarm.
 
otispunkmeyer
Originally posted by: mwmorph
No, not a CPU limit, since he's got a 2.2GHz A64 Newcastle with 1GB of RAM. Have you tried updating all drivers and BIOSes?


I update drivers as fast as they come out.

Latest BIOS, latest NF3 drivers, and the new XG 76.41 drivers, since the 71.84s from XG had the annoying temp bug; they basically perform the same so far.
 
otispunkmeyer
Originally posted by: BouZouki
Looks like the 6800 cards can't handle SM3? Hmmm.


Hmm, you like the flames, I think. At the moment, though, it seems it's a toss-up between 1 and 3; they basically give the same performance. I know they shouldn't, SM1 should be much faster, so something's wrong.

I pretty much realised SM3 on the current cards wasn't going to be good enough for much anyway, but for now I can at least make use of it, and it's not too bad. Though I really do suspect that when a real SM3 game emerges, the NV40 is going to crash and burn.
 

nRollo

Banned
Jan 11, 2002
Originally posted by: BouZouki
Looks like the 6800 cards can't handle SM3? Hmmm.

Only one thing is sure: 6800 cards allow you to see SM3 in Splinter Cell; ATI cards force you to use SM1.1.

Tough break for ATI owners. (again)
 

mwmorph

Diamond Member
Dec 27, 2004
Are you limiting FPS through vsync or any other option? Some games cap FPS at a specific limit, like Halo for PC's 30 fps option.
 

Ackmed

Diamond Member
Oct 1, 2003
Originally posted by: Rollo
Originally posted by: BouZouki
Looks like the 6800 cards can't handle SM3? Hmmm.

Only one thing is sure: 6800 cards allow you to see SM3 in Splinter Cell; ATI cards force you to use SM1.1.

Tough break for ATI owners. (again)


Or NV owners with FX cards?
 

Leon

Platinum Member
Nov 14, 1999
You can use AA with SM3 mode, but you'll need to turn off HDR/tone mapping - it's (currently) incompatible with AA due to hardware limitations.

SM3 also adds soft shadows and parallax mapping, so you'll still be running the full feature mode. HDR looks nice though.

Leon
 
otispunkmeyer
Originally posted by: BouZouki
Originally posted by: otispunkmeyer
Originally posted by: BouZouki
Looks like the 6800 cards can't handle SM3? Hmmm.


Hmm, you like the flames, I think,

What the hell are you talking about? I was asking a question.


I detected sarcasm in that question, sorry... if you'd left the "Hmmm" off the end, it would have looked more legit. It looks more like you're having a dig at the 6 series... again, since you already had a go at Nvidia for not supporting Win2k.
 
otispunkmeyer
Originally posted by: mwmorph
Are you limiting FPS through vsync or any other option? Some games cap FPS at a specific limit, like Halo for PC's 30 fps option.


No, vsync is off. The game's limit is 100 FPS; you can easily reach that using things like EMF vision and thermal imaging.
 
otispunkmeyer
Originally posted by: Leon
You can use AA with SM3 mode, but you'll need to turn off HDR/tone mapping - it's (currently) incompatible with AA due to hardware limitations.

SM3 also adds soft shadows and parallax mapping, so you'll still be running the full feature mode. HDR looks nice though.

Leon


Ahhh, is it like the Far Cry HDR? I know you wouldn't want to use AA with that anyway, since you'd be in slideshow territory, but does it use that part of the chip to do either AA or HDR, so you can only have one or the other?

Or is it something to do with the bloom effects of the light and stuff leaving big glows on walls and edges?
 

housecat

Banned
Oct 20, 2004
Originally posted by: Ackmed
Or NV owners with FX cards?

This is between the GF6 and X800... why you brought the FX into it, I don't know.

This could get real pointless, watch-

Or ATI owners with Rage3D cards?? BOO-YAH!



It's nice to see you holding onto the old "FX sucks" grudge... but when you graduate from n00b-world, you'll realize how sh!tty ATI's hardware was until they bought out ArtX's R300.
Holding onto past NV hardware missteps is a losing game for the ATI lover.

There ain't nothing worse than ATI's legacy video card software support, or ATI's legacy video cards!
 

Ackmed

Diamond Member
Oct 1, 2003
Originally posted by: housecat
This is between the GF6 and X800... why you brought the FX into it, I don't know.

This could get real pointless, watch-

Or ATI owners with Rage3D cards?? BOO-YAH!



It's nice to see you holding onto the old "FX sucks" grudge... but when you graduate from n00b-world, you'll realize how sh!tty ATI's hardware was until they bought out ArtX's R300.
Holding onto past NV hardware missteps is a losing game for the ATI lover.

There ain't nothing worse than ATI's legacy video card software support, or ATI's legacy video cards!

It has everything to do with FX cards, and any card that supports PS2.0.

It has to do with a company screwing over their customers. I can't see any reason, from the consumer standpoint, not to support PS2.0. There are a lot more 2.0 owners than 3.0 owners, and that's a fact.

If the 8500 was as bad as you claim, why is it faster than a GF3, and why does it support PS1.4 while the GF3 doesn't?

Can you honestly say it's better for them to release a game with 1.1 and 3.0 only? I mean, honestly? How is that better than releasing it with 1.1, 2.0, and 3.0? Feel free to give me an answer that makes sense. I can't think of any.

Why do you always resort to childish insults? I would be just as upset if it was the other way around. Just as I think Valve should enable 2.0 in HL2 for FX owners by default.
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: otispunkmeyer
Ahhh, is it like the Far Cry HDR? I know you wouldn't want to use AA with that anyway, since you'd be in slideshow territory, but does it use that part of the chip to do either AA or HDR, so you can only have one or the other?

Or is it something to do with the bloom effects of the light and stuff leaving big glows on walls and edges?
It is like Far Cry's HDR in the sense that it takes advantage of the floating-point framebuffer abilities exclusive to the NV4x cards, and they can't do a floating-point framebuffer and AA at the same time. Next-gen cards will most likely not have that limitation.
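
Roughly speaking, that limitation is something a D3D9 app can query for itself. Here's a minimal sketch (my own illustration, not anything from the game or its engine) that asks the driver whether 4x multisampling is supported on an FP16 render target; on an NV4x card you'd expect it to come back "not supported", which would explain why the game doesn't let you pick AA in SM3/HDR mode.

// Sketch only: ask the driver whether 4x MSAA works on an FP16
// (D3DFMT_A16B16G16R16F) render target, the format used for this kind of HDR.
// On NV4x-era cards this check is expected to report "not supported",
// which is why HDR and AA end up being mutually exclusive there.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("D3D9 not available\n"); return 1; }

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,        // FP16 surface format used for HDR rendering
        TRUE,                        // windowed mode
        D3DMULTISAMPLE_4_SAMPLES,
        &quality);

    std::printf("4x MSAA on an FP16 render target: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");

    d3d->Release();
    return 0;
}

(Link against d3d9.lib. Running the same check with a plain D3DFMT_A8R8G8B8 target should succeed, which matches the AA-with-HDR-off option Leon mentioned.)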

Originally posted by: housecat
This is between the GF6 and X800... why you brought the FX into it, I don't know.

This could get real pointless, watch-

Or ATI owners with Rage3D cards?? BOO-YAH!



It's nice to see you holding onto the old "FX sucks" grudge... but when you graduate from n00b-world, you'll realize how sh!tty ATI's hardware was until they bought out ArtX's R300.
Holding onto past NV hardware missteps is a losing game for the ATI lover.

There ain't nothing worse than ATI's legacy video card software support, or ATI's legacy video cards!
He didn't say anything about how good or not good the FX cards are; he simply pointed out that they support Shader Model 2 and would be able to benefit from it if Ubi shipped their SM2 code with the game. That goes for the FX cards as well as the 9500-9800 cards, and there are a lot more people with either of those cards than people with X800s and 6800s. As for older ATI cards, all the Radeons had some real strong points when it came to hardware; drivers are where they often fell short.
 

housecat

Banned
Oct 20, 2004
Originally posted by: Ackmed
It has everything to do with FX cards, and any card that supports PS2.0.

It has to do with a company screwing over their customers. I can't see any reason, from the consumer standpoint, not to support PS2.0. There are a lot more 2.0 owners than 3.0 owners, and that's a fact.

If the 8500 was as bad as you claim, why is it faster than a GF3, and why does it support PS1.4 while the GF3 doesn't?

Can you honestly say it's better for them to release a game with 1.1 and 3.0 only? I mean, honestly? How is that better than releasing it with 1.1, 2.0, and 3.0? Feel free to give me an answer that makes sense. I can't think of any.

Why do you always resort to childish insults? I would be just as upset if it was the other way around. Just as I think Valve should enable 2.0 in HL2 for FX owners by default.

Oh geez.

Well man, let me tell you:
when ATI dumps millions into working with developers like Nvidia does, maybe games will support 1.1, 2.0 and 3.0.

But until then, NV paid to work on this one... hence 1.1 and 3.0.
That's just too damn bad for ATI fanboys. NV spends millions working with developers... you don't, and neither does ATI.


If the 8500 was as bad as you claim, why is it faster than a GF3, and why does it support PS1.4 while the GF3 doesn't?
Is that the furthest your experience or memory goes?
I'm not going to argue about ancient history. It's hard enough pounding today's reality into your skull, let alone the past.

Can you honestly say it's better for them to release a game with 1.1 and 3.0 only? I mean, honestly? How is that better than releasing it with 1.1, 2.0, and 3.0? Feel free to give me an answer that makes sense. I can't think of any.
Already answered this.
Of course it would be nice, but NV's the one dropping the dough.

You either jump on the NV bandwagon and use the GPUs that more games are being optimized for... or don't.
But NV is paying for all this, so you can't really complain, Ackmed.

Why do you always resort to childish insults? I would be just as upset if it was the other way around. Just as I think Valve should enable 2.0 in HL2 for FX owners by default.
Because it feels good, that's why. But I don't think I've outright attacked you... yet. ;)

If ATI is as tight with Valve as everyone thinks they are, I don't see why they don't turn on 2.0 for FX owners... but the reality must be that they paid for sponsorship and did not in fact help with development the way NV does with games.
Otherwise, I'd think your wish would come true.

Heck, I don't care if they did... I never bought an FX besides the 5200 (the only one I support, since it's a great PCI card for those Dells with no AGP slot, along with the 5700LE PCI).

ATI should do that, but like I said:

The FX is history now. It wouldn't really be any big smackdown if ATI got Valve to do that in HL2.
It would have made a difference when it was NV's "shining glory", but it's ancient history now... and no one cares.
Apparently neither ATI nor Valve does... but you do ;)
 

Ackmed

Diamond Member
Oct 1, 2003
I agree, it's not even worth a response. Someone with that closed a mind won't see any answer but the one he wants to see.
 

housecat

Banned
Oct 20, 2004
Originally posted by: Ackmed
I agree, it's not even worth a response. Someone with that closed a mind won't see any answer but the one he wants to see.

Nice typical response, one that could be posted in reply to ANY smackdown someone received.

Closed mind?
It's just the TRUTH.
This is a BUSINESS, you know? Maybe it's the millions of dollars Nvidia dumps into developer programs, and the inherent advantages that come from that, that have you so frustrated? :D