The difference SM3 makes on a 6600gt

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I can't believe we're still arguing about this whole issue, but Far Cry with SM3 has been out for a while now, and nobody has bothered to dig up the results. So, here are some benches that show the difference the SM3 codepath makes in Far Cry on a 6600GT:

http://www.techreport.com/reviews/2004q3/geforce-6600gt/index.x?pg=8
10x7, 4x8x: 44.3fps (sm2) vs. 43.2fps (sm3)
For some reason the fps actually went down a little...

http://www.techreport.com/reviews/2004q3/geforce-6600gt/index.x?pg=9
10x7, 4x8x: 34.2fps (sm2) vs. 36fps (sm3)
Not much of a difference there...

http://www.techreport.com/reviews/2004q3/geforce-6600gt/index.x?pg=10
10x7, 4x8x: 54.5fps (sm2) vs. 60.7fps (sm3)
The best case scenario...

The best improvement was in the Volcano level, where SM3 increased fps by about 11%. In the other levels, the improvements were smaller. So, there you have it - what to expect from SM3 in terms of speed improvement in Far Cry. We'll see whether future games like FEAR show a bigger or smaller improvement.
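(For anyone who wants to check the math, here's a quick sketch in Python - the language choice is just mine, for illustration - using the Volcano fps figures from the link above:)

# Percentage gain from switching Far Cry's codepath from SM2 to SM3
# on the Volcano level (TechReport figures quoted above)
sm2_fps = 54.5
sm3_fps = 60.7
gain = (sm3_fps - sm2_fps) / sm2_fps * 100
print(round(gain, 1))  # ~11.4% faster with SM3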
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
It really depends on the map in Far Cry. For example, the "Volcano" and "Research" levels get a nice boost, while the "Training" and "Regulator" levels do not.

Let's take the Research level first.

At 1024x768 with 4xAA 8xAF, according to TechReport:

The 6800GT gets beaten by the X800 Pro in SM2... 68fps vs. 76fps.
Yet when SM3 is chosen, the 6800GT jumps to 88fps. That's a significant increase, and it puts it ahead.

On the flip side, the Regulator level doesn't fare as well...
x800 Pro-- 48
6800GT SM2-- 53
6800GT SM3-- 56 (which even bests the x800 PE at 53)

The "Training" level shows similar results.

But the Volcano level again shows gains like the Research level.

6800GT SM2-- 82
6800GT SM3-- 93

All I'm trying to show is that it's easy to cherry-pick a benchmark or two that shows an advantage, or none at all. My thought is that it's probably better to have a feature (even a little-used one) than not have it, as long as the price difference isn't substantial.

Those who compared results of an X800 Pro vs. a 6800GT might have had second thoughts had they known there was an extra ~10% of headroom built into the 6800GT when running something coded with that feature.
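(And if you want the per-level gains spelled out from those 6800GT numbers, here's a quick Python sketch - purely my own illustration of the arithmetic, using the fps figures quoted above:)

# 6800GT SM2 vs. SM3 fps at 1024x768 4xAA/8xAF (TechReport figures quoted above)
levels = {
    "Research": (68, 88),
    "Regulator": (53, 56),
    "Volcano": (82, 93),
}
for name, (sm2, sm3) in levels.items():
    gain = (sm3 - sm2) / sm2 * 100
    print(f"{name}: {sm2} -> {sm3} fps ({gain:.0f}% gain)")
# Research ~29%, Regulator ~6%, Volcano ~13%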
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Where is hans, by the way? This thread has been here for half an hour already and he hasn't made his usual SM3.0 sales pitch yet. Must be sick or something.
 

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
What about other resolutions and AA/AF? We can always cherry pick results to show one side.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: Todd33
What about other resolutions and AA/AF? We can always cherry pick results to show one side.

That's exactly my point to the OP. I just chose the exact same res and AA/AF he had.

However, one thing the benchmarks all agree on is that SM3 doesn't slow anything down, and in some cases it helps. That could be defined as a useful feature, then. Personally, I don't own a single SM3 game, so to me it hasn't exactly been useful. But sure, I guess I'm glad it's there. Maybe one day I'll actually use it. Hell, if it weren't useful at all, ATI wouldn't have it on their R520. And that's probably where it will truly be put to good use - next-gen games on next-gen hardware. I hope to hell my GTs handle HDR and SM3 fine in AOE3 when it arrives, or it's upgrade time for me.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Yeah, this thread exists because at least one person seems to think that SM3 will transform his 6600GT into an x800-crushing monster, when so far in almost all benches it has been the other way around. So, I decided to find out just how much SM3 is likely to improve the performance of a 6600GT.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Todd33
What about other resolutions and AA/AF? We can always cherry pick results to show one side.

What about them? The benches show similar differences between SM2 and SM3 at other resolutions, with or without AA/AF. I picked 10x7 4x8x because it seems like the best balance between the low fps you get at 16x12 and the bad IQ you get without AA/AF.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: munky
Yeah, this thread exists because at least one person seems to think that SM3 will transform his 6600GT into an x800-crushing monster, when so far in almost all benches it has been the other way around. So, I decided to find out just how much SM3 is likely to improve the performance of a 6600GT.

So, Hans thinks an 8-pipe 6600GT w/ 128MB and a 128-bit mem bus can beat out a 12-pipe X800 w/ 128MB and a 256-bit mem bus? Because of SM3.0? Did he actually say this? Or did he just say that he would take the SM3.0 card over the SM2.0 card?

Munky, you should know better than to let that get to you so much that you needed to make a thread about it. A 6800nu has the exact same specs (besides clock speed) as the X800 and beats out a 6600GT most of the time. And the X800 will beat out a 6800nu most of the time (higher clocks). This was more or less a tantrum thread.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: keysplayr2003
Originally posted by: munky
Yeah, this thread exists because at least one person seems to think that SM3 will transform his 6600GT into an x800-crushing monster, when so far in almost all benches it has been the other way around. So, I decided to find out just how much SM3 is likely to improve the performance of a 6600GT.

So, Hans thinks an 8-pipe 6600GT w/ 128MB and a 128-bit mem bus can beat out a 12-pipe X800 w/ 128MB and a 256-bit mem bus? Because of SM3.0? Did he actually say this? Or did he just say that he would take the SM3.0 card over the SM2.0 card?

Munky, you should know better than to let that get to you so much that you needed to make a thread about it. A 6800nu has the exact same specs (besides clock speed) as the X800 and beats out a 6600GT most of the time. And the X800 will beat out a 6800nu most of the time (higher clocks). This was more or less a tantrum thread.

Yeah, most people already know that an X800 + SM2 will beat a 6600GT + SM3, but some people won't believe it until they see it...
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Maybe someone should run some benchies so we don't have to rely on results that are over a year old.:)
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
It does prove that SM3.0 makes some difference, but with a 6600GT it's a pretty small one. If the X800GT is in fact an 8-pipe, 256-bit card, the 6600GT has no chance with its 8-pipe, 128-bit mem bus.

A 256-bit mem bus is by far a better feature than SM3.0 alone, because all games benefit right now, as well as future games.
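(To put the bus-width point in numbers: peak memory bandwidth is just bus width times effective memory clock. A quick Python sketch - the clock speeds here are typical stock figures, just for illustration:)

# Peak memory bandwidth = (bus width in bytes) x (effective memory clock)
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(128, 1000))  # 6600GT-class: 128-bit @ ~1000MHz effective -> 16.0 GB/s
print(bandwidth_gb_s(256, 900))   # X800-class:   256-bit @ ~900MHz effective  -> 28.8 GB/s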
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Sorry I'm late guys! :D

You're all taking this way too seriously, because I'm just giving my opinion.

I know that an x800 beats a 6600gt in Far Cry even with SM3 on. I've seen those benchies. Frankly, I'm happy with my 6600gt's performance in any game out now.

Whenever I talk about SM3, in my mind it really only applies to next-gen games. I couldn't care less that Far Cry or SC:CT have SM3, considering they run great on my card without it.

Next gen, I'd like to be able to use SM3 in games, even if it doesn't add eye candy, because it might bring a performance boost, considering the 6600gt probably won't run UT2007 at the highest settings. Still, just running it at the lowest settings with SM3 should give me a boost in performance.

Plus, with FEAR, which is halfway to next gen, it seems that SM3 gave the 6800gt a boost (big enough to beat the x850, and did you know that FEAR is an ATI game? Hmmm). Considering the X850 usually kills the 6800gt, I'd assume it wasn't a small boost. I expect that as games use SM3 more (Far Cry didn't use it much, SC:CT used it a little more, and FEAR is an example of a game that uses it more), it will increase performance even more.

So, I don't care about any game out now, SM3 or not, because my card runs them fine (maybe not FEAR, though, we'll see). Sure, maybe I don't get the best resolution, but I'm one who couldn't care less if it's at 10x7 with no AA/AF.

Right, time for me to see Charlie and the Chocolate Factory :D

And if I end up being wrong, rub it in my face. Please. If you're wrong, I'll be nice though.

Have a nice day.
 

blckgrffn

Diamond Member
May 1, 2003
9,676
4,308
136
www.teamjuchems.com
LOL, I still think it is funny that it comes up (usually mentioned by you) every time we talk about a mid range video card. I mean, it is just funny :p
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: zendari
Try the 6600 gt instead of the 6800 gt deadsquirrel.

Don't have those benches handy. But it's moot anyway, cuz I didn't realize this was just a vendetta thread... though I did find it odd that the OP singled out the 6600, which is why I felt that showing other benches, from other levels, with another card could prove exactly what Todd33 said--
Originally posted by: Todd33
We can always cherry pick results to show one side.

Right now, for the few games that support it, at the resolutions I play at, 5 or 6 fps isn't something I'd be championing SM3 for.

But to Hans' point, we don't know how the implementation in future SM3-enabled games will improve the fps. A 10-20 fps increase in an older game like Far Cry at 1024x768, on a 6800GT no less, doesn't give me a reason to think a future game like UT2007 will run well at all on a 6600GT, or even a 6800GT. I think both will struggle greatly, SM3 or not.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: deadseasquirrel
Originally posted by: zendari
Try the 6600 gt instead of the 6800 gt deadsquirrel.

Don't have those benches handy. But it's moot anyway, cuz I didn't realize this was just a vendetta thread... though I did find it odd that the OP singled out the 6600, which is why I felt that showing other benches, from other levels, with another card could prove exactly what Todd33 said--
Originally posted by: Todd33
We can always cherry pick results to show one side.

Right now, for the few games that support it, at the resolutions I play at, 5 or 6 fps isn't something I'd be championing SM3 for.

But to Hans' point, we don't know how the implementation in future SM3-enabled games will improve the fps. A 10-20 fps increase in an older game like Far Cry at 1024x768, on a 6800GT no less, doesn't give me a reason to think a future game like UT2007 will run well at all on a 6600GT, or even a 6800GT. I think both will struggle greatly, SM3 or not.

Yes, and in that case, if my card doesn't do well, I hardly think the x800 would do any better.

Still, you have to understand that I'm one who keeps a computer for like... 5 years, because frankly, I'm only a sophomore in high school now. You think I can spend all my $300 in the bank on a computer? I think not. I read about SM3 before purchasing my 6600gt. Somehow, I just figured it'd be used in next-gen games... or did I read that somewhere? I don't know.

You do have a point though. I'm just seeing a pattern here. We saw a small boost in performance with Far Cry using SM3, and no image quality added. We saw another boost in SC:CT WITH eye candy added (no HDR), and it still beat SM1.1 (I could use 2.0 benchies). Now, with FEAR, we have a bigger boost in performance. The 6800gt beats an x850 in an actual ATI game? I'd assume it was a rather large boost.

Now, with next gen, I am ASSUMING that the performance boost for the 6 series with SM3 will bring them ahead of the ATI cards, even if it runs crappy or leaves you with no eye candy. Extra performance is always good. PLUS, if the 6 series does run the games well (not saying they will, but if they do), then SM3 DOES add a good bit of eye candy that SM2 either cannot do, or gets too stressed to do. I'm also leaning on the fact that the UE3 devs claimed the 6600gt would run games based on the engine fine. Now, they didn't say "IT WILL RUN AT UBER GRAPHICS!!!!3@".

Last time I checked, I don't think anyone thought a 9200 would run HL2 at 800x600 (given enough RAM) at medium-high settings. Getting 30fps isn't too bad for playing a game.

I think the same will happen with next gen. You know, technology/hardware/software/games all kinda follow a pattern.

But I do appreciate deadseasquirrel bringing up his point in a polite manner.

Right, I'm off to go get some of my huge amounts of cash out of the bank. About $15.

And just to let you know, I wouldn't be here arguing if none of you had said that getting a 6 series card because it had SM3 was stupid... I was just giving my opinion to someone who asked for advice.

I like donuts.

 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
At the moment, GPUs run SM2 and SM1.1 or 1.4 shaders over several passes: essentially there's a size limit on a block of code for, say, 1.1, so it takes multiple blocks (passes) to make up one complete shader effect, or something like that.

When SM3 came around, all those blocks could fit into a single SM3 block because it's massive, so a shader is done in one pass instead of many passes just to make up one shader.

Do you see the pattern? SM3 will be used much more by next year - look at all the games slated to use it, either with the UE3.0 engine or something else. As more quality, information and texture work gets pushed into SM3 shaders, they will eventually grow too big as well, and then SM3 will be to us what SM1.1 is now: it will have to be done in multiple passes to make one massively long shader.

I tried to make it simple, though it might sound pretty weird. So, in conclusion, the 6800 would probably not have the power to do multiple passes of SM3, or maybe even one pass of a not-quite-full SM3 shader block. Remember, Epic said their Unreal 3.0 engine comes nowhere near filling up the 65,000-66,000 instruction shader length that SM3 allows.
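(Here's a tiny sketch of that idea in Python - the per-pass instruction limits are just illustrative round numbers, not exact spec values - showing why a small limit forces multi-pass rendering while a huge one doesn't:)

import math

# Passes needed if each rendering pass can only execute a limited number of
# shader instructions (illustrative limits, not exact spec values)
def passes_needed(effect_instructions, max_per_pass):
    return math.ceil(effect_instructions / max_per_pass)

effect = 300  # hypothetical length of one lighting effect, in instructions
print(passes_needed(effect, 12))     # SM1.1-style tiny limit  -> 25 passes
print(passes_needed(effect, 96))     # SM2.0-style limit       -> 4 passes
print(passes_needed(effect, 65535))  # SM3.0-style huge limit  -> 1 pass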
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
If they say it won't fill the shader length, then it probably won't.

But yes, eventually SM3 will end up being like SM1.1... it's all just one big pattern... but frankly, I'd rather not think about next-next gen.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Drayvn
At the moment, GPUs run SM2 and SM1.1 or 1.4 shaders over several passes: essentially there's a size limit on a block of code for, say, 1.1, so it takes multiple blocks (passes) to make up one complete shader effect, or something like that.

When SM3 came around, all those blocks could fit into a single SM3 block because it's massive, so a shader is done in one pass instead of many passes just to make up one shader.

Do you see the pattern? SM3 will be used much more by next year - look at all the games slated to use it, either with the UE3.0 engine or something else. As more quality, information and texture work gets pushed into SM3 shaders, they will eventually grow too big as well, and then SM3 will be to us what SM1.1 is now: it will have to be done in multiple passes to make one massively long shader.

I tried to make it simple, though it might sound pretty weird. So, in conclusion, the 6800 would probably not have the power to do multiple passes of SM3, or maybe even one pass of a not-quite-full SM3 shader block. Remember, Epic said their Unreal 3.0 engine comes nowhere near filling up the 65,000-66,000 instruction shader length that SM3 allows.

Well, Drayvn, to credit or discredit your analysis of how SM3.0 will be implemented would require a game, or even a properly coded demo, built from the ground up strictly adhering to SM3.0 guidelines. I happen to think that the 6 series cards, the ones that are SM3.0 compliant (not just compatible), will do better than all others when it comes to ground-up SM3.0 titles that are coded correctly. I could be wrong, but I doubt it, because there are specs to follow and Nvidia followed them. As long as Nvidia has done this and software developers follow the spec and code correctly, there should be a noticeable difference between an SM2.0b card and an SM3.0 card when it comes to ground-up SM3.0 titles done right.
Now, the 7 series from Nvidia is reported to be WGF 1.0 compliant (Windows Graphics Foundation), which is rumored to replace or coexist with DX. Nvidia has been on the ball since the 5xxx stumble. I loved my 5900 Ultra, so don't get me wrong - it was a powerful card. But there were the DX9 issues.