The Ultimate SM3.0 Game Thread


Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: apoppin
Originally posted by: Bar81
Unfortunately, I think that's the way he "speaks" English ;) I guess what I was trying to say is maybe he's being intentionally vague for some reason. I just don't like the way he said it, but then I might be getting all excited over nothing. I just wish nvidia would clear up the whole SM3.0 and game issue so we wouldn't have to speculate.
i think he is actually talking about nVidia's architecture - which IS in nVidia's control - Not SM 3.0 (which i believe M$ controls and updates, not nVidia).

And we're now talking the same language . . . you and i :)
:cool:

i DO finally agree with your last posts . . . . but you have to realize WHY nVidia is being so vague . . . .

here's my LAST point (call it a "summary" of my POV):
if you like and want to play:

HL2
Lord of the Rings: Battle for Middle Earth
STALKER: Shadows of Chernobyl
Splinter Cell 3
Tiger Woods 2005
Vampire: Bloodlines
Madden 2005
Driver 3.0
Grafan
Medal of Honor: Pacific Assault
Painkiller and BOoH
Far Cry

. . . with ALL the effects enabled - including SM 3.0 - you NEED a 6000 series card from nVidia or else WAIT for ati's r520. IF you do NOT care to experience SM 3.0, then choose ati.

i know for ME - IF i was spending $500 on a brand new videocard - what i would get. ;)
(but then, that's just me) :p


btw all those EA games dropped SM3.0 support, so they're not on the list any longer. Has rotund Gabe made any declarations re SM3.0 in HL2, or are we just going by this article? As to the rest of the games, I'm gonna do some digging tomorrow and see whether the list is accurate or there are some more sudden drops of SM3.0 from released titles.

Well, I'm of the opinion that merely having SM3.0 doesn't matter; it's what SM3.0 adds to the experience that counts, and the question remains: if what it adds comes with a performance drop so huge that the game no longer plays smoothly, then it doesn't add anything. On the other hand, if Riddick and, allegedly, SC:CT are aberrations, then maybe SM3.0 will enhance the 6800 user experience. I just think there's enough drastically negative evidence regarding SM3.0 on the 6800 that it's *highly* questionable whether the good 6800 SM3.0 scenario is the reality. Nothing that nvidia has done or said so far should make anyone feel otherwise.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: apoppin
Originally posted by: Bar81
Originally posted by: apoppin
Originally posted by: keysplayr2003
Originally posted by: Bar81
Well, to add further to the issue at hand, here's an interesting quote from nvidia's financial report regarding the upcoming nvidia cards:

"Well, from an architecture standpoint we're just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you'll see is a little bit faster versions...

...I think you'll see the industry move up a little bit in performance. But I don't think you'll see any radical changes in architecture. I doubt you'll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we're still in the shader model three type era."

http://www.beyond3d.com/#news20937

I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Or even more so, what does this say about ATI's SM2.0b?
that's a pretty eXtreme conclusion about "3.0b" drawn from the 6900 series being a LITTLE BIT faster than the current 6800 series.

expect only ATI to make big changes to their r520 core . . . nVidia will rely on a core and memory speed bump and SLI for their top cards ;)

the 6800gt/ultra will be fine for the next year or two in SM 3.0 apps. . . .


I can't read the guy's mind but that's what, to me, it seems he's saying. Maybe he's just being obtuse, I couldn't tell you, but the way I'm reading it, I wouldn't like what he was saying if I were a 6800 user.

If what he wanted to say was that by the time the refreshes come developers will have learned to more efficiently code in SM3.0 I would expect he would say that. Except he doesn't; he says you'll see faster versions. That makes me think a revision to the implementation or spec of SM3.0 is being contemplated. Of course, he may just not know how to properly word his thoughts but going off what he's saying, I'm inclined to believe that revisions are a coming.

again, the "revisions" to SM 3.0 are out of nVidia's control . . .. Microsoft determines it (i believe).

clearly he is speaking of what he DOES have control over - nVidia's architecture . . . . we will see faster versions that will run SM 3.0 more efficiently . . . incrementally upgraded for the next two years until the NEXT 'brand-new' core (to run DX10).


He could also be saying that their new cards will feature revisions to actually implement full SM3.0 (maybe they haven't right now) The point is it's so unclear and we're doing so much speculating that a potential 6800 purchaser needs to understand that before making the call. The case against or for SM3.0 on the 6800 simply isn't chock full of evidence for either side, and one has to make the call for themselves whether they have faith in the 6800 SM3.0 implementation or not. As we've seen there are people on both sides of the camp.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
BTW, why isn't SC:CT going to support SM 2.0? Is this just to make people with ATI cards miserable? Or is it going to make people with 6800s miserable because Ubisoft might not optimize the code enough for SM 3.0? It seems like people with either brand of card are going to lose out on the image quality and performance of SM 2.0.
 

acx

Senior member
Jan 26, 2001
364
0
71
Splinter Cell: Chaos Theory is SM 3.0 and SM 1.1 only. I don't think there is a codepath for SM 2.0 in that game.
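
To make the "codepath" idea concrete, here is a minimal Direct3D 9 sketch of how a game that ships only ps_3_0 and ps_1_1 renderers might pick one at startup. This is illustrative only, not Ubisoft's actual code; the printed messages and the fallback policy are assumptions.

```cpp
// Illustrative sketch only -- not Ubisoft's actual code. A D3D9 title that
// ships ps_3_0 and ps_1_1 renderers (and nothing in between) could pick a
// codepath at startup like this. Link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 not available\n"); return 1; }

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // D3DPS_VERSION packs major/minor into a DWORD, so >= comparisons work.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        std::printf("Using the SM 3.0 codepath\n");           // 6800 / NV40 and up
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        std::printf("Falling back to the SM 1.1 codepath\n"); // no 2.0 path shipped
    else
        std::printf("No usable shader support\n");

    d3d->Release();
    return 0;
}
```

On an X800 the first test fails, so a game built this way lands on the 1.1 path, which is exactly why the missing SM 2.0 codepath keeps coming up in this thread.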
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: acx
Splinter Cell: Chaos Theory is SM 3.0 and SM 1.1 only. I don't think there is a codepath for SM 2.0 in that game.

Will you be getting Shader 1.1 detail like in MOHPA? Having SM 1.1 in that game was like having no shaders at all. Although that doesn't seem to be the case in this game; it looks like SM 2.0 quality to me. But that doesn't mean anything.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Noob
BTW, why isn't SC:CT going to support SM 2.0? Is this just to make people with ATI cards miserable? Or is it going to make people with 6800s miserable because Ubisoft might not optimize the code enough for SM 3.0? It seems like people with either brand of card are going to lose out on the image quality and performance of SM 2.0.

Noob-
For people with nV40s, SM3 only increases performance. The nV40/R520 people won't lose anything.

Only the people who still have Radeon 9500-X800s will lose.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo

For people with nV40s, SM3 only increases performance. The nV40/R520 people won't lose anything.

Only the people who still have Radeon 9500-X800s will lose.


And Ubisoft for sales lost due to no SM2.0 support.

 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: Rollo
Originally posted by: Noob
BTW, why isn't SC:CT going to support SM 2.0? Is this just to make people with ATI cards miserable? Or is it going to make people with 6800s miserable because Ubisoft might not optimize the code enough for SM 3.0? It seems like people with either brand of card are going to lose out on the image quality and performance of SM 2.0.

Noob-
For people with nV40s, SM3 only increases performance. The nV40/R520 people won't lose anything.

Only the people who still have Radeon 9500-X800s will lose.

I know that. But will the SM 3.0 code be optimized enough to run smoothly? That is, of course, if the game is truly an SM 3.0 game. I would assume that you would see an image quality difference going from 1.1 to 3.0. I know that you wouldn't see a difference between 2.0 and 3.0.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Bar81
Originally posted by: apoppin

again, the "revisions" to SM 3.0 are out of nVidia's control . . .. Microsoft determines it (i believe).

clearly he is speaking of what he DOES have control over - nVidia's architecture . . . . we will see faster versions that will run SM 3.0 more efficiently . . . incrementally upgraded for the next two years until the NEXT 'brand-new' core (to run DX10).


He could also be saying that their new cards will feature revisions to actually implement full SM3.0 (maybe they haven't right now) The point is it's so unclear and we're doing so much speculating that a potential 6800 purchaser needs to understand that before making the call. The case against or for SM3.0 on the 6800 simply isn't chock full of evidence for either side, and one has to make the call for themselves whether they have faith in the 6800 SM3.0 implementation or not. As we've seen there are people on both sides of the camp.
that's not quite right . . . nVidia's 6000 series already have "full" SM 3.0 capabilities . . .. as to the "future" , of course it's up to each consumer to decide for himself. . . .

the choice itself is totally clear . . . x800 or 6800 . . . . you really can't "lose" with EITHER choice . . . after all the upgrade is not irrevocable or "permanent" . . . . most people will upgrade again within 3 years. ;)
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Bar81
Well, to add further to the issue at hand, here's an interesting quote from nvidia's financial report regarding the upcoming nvidia cards:

"Well, from an architecture standpoint we're just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you'll see is a little bit faster versions...

...I think you'll see the industry move up a little bit in performance. But I don't think you'll see any radical changes in architecture. I doubt you'll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we're still in the shader model three type era."

http://www.beyond3d.com/#news20937

I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

That's not what's being said at all... did you miss the part saying "I don't think you'll see any radical changes in architecture?" What does that mean? To me it means they'll bump up the speed to handle HDR and displacement mapping without halving frame rates.
 

acx

Senior member
Jan 26, 2001
364
0
71
There is a long thread over at Beyond3d's forums about it in the games section. There are some SM3 vs SM 1.1 images over at Shacknews.

Here is a short SC:CT developer comment on SM3.0
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
All they did was turn up the gamma. You can tell because the health bar gets brighter, as well as the weapon and ammo interface. Open up two browsers, one with 1.1 and one with 3.0, keep switching back and forth using the taskbar, and you will see.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: keysplayr2003
I can't make it any easier for you. 6xxx series cards were designed for SM3.0+.
We aren't talking about buying a FX5200Ultra with 256MB of RAM here. That card barely had enough power to push 128. However, the 6800's are powerhouses and should be able to have lasting SM3.0 performance for years to come.
That's the problem - where's the proof? I can say that the 6800's ability to support SM3.0 is much akin to the FX 5200 supporting DX9 shaders, and you can say otherwise, but in either case, due to the lack of available data, both are simply opinions on opposite ends of the spectrum. It's definitely not an "open and shut case for SM3.0" yet, not by a long shot. One positive SM3.0 feature though, that unlike branching shouldn't incur any potential slowdown from using it, is the ability of the VS units to access source-texture data the way the PS units can, which is a big, big win in terms of the ease of implementing displacement mapping. So for games that implement DM that way, having an SM3.0 card is a win in that case, I'll admit that.
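
For anyone curious what that vertex-texture-fetch feature looks like from the API side, here is a rough D3D9 sketch of probing for it before enabling a displacement-mapping path. The D3DFMT_R32F format choice and the fallback policy are assumptions for illustration, not taken from any shipping game.

```cpp
// Rough sketch of probing the SM 3.0 vertex-texture-fetch path (sampling a
// displacement map in the vertex shader). Illustrative only; the R32F format
// and the fallback policy are assumptions, not taken from any shipping game.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    bool vs30 = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);

    // D3D9 exposes vertex-texture support per texture format via a usage query.
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,              // display format
                                        D3DUSAGE_QUERY_VERTEXTEXTURE, // usable from the VS?
                                        D3DRTYPE_TEXTURE,
                                        D3DFMT_R32F);                 // displacement map format

    if (vs30 && SUCCEEDED(hr))
        std::printf("Vertex texture fetch available: enable displacement mapping path\n");
    else
        std::printf("No vertex texture fetch: use a CPU or normal-map approximation\n");

    d3d->Release();
    return 0;
}
```

If the query fails, the vertex shader simply can't sample the displacement map and the work has to happen somewhere else, which is the "win" being described above.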
 

acx

Senior member
Jan 26, 2001
364
0
71
That is definitely not just gamma. The shading on the rocks has a wider range of lighting.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Noob
BTW, why isn't SC:CT going to support SM 2.0? Is this just to make people with ATI cards miserable? Or is it going to make people with 6800s miserable because Ubisoft might not optimize the code enough for SM 3.0? It seems like people with either brand of card are going to lose out on the image quality and performance of SM 2.0.

Lose out? If you can't afford to upgrade in the computer world and expect to keep up with the technology, then don't play the game. The computer world moves fast for enthusiasts. Keep up, or get off the bus.

 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: keysplayr2003
Originally posted by: Noob
BTW, why isn't SC:CT going to support SM 2.0? Is this just to make people with ATI cards miserable? Or is it going to make people with 6800s miserable because Ubisoft might not optimize the code enough for SM 3.0? It seems like people with either brand of card are going to lose out on the image quality and performance of SM 2.0.

Lose out? If you can't afford to upgrade in the computer world and expect to keep up with the technology, then don't play the game. The computer world moves fast for enthusiasts. Keep up, or get off the bus.

Did I say I can't keep up? I was merely asking why SC3 won't support SM 2.0, which was answered. It didn't make sense why they would exclude 2.0. And if you looked at my sig, you would see that I have an X800 Pro. So you can't tell me I can't keep up. That was such an idiotic comment that you just made.

 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Bar81
The thing we're trying to figure out is if SM2.0 effects enabled through SM3.0 will result in such a huge performance hit that they will for all intents and purposes be rendered useless to the 6800 crowd such that having SM3.0 in essence becomes a non-feature. That's why when I hear stuff like the quoted from nvidia's conference call it makes me think that maybe nvidia knows this, but even if that were the case they sure as heck aren't going to admit it. Then again, it could be the case that the 6800 cards will be fine.

"Branching shaders in SM3.0 - blessing or curse? Next on Geraldo!"
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: acx
That is definitely not just gamma. The shading on the rocks have a wider range of lighting.

Yah, you are right. I was just looking at the character model and saw that each screeny had Sam in a slightly different position, just enough for his body to block the lights off his hand, for example. But now I see what you are saying. But without a doubt they did turn up the gamma to make the game seem more vibrant.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Jeff7181
Originally posted by: Bar81
I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?
That's not what's being said at all... did you miss the part saying "I don't think you'll see any radical changes in architecture?" What does that mean? To me it means they'll bump up the speed to handle HDR and displacement mapping without halving frame rates.
But likewise, that implies that current SM3.0-capable parts have performance issues that will result in halved frame rates when those graphics features are enabled in-game. Hmm.

Also, when an NV spokesperson talks about minimal changes in architecture, that doesn't mean that you might not see radical changes in the hardware implementation of that architecture. What I mean is that he might have been speaking of the software architecture/programming model of SM3.0, which likely won't see any major revisions again any time soon. But that doesn't mean that there might not be potential performance issues with current-gen SM3.0 parts, or that NV might not take some radical steps with next-gen parts to alleviate some of those issues (such as implementing "virtual pipelines" and SMT). Still, without proof, this is all just speculative commentary on my part, but it seems logical to me.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: VirtualLarry
Originally posted by: Jeff7181
Originally posted by: Bar81
I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?
That's not what's being said at all... did you miss the part saying "I don't think you'll see any radical changes in architecture?" What does that mean? To me it means they'll bump up the speed to handle HDR and displacement mapping without halving frame rates.
But likewise, that implies that current SM3.0-capable parts have performance issues that will result in halved frame rates when those graphics features are enabled in-game. Hmm.

Also, when an NV spokesperson talks about minimal changes in architecture, that doesn't mean that you might not see radical changes in the hardware implementation of that architecture. What I mean is that he might have been speaking of the software architecture/programming model of SM3.0, which likely won't see any major revisions again any time soon. But that doesn't mean that there might not be potential performance issues with current-gen SM3.0 parts, or that NV might not take some radical steps with next-gen parts to alleviate some of those issues (such as implementing "virtual pipelines" and SMT). Still, without proof, this is all just speculative commentary on my part, but it seems logical to me.


Just got back, there's your answer apoppin, said much more eloquently than I could have said it.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Can anyone with a 6800 post a couple of pics comparing 1.1 vs. 3.0? Just want to see the image quality difference with no gamma alteration.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
If you're not going to stay on point then get out of the discussion.
I am on point, you just can't read and/or comprehend.

You can't just make the discussion about whatever you want.
Isn't it clear that the discussion is about SM 3.0? You can't just ignore arguments because they don't fit into your childish view of the world.

Please get an education and learn what ignorant means.
I'll do that right after you have your twelfth birthday.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Bar81
Originally posted by: VirtualLarry
Originally posted by: Jeff7181
Originally posted by: Bar81
I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?
That's not what's being said at all... did you miss the part saying "I don't think you'll see any radical changes in architecture?" What does that mean? To me it means they'll bump up the speed to handle HDR and displacement mapping without halving frame rates.
But likewise, that implies that current SM3.0-capable parts have performance issues that will result in halved frame rates when those graphics features are enabled in-game. Hmm.

Also, when an NV spokesperson talks about minimal changes in architecture, that doesn't mean that you might not see radical changes in the hardware implementation of that architecture. What I mean is that he might have been speaking of the software architecture/programming model of SM3.0, which likely won't see any major revisions again any time soon. But that doesn't mean that there might not be potential performance issues with current-gen SM3.0 parts, or that NV might not take some radical steps with next-gen parts to alleviate some of those issues (such as implementing "virtual pipelines" and SMT). Still, without proof, this is all just speculative commentary on my part, but it seems logical to me.


Just got back, there's your answer apoppin, said much more eloquently than I could have said it.

what?

might . . . might .. . maybe . . . . doesn't mean . . . . might not . . . . speculative and . . . without proof . . . seems logical

yeah, that's pretty eloquent

:D