The Ultimate SM3.0 Game Thread


Bar81

Banned
If you're not going to stay on point, then get out of the discussion. You can't just make the discussion about whatever you want. Maturity involves understanding boundaries.

Please get an education and learn what "ignorant" means. You're killing me with your profound ignorance.
 

apoppin

Lifer
Originally posted by: Bar81
Because many of you obviously failed English, I'm going to go over this AGAIN:


This thread is simply trying to determine whether SM3.0 is a factor that should be legitimately considered when making a purchasing decision on this generation of cards. To wit, does having SM3.0 confer any sort of tangible advantage such that one could say that having an SM3.0 card is a "must" over a card without the feature? To determine that, I am gathering a list of SM3.0-enabled games currently available and forthcoming in the near future, to allow people to answer the question on their own.

And I am voting "Yes."

SM 3.0 - IMO - is a factor I would consider if I were buying a video card today.

Because:

1) Upcoming games - shortly - will use SM 3.0, and they will use it efficiently - unlike most of the current "SM 3.0 enabled" games.

2) ATI is in this with nVidia in providing SM 3.0 - game developers WILL use this feature universally.

The exception: IF you upgrade your video card EVERY year . . . pick whichever card without regard to SM 3.0, if you don't mind "missing out" on the games that DO feature it.

You, Bar81, do NOT know which upcoming games will feature SM 3.0 or not . . . your "list" is illusory.

There . . . I answered your "new" restated topic without flaming, and in as reasonable a manner as I can manage after being insulted by you for disagreeing.

I'd like to see you answer my points.
 

Bar81

Banned
You gave an opinion without flaming; good for you. Except the thread HAS NOTHING TO DO WITH YOUR OPINION. If you are making an informed decision as to SM3.0 and a new graphics card purchase, that's all I ask. Enjoy your 6800 card.


As to the list, it only includes what's been announced. If there are SM3.0-supporting games that I haven't mentioned, please tell me what they are so I can add them to the OP.

As to whether
"1) upcoming games - shortly - will use SM 3.0 and they will use it efficiently - unlike most of the current "SM 3.0 enabled" games. "

I'm interested to see if that's the case. Unfortunately, we don't know, as the only evidence we have is listed in the original post. Anything more is pure speculation, and you have to decide whether a speculatively useful feature is a feature you have to have; I can't make that decision for anyone but myself.

As to:
"2) ATI is in this with nVidia in providing SM 3.0 - game gevelopers WILL use this feature universally "

I think you are correct. But again, the issue is whether the current-generation 6800 cards will be able to run at acceptable framerates when these new PS2.0-class shader effects are implemented via SM3.0.
 

hans030390

Diamond Member
Pretty much any next-gen game supports SM3.0, and YES, the 6800 will run it fine as long as the dev of the game programs, codes, optimizes, whatever, to ensure that the game uses SM3.0 efficiently.

I don't understand why people buy X800s now... well, that's my opinion... that's how important SM3.0 is to me.
 

Bar81

Banned
Originally posted by: hans030390
Pretty much any next-gen game supports SM3.0, and YES, the 6800 will run it fine as long as the dev of the game programs, codes, optimizes, whatever, to ensure that the game uses SM3.0 efficiently.

I don't understand why people buy X800s now... well, that's my opinion... that's how important SM3.0 is to me.


This is exactly what I'm trying to get people to do. You saw the evidence and made up your own mind. For you, SM3.0 is a must-have feature. Enjoy your 6600.
 

apoppin

Lifer
Originally posted by: Bar81
You gave an opinion without flaming; good for you. Except the thread HAS NOTHING TO DO WITH YOUR OPINION. If you are making an informed decision as to SM3.0 and a new graphics card purchase, that's all I ask. Enjoy your 6800 card.


As to the list, it only includes what's been announced. If there are SM3.0-supporting games that I haven't mentioned, please tell me what they are.

Let me politely also point out that the thread HAS NOTHING TO DO WITH YOUR OPINION either, IF it has nothing to do with mine.

And I believe you are completely wrong here - this thread is ALL about opinion . . . some more informed than others . . .

I am currently enjoying my el cheapo 9800XT . . . when I consider it time to upgrade, it'll probably be a 6800GT or Ultra . . . by then it'll also be clear to you that SM 3.0 is the way to go.

ALL future 3D FPS games will include SM 3.0 support . . . at least those as "current" as the soon-to-be-released Xbox II games . . . games that started development 3 years ago may not.
 

Bar81

Banned
Originally posted by: apoppin
Originally posted by: Bar81
You gave an opinion without flaming; good for you. Except the thread HAS NOTHING TO DO WITH YOUR OPINION. If you are making an informed decision as to SM3.0 and a new graphics card purchase, that's all I ask. Enjoy your 6800 card.


As to the list, it only includes what's been announced. If there are SM3.0-supporting games that I haven't mentioned, please tell me what they are.

Let me politely also point out that the thread HAS NOTHING TO DO WITH YOUR OPINION either, IF it has nothing to do with mine.

And I believe you are completely wrong here - this thread is ALL about opinion . . . some more informed than others . . .

I am currently enjoying my el cheapo 9800XT . . . when I consider it time to upgrade, it'll probably be a 6800GT or Ultra . . . by then it'll also be clear to you that SM 3.0 is the way to go.

ALL future 3D FPS games will include SM 3.0 support . . . at least those as "current" as the soon-to-be-released Xbox II games . . . games that started development 3 years ago may not.

What exactly is my opinion that I've stated? Of course this thread has *nothing* to do with my opinion, and I've never claimed otherwise. I don't want anyone to buy ATI or nVidia. I want people to see the facts about SM3.0 games and decide *for themselves* whether SM3.0 is a must-have feature with the current generation of cards.

Therein lies the issue. You realize that the Xbox II is rumored to sport not only an ATI R520 but dual CPUs. A game with SM3.0 enabled on that beast won't necessarily run acceptably on current-generation 6800 cards. Again, we have *no idea*, and it's all speculation, which is why I set up this thread to deal with what we *do* know, as opposed to everyone giving their opinion as to what will happen in the unknown future.
 

Munky

Diamond Member
I'm drawing the conclusion that we (or I) don't know enough to make an informed decision. People who bought current-gen nVidia cards must think it's important to have SM3, but how the cards will run SM3 code in future games is yet to be determined. I personally don't have an SM3 card, and I'm not worried about it, because SM3 is not important to me. (I'm too busy playing GT4 anyway, lol.)
 

apoppin

Lifer
Here is a pretty interesting article that discusses SM 3.0 in detail
NVIDIA Shows Off the GeForce 6800 Advantage: Shader Model 3.0 Exposed
Graphics experts consider Shader Model 3.0 a more advanced superset of Shader Model 2.0, with a more sophisticated approach to programming, which may encourage game developers to create higher-quality titles in terms of graphics. It should be mentioned that basically all SM 3.0 effects can be achieved using SM 2.0, and are likely to be, because of broader compatibility. However, because of the performance advantage, game developers are projected to use SM 3.0 on supporting hardware in order to allow their games to perform better. Moreover, those who have SM 3.0-supporting hardware may eventually see some image quality benefits too.

Back in 2001, NVIDIA downplayed the importance of Pixel Shader 1.4, developed by ATI, saying that game developers would jump on shaders 1.1 and the yet-to-be-released 2.0. In early 2003, NVIDIA encouraged game developers to use 1.4 instead of 1.1 and 2.0 because of its dramatic speed advantage. What did not bring a boost for the RADEON 8500 brought an advantage for the GeForce FX that came out 1.5 years later.

All in all, Shader Model 3.0 is definitely a feature for consideration now, even though its broad use seems to be 6-12 months away.
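To make the article's point concrete: shipping an SM 3.0 path with an SM 2.0 fallback starts with a capability check. Here is a minimal Direct3D 9 sketch - my own illustration, not from the article, and the helper name is hypothetical:

#include <d3d9.h>

// Ask Direct3D 9 which pixel shader model the installed card exposes,
// then pick a shader compile target accordingly. SM 3.0 parts
// (GeForce 6800-class) report 3.0; earlier DX9 parts report 2.0.
const char* pickShaderProfile(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return "unknown";
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return "ps_3_0";   // use the SM 3.0 shaders
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "ps_2_0";   // fall back to the SM 2.0 versions of the same effects
    return "ps_1_x";       // DX8-class fallback
}

The same effects can compile under both targets; the 3.0 path simply gains features like dynamic branching that can make them cheaper to run.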

That article is 6 months old . . . games using SM 3.0 "natively" are less than a year away . . . upgrade to a non-SM 3.0 card at your own risk.

And don't say I didn't warn you.
 

MisterChief

Banned
How about I email nVidia and see what they say? I'll just mention that the largest flame war ever inspired me to write to them. I'm sure they will be very cooperative...
 

Bar81

Banned
Originally posted by: MisterChief
How about I email nVidia and see what they say? I'll just mention that the largest flame war ever inspired me to write to them. I'm sure they will be very cooperative...


That would actually be really cool. If you do it, please ask them for a list of SM3.0-supporting games out now and scheduled. I looked everywhere on their site for a list and couldn't find one.
 

MisterChief

Banned
Originally posted by: Bar81
Originally posted by: MisterChief
How about I email nVidia and see what they say? I'll just mention that the largest flame war ever inspired me to write to them. I'm sure they will be very cooperative...


That would actually be really cool. If you do it, please ask them for a list of SM3.0-supporting games out now and scheduled. I looked everywhere on their site for a list and couldn't find one.

I seriously doubt they will respond, but it's worth a try. :)
 

Bar81

Banned
Originally posted by: apoppin
Here is a pretty interesting article that discusses SM 3.0 in detail
NVIDIA Shows Off the GeForce 6800 Advantage: Shader Model 3.0 Exposed
Graphics experts consider Shader Model 3.0 a more advanced superset of Shader Model 2.0, with a more sophisticated approach to programming, which may encourage game developers to create higher-quality titles in terms of graphics. It should be mentioned that basically all SM 3.0 effects can be achieved using SM 2.0, and are likely to be, because of broader compatibility. However, because of the performance advantage, game developers are projected to use SM 3.0 on supporting hardware in order to allow their games to perform better. Moreover, those who have SM 3.0-supporting hardware may eventually see some image quality benefits too.

Back in 2001, NVIDIA downplayed the importance of Pixel Shader 1.4, developed by ATI, saying that game developers would jump on shaders 1.1 and the yet-to-be-released 2.0. In early 2003, NVIDIA encouraged game developers to use 1.4 instead of 1.1 and 2.0 because of its dramatic speed advantage. What did not bring a boost for the RADEON 8500 brought an advantage for the GeForce FX that came out 1.5 years later.

All in all, Shader Model 3.0 is definitely a feature for consideration now, even though its broad use seems to be 6-12 months away.

That article is 6 months old . . . games using SM 3.0 "natively" are less than a year away . . . upgrade to a non-SM 3.0 card at your own risk.

And don't say I didn't warn you.

Look, I hope those games are coming. I just don't understand, if they are 6 months away, why no one is even preliminarily declaring SM3.0 support, and why nVidia, who hyped the feature all over the place, doesn't even have a list of SM3.0-enabled games on its site to reassure buyers of the 6800 that SM3.0-native games are coming, and soon.

 

VirtualLarry

No Lifer
Originally posted by: ZobarStyl
but I doubt (with Far Cry as my basis) that SM3.0 in 6xxx cards is the same as DX9 in 5xxx cards, or we would've seen a massive downturn in performance in Anand's review. My two cents.
I guess the primary question is, then: is the performance dip when "2.0++" mode is enabled in Chronicles of Riddick simply poor coding/optimization for SM3.0 on the part of the developers, or is it truly the harbinger of things to come, in terms of SM3.0 core performance on current-gen GPU parts? That's the key, and at this point, with only a single "real" data point, it's very hard to say. (IOW, it is a bit dangerous to extrapolate, but I've made a few "what if" posts recently along that same line of thinking.) I have no idea whether the UE3 tech demo shown was more CPU- or GPU-heavy, or really any details at all about it. I would be interested in finding out more. More data points for SM3.0 performance comparison will help too.
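To make the "single data-point" worry concrete, here is a minimal sketch (with invented placeholder numbers, purely for illustration - NOT measurements) of the comparison that more benchmarks would enable:

#include <cstdio>
#include <vector>

// Hypothetical per-title framerates, for illustration only.
// Each entry: frames/sec on the SM 2.0 path vs. the SM 3.0 path.
struct Sample { const char* title; double fpsSM2, fpsSM3; };

int main()
{
    std::vector<Sample> samples = {
        {"placeholder title A", 60.0, 54.0},   // invented numbers
        {"placeholder title B", 45.0, 47.0},
        {"placeholder title C", 80.0, 70.0},
    };
    for (const Sample& s : samples) {
        double deltaPct = 100.0 * (s.fpsSM3 - s.fpsSM2) / s.fpsSM2;
        std::printf("%-20s SM3 vs SM2: %+.1f%%\n", s.title, deltaPct);
    }
    // One title can't separate a bad port from an architectural limit;
    // a consistent sign across many titles is what would indicate a trend.
    return 0;
}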
 

Bar81

Banned
Originally posted by: VirtualLarry
Originally posted by: ZobarStyl
but I doubt (with Far Cry as my basis) that SM3.0 in 6xxx cards is the same as DX9 in 5xxx cards, or we would've seen a massive downturn in performance in Anand's review. My two cents.
I guess the primary question is, then: is the performance dip when "2.0++" mode is enabled in Chronicles of Riddick simply poor coding/optimization for SM3.0 on the part of the developers, or is it truly the harbinger of things to come, in terms of SM3.0 core performance on current-gen GPU parts? That's the key, and at this point, with only a single "real" data point, it's very hard to say. (IOW, it is a bit dangerous to extrapolate, but I've made a few "what if" posts recently along that same line of thinking.) I have no idea whether the UE3 tech demo shown was more CPU- or GPU-heavy, or really any details at all about it. I would be interested in finding out more. More data points for SM3.0 performance comparison will help too.


In complete agreement. Thanks for the excellent post.
 

VirtualLarry

No Lifer
Originally posted by: stnicralisk
nVidia is trying a new technology. ATi can bash it all they want; come this time next year, they will be preaching about how good it is (because they'll have that feature also).
As much as Rollo might believe me to be some sort of ATI supporter, I have to fully and totally agree with the above statement. The truth is, the marketing "spin" departments of both NV and ATI are pretty bad, and will say anything to appear competitive. (Hence the proliferation of paper launches we've seen as well.)
 

VirtualLarry

No Lifer
Originally posted by: Rage187
Bar81, aren't you the same guy who screamed and cried that his 6800GT didn't do 1600x1200 over DVI, then found out it actually did and played it off?
FWIW, though - didn't he do the responsible thing and contact the vendor before "spouting off," and BFG confirmed (incorrectly) the limitation - but in the end, it turned out to be a BIOS/driver compatibility issue, in conjunction with the EDID timing data reported by a certain subset of LCDs on the market? IOW, NV and the LCD manufacturer were eventually responsible, and his initial report that it plain didn't work wasn't entirely true in general - but in his particular instance it was. Playing with the "bleeding edge" of tech sometimes does that.
Originally posted by: Rage187
Didn't you even call a bunch of board members stupid for arguing with you?
Let me see if I can find the thread; it was a classic troll thread, as is this one.
Well, at the time, I didn't think it was much of a "troll thread"; it turned out that it was a (limited) compatibility problem. But I daresay that publicizing the issue helped get it resolved and understood by the community, even if the initial premise did turn out to be not 100% correct.

He shouldn't shoulder all of the "blame" (and I don't think that any end user deserves any for this issue, personally) - if you want to "blame" anyone about that incident, it should be NV for their reference BIOS/drivers, and BFG's tech support for not knowing the whole picture.
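For anyone wondering what "EDID timing data" refers to: the monitor hands the driver a 128-byte block describing the modes it accepts, and the preferred mode lives in a fixed-format descriptor inside it. A minimal sketch of decoding that descriptor - assuming you already have a raw EDID dump (obtaining one is OS-specific), and purely my own illustration:

#include <cstdint>
#include <cstdio>

// Decode the first (preferred) detailed timing descriptor of an EDID 1.x
// base block. Single-link DVI tops out around a 165 MHz pixel clock, which
// is why the advertised timing for 1600x1200 (162 MHz at 60 Hz) matters.
void printPreferredTiming(const uint8_t edid[128])
{
    const uint8_t* d = edid + 54;                   // first 18-byte descriptor
    unsigned pixelClock10kHz = d[0] | (d[1] << 8);  // stored in 10 kHz units
    unsigned hActive = d[2] | ((d[4] & 0xF0) << 4); // upper 4 bits live in byte 4
    unsigned vActive = d[5] | ((d[7] & 0xF0) << 4); // upper 4 bits live in byte 7
    std::printf("Preferred mode: %ux%u, pixel clock %.1f MHz\n",
                hActive, vActive, pixelClock10kHz / 100.0);
}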
 

Bar81

Banned
Originally posted by: VirtualLarry
Originally posted by: Rage187
Bar81, aren't you the same guy who screamed and cried that his 6800GT didn't do 1600x1200 over DVI, then found out it actually did and played it off?
FWIW, though - didn't he do the responsible thing and contact the vendor before "spouting off," and BFG confirmed (incorrectly) the limitation - but in the end, it turned out to be a BIOS/driver compatibility issue, in conjunction with the EDID timing data reported by a certain subset of LCDs on the market? IOW, NV and the LCD manufacturer were eventually responsible, and his initial report that it plain didn't work wasn't entirely true in general - but in his particular instance it was. Playing with the "bleeding edge" of tech sometimes does that.
Originally posted by: Rage187
Didn't you even call a bunch of board members stupid for arguing with you?
Let me see if I can find the thread; it was a classic troll thread, as is this one.
Well, at the time, I didn't think it was much of a "troll thread"; it turned out that it was a (limited) compatibility problem. But I daresay that publicizing the issue helped get it resolved and understood by the community, even if the initial premise did turn out to be not 100% correct.

He shouldn't shoulder all of the "blame" (and I don't think that any end user deserves any for this issue, personally) - if you want to "blame" anyone about that incident, it should be NV for their reference BIOS/drivers, and BFG's tech support for not knowing the whole picture.




THANK YOU. FINALLY someone who understands.
 

apoppin

Lifer
Originally posted by: Bar81
Originally posted by: apoppin
Here is a pretty interesting article that discusses SM 3.0 in detail
NVIDIA Shows Off the GeForce 6800 Advantage: Shader Model 3.0 Exposed
Graphics experts consider Shader Model 3.0 a more advanced superset of Shader Model 2.0, with a more sophisticated approach to programming, which may encourage game developers to create higher-quality titles in terms of graphics. It should be mentioned that basically all SM 3.0 effects can be achieved using SM 2.0, and are likely to be, because of broader compatibility. However, because of the performance advantage, game developers are projected to use SM 3.0 on supporting hardware in order to allow their games to perform better. Moreover, those who have SM 3.0-supporting hardware may eventually see some image quality benefits too.

Back in 2001, NVIDIA downplayed the importance of Pixel Shader 1.4, developed by ATI, saying that game developers would jump on shaders 1.1 and the yet-to-be-released 2.0. In early 2003, NVIDIA encouraged game developers to use 1.4 instead of 1.1 and 2.0 because of its dramatic speed advantage. What did not bring a boost for the RADEON 8500 brought an advantage for the GeForce FX that came out 1.5 years later.

All in all, Shader Model 3.0 is definitely a feature for consideration now, even though its broad use seems to be 6-12 months away.

That article is 6 months old . . . games using SM 3.0 "natively" are less than a year away . . . upgrade to a non-SM 3.0 card at your own risk.

And don't say I didn't warn you.

Look, I hope those games are coming. I just don't understand, if they are 6 months away, why no one is even preliminarily declaring SM3.0 support, and why nVidia, who hyped the feature all over the place, doesn't even have a list of SM3.0-enabled games on its site to reassure buyers of the 6800 that SM3.0-native games are coming, and soon.

OK . . . now you ARE asking for my opinion.

ATI has already declared support for SM 3.0 in its next core - this spring . . . it becomes a "non-issue" for nVidia and future games, since it will be a universal feature . . . game developers will use SM 3.0 to make their games perform better, with some IQ benefits also . . . that much is clear from SM 3.0 demos . . . not as clear from "SM 3.0 patched" games. ;)

BTW, thanks for maintaining a civil debate over the last few posts with me . . . .