Shader model 3.0 (a civilized discussion)


MisterChief

Banned
Dec 26, 2004
1,128
0
0
BOOM! Let the war continue...

You all must understand that ATI must release cards that support SM 3.0 in order to stay competitive (or alive) in the near future.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: sellmen
Originally posted by: xtknight
Originally posted by: sellmen
Originally posted by: xtknight
Originally posted by: sellmen
Originally posted by: hans030390
Remember that the x800xl only has Sm2.0...so performance of today's games might be better, but SM3.0 DOES look better and WILL be needed for next gen games (unless you prefer the crappy old style graphics of 2005 lol)

Sm3.0 has displacement mapping. read my first post to know what it is. SM2.0 doesnt have that, and next gen games rely HEAVILY on it.

Remember, there are two sides to this. If you are going to buy a card, and don't upgrade often, you need to either wait for the R520 cards (which are uber expensive) or get a 6xxx card to be safe for the future. If you upgrade every year, it hardly matters what you get. X800 or 6800, they'll both do fine.

Now, you must understand that new ati and nvidia cards will most likely be out soon, sothe 6xxx cards will be outdated. BUT, i do not want to talk about next gen cards, as they are really just rumors about their specs and what not so far.

One point i want to make, the 6xxx will do better and look better than the x800 cards will in next gen games. Will the 6xxx do great? I don't know, maybe not. That is for next gen cards. You are generally assured more "future-proofness" with a SM3.0 card.

um yeah...totally forgot what i was gonna say, but keep up the discussion, maybe i'll remember ;)

If you are saying that a 6800GT may be faster in some games than a X800XL in two years due to SM3.0, I'd agree with you. In that way, the 6800GT is more future proof than the X800XL.

If you are saying that a 6600GT is more future proof than an X800XL - which it seems you are - I'd say you are completely wrong. The raw power of the X800XL makes it a much more future proof card, SM3.0 or not.

assuming the game has SM2.x fallback shaders:

if you're talking about performance future-proof, yes.
but visual quality future-proof, no.
but then again, the 6600GT might not be able to take advantage of its extra visual quality if it doesn't have the performance to run it, making the feature worthless. it's a big mess.

I'd say its even "visual quality future proof", because an X800XL will let you turn on AA/AF, detail, and resolution settings that a 6600GT won't. Even if a game has no fallback shaders, odds are the game will look better on the X800XL.

You'd certainly miss out on the SM3.0 image quality enhancements though.

I doubt AA/AF will compensate for what SM3.0 will bring. Look at the Far Cry screenshots at HardOCP. I think I'd notice that more than I'd notice a little gray on rough edges. Even without AF I still think the SM3.0 would look better. I don't know...if you don't have AF on, floors can look like crap ahead of you. Detail certainly makes a difference though. It all depends on what the 6600GT can do and what it can't. We'll just have to wait and see.

Let me just copy-paste part of Crieg's post regarding Far Cry:

Q: 7) What aspects of the screenshots seen at the launch event are specific examples of the flexibility and power of Shader 3.0?

A: In current engine there are no visible difference between PS2.0 and PS3.0. PS3.0 is used automatically for per-pixel lighting depending on some conditions to improve speed of rendering.

Q: 8) Is the same level of image quality seen when using Shader 3.0 possible using Shader 2.0? If so, what dictates which Shader you decide to use?

A: In current generation engine quality of PS3.0 is almost the same as PS2.0. PS3.0 is used for performance optimization purposes.

So the Far Cry screens are not useful for comparing IQ differences.

I doubt that SM3.0 will provide an image quality enhancement that is greater than AA, AF, detail levels, and resolution combined.

http://www.hardocp.com/article.html?art=NjA5

The following before-and-after images are from the CryEngine. Developer Crytek uses Shader Model 3.0 techniques (vs. 1.x shaders) to add more depth and realism to the scenes. Notice the more realistic look and feel when SM 3.0 is applied to the bricks and stones that make up the staircase. In the scene featuring the Buddha, the full image comes to life with the use of SM 3.0, with the technique applied to multiple objects.

can someone explain what these screenshots are, if there is "no visible difference between PS2.0 and PS3.0"?

Were then the visible image improvements in the screenshots a byproduct of "Vertex Shader 3.0"?

hmm i just realized it's comparing SM3.0 to SM1.1. now i'm really confused. :confused: so Far Cry originally used only SM1.1?

In current engine there are no visible difference between PS2.0 and PS3.0.

isn't that implying they used SM2.0?

Originally posted by: Creig
http://www.nordichardware.com/Articles/?page=2&skrivelse=346 (Simple Far Cry tests running Sm3.0...Shows that it slightly improves performance, while also improving eye candy)
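Crytek's "PS3.0 is used automatically for per-pixel lighting ... to improve speed" answer is about dynamic branching: a PS3.0 shader can skip lights that can't reach a pixel, where a PS2.0 shader evaluates every light regardless. A toy Python model of the idea (not real shader code; the 1-D screen and light setup are invented for illustration):

```python
# Toy model of why dynamic branching (a PS3.0 feature) speeds up
# per-pixel lighting without changing the image.

def attenuation(dist, radius):
    """Light contribution; exactly 0 at and beyond the light's radius."""
    return max(0.0, 1.0 - dist / radius)

def shade(pixels, lights, branch):
    """Shade each pixel by every light. With branch=True (PS3.0-style),
    skip out-of-range lights early; count the 'work' done either way."""
    work = 0
    image = []
    for px in pixels:
        color = 0.0
        for (lx, radius) in lights:
            dist = abs(px - lx)
            if branch and dist >= radius:
                continue          # dynamic branch: skip this light entirely
            work += 1             # pretend each lighting eval costs 1 unit
            color += attenuation(dist, radius)
        image.append(color)
    return image, work

pixels = [float(x) for x in range(100)]            # 1-D "screen"
lights = [(10.0, 5.0), (50.0, 5.0), (90.0, 5.0)]   # (position, radius)

img_flat, work_flat = shade(pixels, lights, branch=False)  # PS2.0-style
img_br,   work_br   = shade(pixels, lights, branch=True)   # PS3.0-style

print(img_flat == img_br)       # same image
print(work_br < work_flat)      # far fewer lighting evaluations
```

Skipped lights contribute exactly zero here, so the two paths render identical images; only the amount of work differs, which matches Crytek's "performance optimization, no visible difference" description.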
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
;) LOL @ above, where someone said the X800 will be more futureproof because of raw power. (In that case, get an old GeForce 2, VapoChill it, and overclock it to 800/1600 - OK, that was just a made-up, totally crap example.) A card isn't futureproof because of high MHz; if it doesn't have the architectural extensions to run new games in DX9.0c with PS 3.0, it won't run them well at any overclock.

An X800 isn't even "today proof", as some new games want DX9.0c, like Band Of Bros. I'm not saying you can't play on a non-NVIDIA card - take that up with Ubisoft.

Let's wait till April to fairly compare the newest NVIDIA card to the X850, as the NV40 is 10 months old.
 

sellmen

Senior member
May 4, 2003
459
0
0
Originally posted by: xtknight
*snip - the full quote pyramid from the exchange above*

From Crieg's post above:

Additionally, NVIDIA recently released screenshots purported to show the difference between PS30 and PS1x, but in a later interview with NVIDIA's VP of Technical Marketing Tony Tamasi, he said, "Yeah, the images that you've seen from Far Cry, the current path, those are actually Shader Model 2.0, and anything that runs Shader Model 2.0 should be able to produce those images." In fact, there is no game currently available that uses SM30, and games that support the features of SM30 aren't expected to show up this year.

That's referring to the Far Cry screen shots; the comparison was never valid.

Originally posted by: imported_humey
*snip*

I was referring to the X800XL versus the 6600GT. Why don't you link some benchmarks showing how the X800 isn't "today proof"? I'd like to see how X800 can't run today's games "at any overclock".
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
1st, I never said it couldn't play today's games; I said it's not futureproof, since it's not even DX9.0c.

2nd, please don't flood the screen with lots of quotes. It's a PIA, and I feel for 56k users trying to load two screens of repeats. :D
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: sellmen
Additionally, NVIDIA recently released screenshots purported to show the difference between PS30 and PS1x, but in a later interview with NVIDIA's VP of Technical Marketing Tony Tamasi, he said, "Yeah, the images that you've seen from Far Cry, the current path, those are actually Shader Model 2.0, and anything that runs Shader Model 2.0 should be able to produce those images." In fact, there is no game currently available that uses SM30, and games that support the features of SM30 aren't expected to show up this year.

That's referring to the Far Cry screen shots; the comparison was never valid.

oh ok, thanks for clearing that up.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
There is a thread for SM3.0 games, and I think the new Splinter Cell is one of them; it's out at the end of this month. COD2 and BloodRayne 2 are also due at the end of the month, but no clue on their configuration.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I'm actually not sure if the x800 will perform better than a 6600gt in future games. The 6600gt will look nicer, and with 3.0 it's supposed to take less power to look that nice (2.0 has to work really, really hard to look like 3.0). So actually, I think the x800 (if set to look as nice as the 6600gt) will either fall behind or be about equal. That's what I was trying to make of what I said :-\ sorry
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Creig
Originally posted by: housecat
Originally posted by: Noob
Do you guys think the 6800's will even have the power to run next generation games that are SM 3.0 supported?

I do.
I can run insane resolutions with HDR on in Farcry today.

You're running a pair of 6800GT OC's in SLI, not a single GT6800.

Well he said "6800s".
I have 6800s.

2nd revision SM3? What exactly is THAT supposed to be? SM3.0 is SM3.0, the specs don't change simply because there's a new core design.

Well, it's called further optimization of the SM3 hardware NV developed long before ATI.

It's their 2nd go-round on retail hardware with SM3; it's ATI's first. Odds are NV's will be better implemented and faster.


As neither next generation card have been released yet it's way too early to start making predictions which card will be better.

Isnt this thread all about predictions?

Or are predictions that favor Nvidia (because all signs DO point to a continued NV domination) not allowed?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: humey
;) LOL @ above, where someone said the X800 will be more futureproof because of raw power. (In that case, get an old GeForce 2, VapoChill it, and overclock it to 800/1600 - OK, that was just a made-up, totally crap example.) A card isn't futureproof because of high MHz; if it doesn't have the architectural extensions to run new games in DX9.0c with PS 3.0, it won't run them well at any overclock.

Ok, so you'd rather play SM3.0 games at 20-30 frames vs. SM2.0 games at 50-60 frames with an X800XL? :Q (X800XL vs. 6600GT)

Also, just because a card supports some features doesn't mean anything. Analogy: what you are saying is that Intel Extreme Graphics 2 is better than a GeForce4 Ti 4200 because it supports DX9.0 features despite having less raw processing power. So you'd rather take the latest features and image quality over raw processing power and have your gaming choppy?

Saying that the 6600GT is more futureproof than the X800XL is simply ridiculous. Apparently 99% of everyone here missed my post comparing raw processing performance in the most shader-intensive games today, where ATI wins. If history is anything to go by (e.g. 9800 Pro vs. 5900U), the card that performs faster in the most shader-intensive games is more futureproof, regardless of what features it supports. In fact, I think it's better to have a card with slightly outdated features that can play games smoothly than a card with the latest features but half the processing power.

You guys are basically saying that 6200/6600/6600GT cards are more futureproof than X800Pro and X800/X850XT cards because they support SM3.0??? lol (keeping in mind that there is no game that shows any noticeable performance improvement with SM3.0)

Originally posted by: hans030390
I'm actually not sure if the x800 will perform better than a 6600gt in future games...the 6600gt will look nicer, and with 3.0, it's supposed to take
less power to look that nice (2.0 has to work really really hard to make it look like 3.0) so actually, i think the x800 (if set to look as nice as the 6600gt) will either fall behind or be about equal...that's what i was trying to make of what i said :-\ sorry

That would only be true if Nvidia cards performed faster in shader-intensive applications. SM3.0 brings more complexity to graphics. Based on all the latest games, ATI cards are faster in most graphically intensive titles (except Doom 3). So where is the logic that says cards which are slower in the less demanding SM2.0 mode are suddenly going to be faster in SM3.0 mode?
 

bcoupland

Senior member
Jun 26, 2004
346
0
76
I think that this discussion is somewhat irrelevant. When UE3 comes out, there will very likely be a $200 midrange card that will run UE3 like a 6600GT runs HL2. Upgrading now for a game that, at best, is about a year away is not very wise. Remember all of those people who thought they would run HL2 great because they had a 5900 Ultra that supported DX9? Remember that DX9 games only really came out two years after the 9700 Pro was released? Also, now that DX9 games have finally come out in full force, a 6600GT or even an X700 Pro outperforms the 9700 Pro. It is quite easy to imagine that in a year, a midrange card from the next generation will be more or less equal to an X800XT/6800Ultra, and have more features.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: housecat
Originally posted by: Creig
Originally posted by: housecat
Originally posted by: Noob
Do you guys think the 6800's will even have the power to run next generation games that are SM 3.0 supported?

I do.
I can run insane resolutions with HDR on in Farcry today.

You're running a pair of 6800GT OC's in SLI, not a single GT6800.

Well he said "6800s".
I have 6800s.

[grammar nazi]
i'm guessing he was implying "the 6800s" as in "6800 series", not a pair of 6800s in SLI. if he was implying a pair of 6800s there wouldn't be an article before "6800s", unless they were previously referred to, which they were not. and, regardless of that, a pair of 6800s isn't one product, it's two. typically two products are not compared to one, that would be unfair.
[/grammar nazi]

anyway...nothing much more to discuss about SM3.0. i think we've thoroughly beaten it to a pulp.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
To all, mostly RussianSensation:
OK, say you are playing a game based off of UE3, and you have a 6800 of some sort (I realized the 6600gt wouldn't fit for my example) and an x800 of some sort. If you played UE3 in ONLY SM2.0 (same settings) with both cards, the x800 would most likely win. BUT, if you let the x800 run at UBER graphics (only 2.0 though) and the 6800 do its uber graphics (on 3.0), I am thinking you wouldn't see much of a performance difference between the cards. The 6800 might be a little slower (oh boy, it might run at 30fps, which is the STANDARD for console games), but in the end it would look a lot nicer and you wouldn't see a huge hit (compared to how much eye candy it would add). That is the basic concept of SM3.0: add eye candy, improve performance. It's already slightly available with Far Cry, which was built up from 2.0, so when we see games that are built from the ground up with 3.0 in mind, then we will see the importance of it.

I think most future games will require you to play on 3.0 if you even want them to look decent. But, as I always say, if you upgrade a lot, don't worry about 2.0 vs. 3.0 right now. If you are like me, and upgrade once every gaming generation (4 years or so), you need to be as future proof as possible with less money spent. Therefore, I chose a 6600gt, as it performed well but had SM3.0. I COULD have gotten, say, an x800 (vanilla) if I wanted; it's faster, but only supports 2.0.

If I were the type who upgrades often, I couldn't care less about 3.0, but I'm not. So I hope you see where I get my thoughts and opinions about it.

Oh yeah, you said "You guys are basically saying that 6200/6600/6600GT cards are more futureproof than X800Pro and X800/X850XT cards because they support SM3.0??? lol (having in mind that there is no game that shows any noticeable performance improvement with SM3.0) "

I saw some place saying that SM3.0 on far cry boosts performance of the 6800 A LOT. It actually boosts it to the point of X800pro framerates...and not only that, it looks BETTER...huh, thats odd, more eye candy with more framerate!!! woot! sm3.0 is great, and yes there is a difference. Can you imagine what games will be like that are built up from the ground with 3.0?

When games like that are out, I wanna see someone do a test on some type of 6xxx card (preferably 6800gt or ultra) and run it in Sm2.0, 2.1, and 3.0 and see what the difference in framerate, and then compare that to how much eye candy each step up adds. I think we'll all be surprised.

Well, i was redundant there, sorry. As for now, keep up the discussion...good thing it isn't a flame war like that last SM3.0 thread ;)

and lets not get into SLI :-
 

leedog2007

Senior member
Nov 4, 2004
396
0
0
Does anyone remember when DX9 first appeared as an option in graphics cards? I bought a 9500 Pro (awesome card btw) on the assumption that it would run DX9 games at the settings I wanted. The first game that actually fully used DX9 came out about a year and a half after that, and I found I couldn't run it at the settings I wanted, so I upgraded to a 9800 Pro. My point is that the SM3 debate seems a lot like the DX9 debate. By the time the Unreal 3 engine is out, the midrange SM3 cards will not have enough horsepower to run it at desirable settings. That's my 2 cents.
 

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Originally posted by: hans030390
I saw some place saying that SM3.0 on far cry boosts performance of the 6800 A LOT. It actually boosts it to the point of X800pro framerates...and not only that, it looks BETTER...huh, thats odd, more eye candy with more framerate!!! woot! sm3.0 is great, and yes there is a difference. Can you imagine what games will be like that are built up from the ground with 3.0?

FarCry doesn't look any "BETTER" with SM3.0. It does give a slight boost in framerates, but it does nothing to the eye candy.

Nothing wrong with saying the 6600GT is a better buy than the X800XL due to the price, but saying the 6600GT is more future proof than the X800XL because it supports SM3.0 is just ridiculous. If you fall for SM3.0 marketing on lower-end cards, you're just as dumb as the people who bought an FX5200 over a GeForce4 Ti 4200 thinking it was the better gaming card because it supported DX9.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: hans030390
To all, mostly russian sensation
OK, say you are playing a game based off of Ue3. OK, you have a...6800 of some sort (i realized that the 6600gt wouldnt fit for my example) and an x800 of some sort. If you played UE3 in ONLY SM2.0 (same settings) with both cards, the x800 would win most likely.

How can you say most likely!?!? Check the current comparisons of the X800XL vs. the 6600GT - the X800XL, with its double the pipelines, is almost twice as fast in GPU-power-limited situations (see RussianSensation's link comparing Far Cry 1.3). Pick the fights you can win. Will a 6800GT be faster than an X800XL at Unreal 3? Most likely (it's all conjecture at this point, however). Will a 6600GT be faster than an X800XL? I highly doubt it, unless you can explain to me exactly why. What features in SM3.0 allow it to perform as fast as a card with twice the shader power and pixel pipelines? How does SM3.0 allow a 6600GT to overcome having half the memory bandwidth and pixel pipelines of an X800XL?
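For concreteness, here is the raw-throughput gap being described, using the commonly reported reference specs for these two cards (board vendors vary, so treat the numbers as approximate):

```python
# Rough throughput comparison, X800XL vs 6600GT, from reference specs:
# X800XL: 16 pipelines @ 400 MHz core, 256-bit bus @ 490 MHz DDR
# 6600GT:  8 pipelines @ 500 MHz core, 128-bit bus @ 500 MHz DDR

def fill_rate_mpix(pipes, core_mhz):
    """Peak pixel fill rate in Mpixels/s."""
    return pipes * core_mhz

def bandwidth_gbs(bus_bits, mem_mhz):
    """Peak memory bandwidth in GB/s (DDR doubles the effective rate)."""
    return bus_bits / 8 * mem_mhz * 2 / 1000

x800xl_fill = fill_rate_mpix(16, 400)   # 6400 Mpix/s
gt6600_fill = fill_rate_mpix(8, 500)    # 4000 Mpix/s
x800xl_bw = bandwidth_gbs(256, 490)     # ~31.4 GB/s
gt6600_bw = bandwidth_gbs(128, 500)     # 16.0 GB/s

print(x800xl_fill / gt6600_fill)   # 1.6x the pixel fill rate
print(x800xl_bw / gt6600_bw)       # ~2x the memory bandwidth
```

So the X800XL has twice the pipelines but "only" about 1.6x the fill rate (the 6600GT's higher core clock closes some of the gap), and roughly double the memory bandwidth.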

BUT, if you let the x800 run at UBER graphics (only 2.0 though) and the 6800 do its uber graphics (on 3.0) i am thinking that you wouldn't see much of a performance difference between the cards, though, the 6800 might be a little slower (oh boy...it might run at 30fps, which is the STANDARD for console games) but in the end it would look alot nicer and you wouldnt see a huge hit (as compared to how much graphics it would add.) That is the basic concept towards Sm3.0...add eye candy, improve performance.

From what I've read, SM 3.0 is just essentially a way to do SM 2.0 smarter and faster, that's it. It's not like the difference between PS 1.x and 2.0 at all.

Also, SM 3.0 is not a 50-100% improvement over SM 2.0 in performance, again from what I have read. A more realistic number would be 20-30% faster.

You keep thinking that a 6600GT with SM 3.0 will be faster than an X800XL with nearly twice the shader power but only SM 2.0 - it's not going to happen.


Also, "Unreal Engine 3" performance isn't the best thing to compare things on because the friggin engine isn't even done yet! Everything we know is based on hearsay, conjecture and misinformation.

It wouldn't be logical for UE3 not to have any kind of backwards compatibility, and based on those comments from Mark Rein, it sounds like there will be just one way of coding games: the SM 3.0 path. That doesn't mean SM2.0 won't be able to do it, merely that SM2.0 will be slower (which is a given, since SM3.0 is a speed upgrade, not a features upgrade).

 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
in a way the SM3.0 IS a feature upgrade. it allows for longer shaders and can thus result in better image quality.

edit: well never mind i think i know what you're saying. 2 sm2.0 shaders can be made into one sm3.0 and the only advantage is speed.
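The "two SM2.0 shaders made into one SM3.0 shader" point can be sketched like this: with short instruction limits you split an effect across multiple rendering passes, each with its own full-screen blend cost; with longer shaders you do it all in one pass. A toy Python model (the "effects" are made up; this is not shader code):

```python
# Toy model: multi-pass (short-shader) vs single-pass (long-shader)
# rendering. Same final image; the single pass touches the
# framebuffer once per pixel instead of once per pass.

def diffuse(px):
    return px * 0.5

def specular(px):
    return px * px * 0.1

def render_multipass(pixels):
    """Two short 'shaders', one full-screen pass each, additively blended."""
    writes = 0
    fb = [0.0] * len(pixels)
    for effect in (diffuse, specular):      # one full pass per effect
        for i, px in enumerate(pixels):
            fb[i] += effect(px)             # blend into the framebuffer
            writes += 1
    return fb, writes

def render_singlepass(pixels):
    """One longer 'shader' computing both terms in a single pass."""
    writes = 0
    fb = [0.0] * len(pixels)
    for i, px in enumerate(pixels):
        fb[i] = diffuse(px) + specular(px)
        writes += 1
    return fb, writes

pixels = [0.1 * x for x in range(64)]
fb_multi, w_multi = render_multipass(pixels)
fb_single, w_single = render_singlepass(pixels)

print(fb_multi == fb_single)   # identical result...
print(w_multi, w_single)       # ...at half the framebuffer writes
```

That is the sense in which the advantage is speed rather than image quality: the single-pass version draws the same pixels with less memory traffic and setup overhead.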
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Um, no, SM3.0 allows at least 65,535 instruction slots (I forget the exact term lol), whereas PS2.0b tops out at 512 (and base PS2.0 at 96)...so yeah

REMEMBER!!! SM3.0 supports displacement mapping, and 2.0 doesn't! D-mapping will be used HEAVILY in next gen games! So 2.0 might try to look like it, and it will go slow, while 3.0 will have less of a problem making it look like that.

Displacement mapping: instead of making the object merely appear to have 3D bumps, as in bump mapping, d-mapping actually physically changes the geometry of the object it is applied to. What you get is an extremely detailed model without all the extra polys and modelling, but with real bumps. Not only that, the shadows show the d-mapping bumps as well.
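That distinction, in a toy Python sketch (a 1-D "mesh" and a made-up height map; in SM3.0 hardware this works via vertex texture fetch, but the geometric idea is the same):

```python
# Toy contrast between bump mapping and displacement mapping on a flat
# 1-D "surface": bump mapping only perturbs the normals used for
# lighting, while displacement mapping moves the actual vertices.

heightmap = [0.0, 0.2, 0.5, 0.2, 0.0, 0.3]   # per-vertex height samples

def bump_map(verts):
    """Positions untouched; detail is faked by tilting the normals."""
    normals = [h - 0.25 for h in heightmap]   # toy normal perturbation
    return list(verts), normals

def displacement_map(verts, scale=1.0):
    """Actually push each vertex along its normal (straight up here)."""
    return [v + scale * h for v, h in zip(verts, heightmap)]

flat = [0.0] * len(heightmap)                 # a perfectly flat mesh

bumped_verts, _ = bump_map(flat)
displaced_verts = displacement_map(flat)

print(bumped_verts == flat)      # True: silhouette and shadows stay flat
print(displaced_verts == flat)   # False: the geometry really changed
```

This is why displaced bumps show up in silhouettes and cast real shadows, while bump-mapped bumps vanish at grazing angles.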

Don't say 2.0 and 3.0 are the same, and that 3.0 is just more efficient with no eye candy...read up on it before you reply; that was the purpose of my first post.

Actually, Far Cry did have a change with 3.0. It rendered lights differently on objects, and I think it applied it to foliage. It made a slight change, but you won't notice the real 2.0-to-3.0 difference unless the game uses all of the 3.0 features.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
The Ti 4200 was a mid-range DX8.1 card and the FX5200 wasn't; it was bottom/lower-middle of the range, and rumour says it ran DX9 games in DX8.1 mode, as it was a POS, same as the rest of the FXes. Hence I never got a 5800/5900 or 5950, and thank god I didn't. :)
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Don't involve the FX cards in this discussion...I think Nvidia learned from that ;)

And just to let you know, if you get a program to tell HL2 that it is running off a 9800 (when you actually have a 5900), it will instantly play the game in DX9 with no problems.

I think the FX cards were just misunderstood, as they can play DX9 games.

But that is off topic; don't start talking about the FX cards, please.
 

pyrosity

Member
Dec 20, 2004
42
0
0
Originally posted by: hans030390
Remember that the x800xl only has Sm2.0...so performance of today's games might be better, but SM3.0 DOES look better and WILL be needed for next gen games (unless you prefer the crappy old style graphics of 2005 lol)

Sm3.0 has displacement mapping. read my first post to know what it is. SM2.0 doesnt have that, and next gen games rely HEAVILY on it.

The only difference between 2.0 and 3.0 is instruction lengths allowed and other shader programming differences. None of this results in new features. Everything that 3.0 does, 2.0 can do as well. That's including displacement mapping. I believe that Epic claims d-mapping as "3.0 exclusive" because they couldn't have pulled it off if they didn't have the speed boost of 3.0. The lighting differences shown in Far Cry 1.3 are done by HDR which you know is supported in PS 2.0. Every visual effect and improvement made is not done by switching/changing some shader language; it is done by adding different effects that come from the 2.0 era.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: pyrosity
Originally posted by: hans030390
Remember that the x800xl only has Sm2.0...so performance of today's games might be better, but SM3.0 DOES look better and WILL be needed for next gen games (unless you prefer the crappy old style graphics of 2005 lol)

Sm3.0 has displacement mapping. read my first post to know what it is. SM2.0 doesnt have that, and next gen games rely HEAVILY on it.

The only difference between 2.0 and 3.0 is instruction lengths allowed and other shader programming differences. None of this results in new features. Everything that 3.0 does, 2.0 can do as well. That's including displacement mapping. I believe that Epic claims d-mapping as "3.0 exclusive" because they couldn't have pulled it off if they didn't have the speed boost of 3.0. The lighting differences shown in Far Cry 1.3 are done by HDR which you know is supported in PS 2.0. Every visual effect and improvement made is not done by switching/changing some shader language; it is done by adding different effects that come from the 2.0 era.

This is so misinformed.

1. PS3.0 offers great efficiency and will offer much higher performance than PS2.0

2. Effects will be easier to implement, and those same effects less stressing on the hardware through PS3.0

3. PS3.0 offers full precision (32bit floating point), while 2.0 does not (24bit). Anyone who has studied Nvidia's new pixel shaders will know that it has PLENTY of power to use this as well.

4. VS (vertex shader) 3.0 will bring far greater image quality than VS2.0, and that is where we'll see the biggest IQ improvements. This feature finally brings us the displacement mapping that DX9 originally promised.
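Point 3 can be made concrete by simulating the mantissa widths involved (FP16 has a 10-bit mantissa, ATI's FP24 a 16-bit one, FP32 a 23-bit one). A rough Python sketch of the resulting rounding error; real shader hardware rounding details differ:

```python
import math

# Quantize a float to a reduced-mantissa format by rounding the
# significand (exponent range is ignored for this sketch).

def quantize(x, mantissa_bits):
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)                    # x = m * 2**e, 0.5 <= |m| < 1
    scale = 2.0 ** (mantissa_bits + 1)      # +1 for the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)

value = math.pi
err16 = abs(quantize(value, 10) - value)    # FP16-ish (s10e5)
err24 = abs(quantize(value, 16) - value)    # FP24-ish (s16e7, SM2.0 minimum)
err32 = abs(quantize(value, 23) - value)    # FP32-ish (s23e8, SM3.0 full)

print(err16 > err24 > err32)   # wider mantissa, smaller rounding error
```

The error steps down by roughly a factor of 2 per extra mantissa bit, which is why long shader chains are where partial precision visibly banded on earlier hardware.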

Simply put, Shader Model 3.0 is a very big deal and will remain the standard until 2006, when Longhorn and DX10 are released.
I should also note that, as I've stated in the past, Nvidia moving to FP32 with the NV30 was a wise move.
The move to PS3.0 was much easier from an engineering standpoint, and their two years of experience working with FP32 hardware will likely give them huge performance gains over ATI's PS/VS3.0 hardware whenever that gets here.

Shader model 3.0 opinions from the devs
http://www.gamers-depot.com/interviews/dx9b/001.htm
Shader model 3.0 information (also former shader comparisons)
http://www.elitebastards.com/page.php?pageid=4136&head=1&comments=1
MS and NV SM3 info
http://www.microsoft.com/whdc/winhec/partners/shadermodel30_NVIDIA.mspx
 

BrokenVisage

Lifer
Jan 29, 2005
24,771
14
81
...simply NOT worth $100. I don't care what opinions, benches, or comparisons you throw at me; you simply can't justify the premium you pay for a slightly-better-than-2.0++ pixel shader, especially at this time. Every game I play looks beautiful and gives me optimal FPS with high settings up the ying-yang. Why should I go and buy a card with a triple-digit price difference and SM3.0 when the gain over my SM2.0++ compatible card would barely be noticeable, if not invisible?