
Why go with SM3.0 today?


Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: Rollo
2. For the people who actually bought X800s it's sort of a moot point. You don't get to see this stuff, arguing about whether you need it is only rationalization of your purchase.

Originally posted by: Creig
4. For those who bought nV40s (especially SLI), arguing about whether you need it is only rationalization of your purchase.

I'm seeing both sides of the coin here fellas. I'm not sure that there can be anything added to this "conversation" that would be constructive after this.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Oh, it stopped being constructive about 250 posts ago. I'm simply trying to provide a counterpoint to some of the endless rhetoric and misinformation I keep reading here. Hopefully any newcomer who reads these threads will be able to see "both sides of the coin" and make a purchasing decision based on the strengths and weaknesses of both companies' product lines.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Guys, don't start quoting me on what I heard, I was just saying something. I don't find it hard to believe that the UE3 engine uses only SM3...otherwise, SM2.0 (if put in) would either 1. look "bad" but still run fast, or 2. try to look like SM3 and run slow.

With UE3 they are heavily using SM3.0's features, so you can't judge how it will do based on SC:CT or Far Cry in SM3.0.

Just to be random...there is a Japanese arcade game where you stick a finger up someone's rear end...rather funny...

I have an idea...this thread ends, and when next-gen games come out, maybe we can all discuss it, as then we will really know all about SM3...bye bye...
 

imported_Reverend

Junior Member
Apr 26, 2005
17
0
0
Originally posted by: hans030390
Guys, don't start quoting me on what I heard, I was just saying something. I don't find it hard to believe that the UE3 engine uses only SM3...otherwise, SM2.0 (if put in) would either 1. look "bad" but still run fast, or 2. try to look like SM3 and run slow.

With UE3 they are heavily using SM3.0's features, so you can't judge how it will do based on SC:CT or Far Cry in SM3.0.
It would be almost impossible -- there are licensees already developing games using UE3 and they are NOT SM3.0-only!

 

imported_Reverend

Junior Member
Apr 26, 2005
17
0
0
Originally posted by: Pete
Doom 3 was based around the capabilities of a GeForce. The GF came out early 2000. D3 came out late 2004. It offers similar IQ on a GF as on a GF6,
Really (i.e. similar IQ) ?

The calculations in ARB_fragment_program are all done in floating point, and general dependent texture reads are used.
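[Editor's note: a "dependent texture read" means the result of one texture fetch is used as the coordinate for a second fetch, so the address can't be known before the shader runs. Here is a toy sketch of the idea in plain Python; this is not real ARB_fragment_program code, and the texture contents are made up for illustration.]

```python
# Toy model of a dependent texture read. Textures are 1-D lists sampled
# with a normalized coordinate in [0, 1]; all math is floating point,
# mirroring the ARB_fragment_program point above.

indirection = [0.75, 0.25, 0.50, 0.00]  # first texture: stores coordinates
colors      = [0.1, 0.4, 0.7, 1.0]      # second texture: stores colors

def fetch(texture, coord):
    """Nearest-neighbour sample at a normalized coordinate."""
    i = min(int(coord * len(texture)), len(texture) - 1)
    return texture[i]

def dependent_read(coord):
    # The second fetch's address depends on the first fetch's *result*,
    # which is what makes the read "dependent".
    c = fetch(indirection, coord)
    return fetch(colors, c)
```

A fixed-function or register-combiner pipeline (GF1-era) can't chain fetches this freely, which is one concrete reason the same game can't produce identical shading on a GF and a GF6.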
 

imported_Reverend

Junior Member
Apr 26, 2005
17
0
0
Originally posted by: Gstanfor
Anti-Aliasing and Anisotropic filtering are strictly optional features. They are not required in order for 3D applications to run. They serve only to enhance IQ.

This is part of the reason why developers are currently enabling SM3.0 in games despite there being no Anti-Aliasing - developers want features and programmability, not IQ enhancement.

Wow... really ? You're sure about that?

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: hans030390
Guys, don't start quoting me on what I heard, I was just saying something. I don't find it hard to believe that the UE3 engine uses only SM3...otherwise, SM2.0 (if put in) would either 1. look "bad" but still run fast, or 2. try to look like SM3 and run slow.

Where are you getting that it would look "bad" from? Does Half-Life 2 look bad simply because it doesn't use SM3.0? No. Does Far Cry look "bad" using SM2.0? No. As has been said repeatedly, SM3.0 isn't really about added eyecandy but about ease of programming and increased performance. I don't know why people keep saying SM3.0 will necessarily look better than SM2.0.

From everything I've read, the biggest advantage of SM3.0 is that it allows looping and branching of shader programs while SM2.0 doesn't. SM2.0 can do nearly everything SM3.0 can do, it's just that longer shader programs may need to be broken up for SM2.0 to run them without incurring a performance hit. But it can still be done with SM2.0, so there's no reason SM2.0 can't look just as nice as SM3.0.

Barring "soft shadows", of course. :roll: Reminds me of "shiny pipes".
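[Editor's note: the looping/branching point above can be sketched with a rough analogy. This is plain Python, not real HLSL/GLSL, and the light values are invented; it only illustrates why an SM2.0-class shader with no dynamic flow control needs a separate fixed-length variant per case, while an SM3.0-class shader can loop and branch at run time.]

```python
# One SM3.0-style shader: a dynamic loop with a run-time branch.
def shade_sm3(pixel, lights):
    color = 0.0
    for light in lights:      # loop bound known only at run time
        if light <= 0.0:      # dynamic branch: skip dead lights
            continue
        color += pixel * light
    return color

# SM2.0-style: no dynamic flow control, so emit one fixed-length
# "unrolled" shader per light count and pick the right one per draw call.
def make_sm2_variant(n):
    def shader(pixel, lights):
        color = 0.0
        for i in range(n):    # stands in for n copies of the same code
            color += pixel * max(lights[i], 0.0)
        return color
    return shader

lights = [0.5, 0.0, 0.25]
sm2 = make_sm2_variant(len(lights))           # chosen on the CPU
assert shade_sm3(1.0, lights) == sm2(1.0, lights)  # same final image
```

Both paths compute the same pixel, which matches the claim above: the difference is convenience and performance (one shader vs. many variants or extra passes), not what can ultimately be drawn.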
 

imported_Reverend

Junior Member
Apr 26, 2005
17
0
0
Originally posted by: Creig
As has been said repeatedly, SM3.0 isn't really about added eyecandy but about ease of programming and increased performance.

I don't know why people keep saying SM3.0 will necessarily look better than SM2.0.

I decided to break these two comments of yours into separate quotes.

You rarely see a slow car being made to look like a true sports car. But if that slow car gets an engine upgrade, it is possible that it may be made to look better compared to its predecessor. Do you know why?

Amazing stuff in this forum. Makes me open my eyes.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
I'm sorry, but I try to deal in facts, figures and correct terminology instead of making comparisons that may or may not directly apply to the situation being described. I am not an expert on this subject by any means, but I do try to make sure that what I post is, to the best of my knowledge, correct.

If any of my previous statements were in error, please spell out which portion is incorrect and why. Then I can avoid spreading misinformation in the future.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Creig
Originally posted by: Rollo
For most people, this argument is irrelevant.

1. Most people still own hardware that's too old/slow to wonder about SM2b vs SM3, or soft shadows and HDR.

2. For the people who actually bought X800s it's sort of a moot point. You don't get to see this stuff, arguing about whether you need it is only rationalization of your purchase.

3. For those who bought nV40s (preferably SLI), you get to see all the games as the developers hoped, so all you have to discuss is whether you personally prefer the features on or off. In any case, you're set.

4. For those who bought nV40s (especially SLI), arguing about whether you need it is only rationalization of your purchase.

Don't be a baby.

Creig, we aren't the ones downplaying anything. We're saying SM3 is something that everyone should be "going with" today. No reason NOT to.
Besides the constant DOUBTS about the future performance of a mainly performance-enhancing feature-set!

For Radeon users, there is this big question: "DO I GO SM3 OR DO I NOT? IS IT TOO EARLY FOR ADVANCED TECHNOLOGY???"

Geforce6 users don't have to question anything, we can match or beat any Radeon product in performance and have SM3, a part of DX9C, regardless of whether it's Microsoft's biggest failure EVAH (according to you).

This is mainly a thread on SM3, which turned into a rationalization for ATI fanboys to downplay SM3/DX9C (as usual) with their best efforts possible...which boiled down again and again to saving $50 on a Radeon instead of going with the latest standards.

Don't have that beef lo mein tonight and skip the latest McBain action flic, and you too shall have Shader Model 3! :D
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: housecat
Don't be a baby.

Creig, we aren't the ones downplaying anything. We're saying SM3 is something that everyone should be "going with" today. No reason NOT to.
Besides the constant DOUBTS about the future performance of a mainly performance-enhancing feature-set!

For Radeon users, there is this big question: "DO I GO SM3 OR DO I NOT? IS IT TOO EARLY FOR ADVANCED TECHNOLOGY???"

Geforce6 users don't have to question anything, we can match or beat any Radeon product in performance and have SM3, a part of DX9C, regardless of whether it's Microsoft's biggest failure EVAH (according to you).

This is mainly a thread on SM3, which turned into a rationalization for ATI fanboys to downplay SM3/DX9C (as usual) with their best efforts possible...which boiled down again and again to saving $50 on a Radeon instead of going with the latest standards.



Grow up.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
Originally posted by: Rollo
For most people, this argument is irrelevant.

1. Most people still own hardware that's too old/slow to wonder about SM2b vs SM3, or soft shadows and HDR.

2. For the people who actually bought X800s it's sort of a moot point. You don't get to see this stuff, arguing about whether you need it is only rationalization of your purchase.

3. For those who bought nV40s (preferably SLI), you get to see all the games as the developers hoped, so all you have to discuss is whether you personally prefer the features on or off. In any case, you're set.

4. For those who bought nV40s (especially SLI), arguing about whether you need it is only rationalization of your purchase.


Creig- you know me ol' buddy! I don't need to rationalize my purchases, my wanting to make them has always been reason enough for me?


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Megatomic
Originally posted by: Rollo
2. For the people who actually bought X800s it's sort of a moot point. You don't get to see this stuff, arguing about whether you need it is only rationalization of your purchase.

Originally posted by: Creig
4. For those who bought nV40s (especially SLI), arguing about whether you need it is only rationalization of your purchase.

I'm seeing both sides of the coin here fellas. I'm not sure that there can be anything added to this "conversation" that would be constructive after this.

QFT

The only people who should care about this are the people who are shopping for a card.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo
Creig- you know me ol' buddy! I don't need to rationalize my purchases, my wanting to make them has always been reason enough for me?

Good for you. So what makes you think that anybody who purchased an X800 is any different?

 

doublejbass

Banned
May 30, 2004
258
0
0
Originally posted by: housecat


Don't be a baby.

This coming from the bastion of maturity.

Creig, we aren't the ones downplaying anything. We're saying SM3 is something that everyone should be "going with" today. No reason NOT to.
Besides the constant DOUBTS about the future performance of a mainly performance-enhancing feature-set!

How about COST, moron?

For Radeon users, there is this big question: "DO I GO SM3 OR DO I NOT? IS IT TOO EARLY FOR ADVANCED TECHNOLOGY???"

How about "How much will this extra advanced technology do for me, and how much is it going to cost me?"

Geforce6 users don't have to question anything, we can match or beat any Radeon product in performance

Can't the 6800U only beat the X850XT/PE if it's in SLI? And isn't it then costing almost twice as much? (Lowest PCI-E 6800U on Pricegrabber = 498.00) 498.00 x 2 + SLI-worthy PSU extra cost + SLI mobo extra cost + added noise vs. Lowest PCI-E x850 XT/PE = $499.00. WE WIN AT MORE THAN TWICE THE COST! HOORAY!

This is mainly a thread on SM3, which turned into a rationalization for ATI fanboys to downplay SM3/DX9C (as usual) with their best efforts possible...which boiled down again and again to saving $50 on a Radeon instead of going with the latest standards.

Once again, you're full of shit. Lowest PCI-E X800XL on Pricegrabber = $250. (Connect3D) Lowest PCI-E 6800GT on Pricegrabber = $340. (XFX) So, no, not $50, $90. Now, since XFX is a garbage brand, let's say you want the best workmanship with an excellent warranty, so, BFG and BBATI. Lowest BBATI PCI-E X800XL = $299, lowest PCI-E BFG 6800GT = $425. Now we're talking $126.

Don't have that beef lo mein tonight and skip the latest McBain action flic, and you too shall have Shader Model 3! :D

$50 on lo mein and a movie? (Let alone $90 or $125) It's very simple, we make VALUE JUDGEMENTS, and if the potential benefits of SM3 are worth $50, $90, or $125 to you, then by all means, enjoy. To me, there's no way in hell, for the same reason I use an X800XL instead of an X850 XT/PE.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
OT

How come people don't get blasted for buying X850XT PE's instead of regular X850XT's? Or 6800U's instead of GT's? For example the X850XT PE PCIe costs over $100 more than the regular X850XT and what do you get - 20Mhz on the GPU and 50Mhz on the mem and nothing tangible in games. Nearly every GT is guaranteed to overclock to an Ultra.

In fact, SLI has better bang for the buck than both the X850XT PE and the 6800U. So I think owners of those cards should receive a lot more criticism than SLI owners. Regarding cost, if somebody is able to blow $800 on two graphics cards to get the best performance possible, how is that anybody else's business? If you don't pay their rent or buy their food, just shut up and be jealous ;)
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Gstanfor
Anti-Aliasing and Anisotropic filtering are strictly optional features. They are not required in order for 3D applications to run. They serve only to enhance IQ.

This is part of the reason why developers are currently enabling SM3.0 in games despite there being no Anti-Aliasing - developers want features and programmability, not IQ enhancement.
HDR and soft shadows seem like optional features to me, at this point. I didn't realize FC, SC:CT, and Riddick were unplayable without them.

We're talking about games in a video forum, Greg. Basically every discussion here is about IQ.

Originally posted by: Reverend
Originally posted by: Pete
Doom 3 was based around the capabilities of a GeForce. The GF came out early 2000. D3 came out late 2004. It offers similar IQ on a GF as on a GF6,
Really (i.e. similar IQ) ?

The calculations in ARB_fragment_program are all done in floating point, and general dependent texture reads are used.
Not even similar? The linked HOCP GF4MX pics don't look that bad. I couldn't find a good screenshot comparison with a few quick searches.

Ah, here we go. Not much of a difference (in a static screenshot, granted), though no bad guys casting shadows to be found.

Um, right. Big difference. GG, memory. But this is dependent on speed more than capabilities, right?
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: Creig
Originally posted by: hans030390
Guys, don't start quoting me on what I heard, I was just saying something. I don't find it hard to believe that the UE3 engine uses only SM3...otherwise, SM2.0 (if put in) would either 1. look "bad" but still run fast, or 2. try to look like SM3 and run slow.

Where are you getting that it would look "bad" from? Does Half-Life 2 look bad simply because it doesn't use SM3.0? No. Does Far Cry look "bad" using SM2.0? No. As has been said repeatedly, SM3.0 isn't really about added eyecandy but about ease of programming and increased performance. I don't know why people keep saying SM3.0 will necessarily look better than SM2.0.

From everything I've read, the biggest advantage of SM3.0 is that it allows looping and branching of shader programs while SM2.0 doesn't. SM2.0 can do nearly everything SM3.0 can do, it's just that longer shader programs may need to be broken up for SM2.0 to run them without incurring a performance hit. But it can still be done with SM2.0, so there's no reason SM2.0 can't look just as nice as SM3.0.

Barring "soft shadows", of course. :roll: Reminds me of "shiny pipes".

HAHAHAH, that whole "I don't know why people keep saying SM3.0 will necessarily look better than SM2.0" thing is a classic. You obviously know little about SM3...have you heard of a thing called displacement mapping? It's ONLY for SM3.0...read up on it.

Most of you need to read up on SM3.0...your posts are idiotic and have no logic behind them...Some do, however, so I do not mean all of you. Next time I'll just put out links, and if people don't get it then, that's pretty bad.

And as for UE3 running in SM2.0, I don't doubt that it will, but there are many features used in UE3 that can only come from SM3.0, so SM2.0 will look worse...OK, so it MIGHT look like HL2/Far Cry, which isn't bad, but WHY would you want to play next-gen games with last-gen graphics? I wouldn't want to play HL2 if it looked like a game from 2000.

SM3.0 is more than just "a slow car with a new engine". It is literally a slow car with that new engine, and also some very nice body kits on it.

Well i'm off to go do something worth my time.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Pete
Originally posted by: Gstanfor
Anti-Aliasing and Anisotropic filtering are strictly optional features. They are not required in order for 3D applications to run. They serve only to enhance IQ.

This is part of the reason why developers are currently enabling SM3.0 in games despite there being no Anti-Aliasing - developers want features and programmability, not IQ enhancement.
HDR and soft shadows seem like optional features to me, at this point. I didn't realize FC, SC:CT, and Riddick were unplayable without them.

We're talking about games in a video forum, Greg. Basically every discussion here is about IQ.

Originally posted by: Reverend
Originally posted by: Pete
Doom 3 was based around the capabilities of a GeForce. The GF came out early 2000. D3 came out late 2004. It offers similar IQ on a GF as on a GF6,
Really (i.e. similar IQ) ?

The calculations in ARB_fragment_program are all done in floating point, and general dependent texture reads are used.
Not even similar? The linked HOCP GF4MX pics don't look that bad. I couldn't find a good screenshot comparison with a few quick searches.

Ah, here we go. Not much of a difference (in a static screenshot, granted), though no bad guys casting shadows to be found.

Um, right. Big difference. GG, memory. But this is dependent on speed more than capabilities, right?

Errr, what the hell are you on about??? What do HDR & Soft Shadows have to do with AA & AF???? Have you checked out the soft shadowing demo link I provided? No AA & AF there...

Developers currently see both of the above features as desirable features that they want to exploit in their games.
 

user1234

Banned
Jul 11, 2004
2,428
0
0
Originally posted by: Creig
Originally posted by: Rollo
Creig- you know me ol' buddy! I don't need to rationalize my purchases, my wanting to make them has always been reason enough for me?

Good for you. So what makes you think that anybody who purchased an X800 is any different?


LOL

pwned
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Anti-Aliasing and Anisotropic filtering are strictly optional features
I'd consider 16xAF as standard since even the original Radeon basically did it for free. The concept of running a 3D game without AF simply doesn't exist for me.

As for AA, high resolution does reduce its need somewhat but there's still a big difference between the likes of 1920x1440 with no AA and 4xAA.

This is part of the reason why developers are currently enabling SM3.0 in games despite there being no Anti-Aliasing - developers want features and programmability, not IQ enhancement.
But isn't your whole reason for pointing out HDR because it does exactly that - increases IQ? It certainly doesn't increase performance.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
I'm not reading all the arguing back and forth, but here's my viewpoint that you can do with as you please:

The argument at this point should not be whether SM3.0 is the way to go or not. We have seen that developers are developing for SM3.0. Eventually SM3.0 will be in as extensive use as SM2.0 is now.

I think the argument should be something more along the lines of:
When games are actually using significant SM3.0 will any of the hardware we have now matter?

I'm thinking back to SM2.0 and DX9. nVidia had the FX series that performed fine on initial DX9 titles that had some, but not extensive, use of DX9 features. Games progressively added more DX9, and the FX series ultimately proved to have inadequate DX9 performance, as we've all seen in current-generation games.

Are they doing the same thing with SM3.0?

I don't know. But from what I'm seeing, I think it may be so. I think nVidia likes to have a long list of features they can advertise, and SM3.0 may be a fairly superficial implementation in the 6xxx series, much as DX9 was a fairly superficial implementation in the 5xxx series (especially the 5200, that card was a total joke). Note my use of the word 'may'. I am speculating and not accusing.

The argument only exists because nVidia cards are going for a significant premium over ATi cards that lack the feature. If the 6800GTs were to come down in price to where the x800XL is, the point would be moot, because everyone would be buying 6800GTs. They aren't, and we can only speculate as to why the price is staying high. Is it because nVidia sees SM3.0 as a premium feature worth a premium price? Possibly. It could also be that their cards simply cost more to produce and that ATi really does get a significant price advantage from the smaller die of the x800XL.

My feeling is that there is enough doubt involved that the CURRENT SM3.0 cards available are not worth buying exclusively because of their SM3.0 support. Worth considering, yes, for performance and value, but exclusively because of SM3.0 support? I don't think so. I can't trust nVidia that much after getting burned on the 5xxx series. If you're buying PCI-E you pay a VERY steep premium for SM3.0 support given the performance of the x800XL. I'm willing to bet that by the time SM3.0 is REALLY in primetime, you'll be able to sell off an x800XL for a reasonable price, and the gap in used prices between an x800XL and a 6800GT will be smaller than the gap in new prices today, so you would come out ahead anyway.

AGP is another animal. The 6800 is cheaper for AGP and the x800XL is more expensive. The gap is narrower and the decision is more of a toss up.

I am considering upgrading to a Venice/939, and if I do I will be going PCI-E. The PCI-E card that makes the most sense to me is the x800XL. If I were on a significant budget I'd get the 6600 (non-GT). I don't favor brands, only value.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
hans and Greg, please read (and preferably understand) my posts before calling people idiots or clueless. With that, I'm done posting in this thread, as it's just not worth arguing with (IMO) illogical sophists. I'm just going to stop in to see if Rev got back to me re: Doom 3 (am I slightly or totally wrong?).
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: BFG10K
Anti-Aliasing and Anisotropic filtering are strictly optional features
I'd consider 16xAF as standard since even the original Radeon basically did it for free. The concept of running a 3D game without AF simply doesn't exist for me.

As for AA, high resolution does reduce its need somewhat but there's still a big difference between the likes of 1920x1440 with no AA and 4xAA.

This is part of the reason why developers are currently enabling SM3.0 in games despite there being no Anti-Aliasing - developers want features and programmability, not IQ enhancement.
But isn't your whole reason for pointing out HDR because it does exactly that - increases IQ? It certainly doesn't increase performance.

It is a developer-chosen enhancement, chosen because it has a specific effect the developer wanted to achieve. You can't simply flip a driver switch and enable HDR for every game. It's not even desirable for a lot of games and needs to be used intelligently where it is employed, or it ends up not looking correct, such as indoors in Far Cry.

Traditionally, developers don't care that much about performance - they generally leave it up to users to purchase a card powerful enough to run the game with the IQ desired. Developers only care that cards have the required features to properly run the game.