The Ultimate SM3.0 Game Thread


VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: BFG10K
Don't think ATI made the SM 2.0b to compete with SM 3.0 in Far Cry.
It was already in the hardware on R4xx yet there was no mention of it until FC's SM 3.0 path came along. It's strange that ATi would "forget" something as significant as that, don't you think?
Correct me if I'm wrong, but wasn't the "hype engine" for HL2 well under way before Far Cry came out of nearly nowhere and was released? Wasn't there already some discussion that HL2 would "perform better" on ATI cards and "take advantage of ATI's special features"?

As far as their SM2.0b additional support over and above the baseline SM2.0 stuff goes - what all is there? The issue of allowing more shader opcodes/registers above the baseline MS SM2.0 implementation requirements was well described in ATI's developer docs. The F-Buffer stuff... well, I haven't seen any real demos that use it yet, so file that feature under "vaporware" or the "land of the lost GPU features" for now. (Ok, Ben, I'm admitting it.)

Geometry instancing is more of a driver-level feature AFAIK, and once there actually was a game on the market that took advantage of it, ATI enabled the feature in their drivers and gave CryTek a way to use it. The reason they can't "advertise" support for that feature is due to MS's specs on driver capability enumeration - there simply isn't a way to report it through "standard channels" under SM2.0. So ATI can't market it as an SM2.0 feature, nor can their drivers enumerate themselves as supporting the SM3.0 spec, as I understand it.
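To make the enumeration point concrete, here's a rough D3D9-style sketch of how an application ends up detecting instancing. The VS 3.0 caps check is the "standard channel"; the FOURCC probe is the workaround ATI documented for exposing instancing on its SM2.0b parts. This is written from memory as an illustration, so treat the exact details as approximate rather than authoritative:

// Hedged sketch: detecting geometry instancing support under D3D9.
// The FOURCC probe follows ATI's documented workaround for R4xx parts;
// treat the specifics as illustrative.
#include <d3d9.h>

bool SupportsInstancing(IDirect3D9* d3d, IDirect3DDevice9* dev,
                        D3DFORMAT adapterFmt)
{
    // "Standard channel": instancing is only guaranteed when the driver
    // reports vertex shader 3.0 - there is no SM2.0 cap bit for it.
    D3DCAPS9 caps;
    if (SUCCEEDED(dev->GetDeviceCaps(&caps)) &&
        caps.VertexShaderVersion >= D3DVS_VERSION(3, 0))
        return true;

    // ATI's back door: probe a magic FOURCC "format". If the driver says
    // it exists, instancing can be enabled on SM2.0b-class hardware.
    const D3DFORMAT instFourCC = (D3DFORMAT)MAKEFOURCC('I', 'N', 'S', 'T');
    return SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                            D3DDEVTYPE_HAL, adapterFmt, 0,
                                            D3DRTYPE_SURFACE, instFourCC));
}

// Once support is confirmed, the draw setup is the same either way:
// stream 0 carries the mesh, stream 1 carries per-instance data.
void SetupInstancedStreams(IDirect3DDevice9* dev, UINT numInstances)
{
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
}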
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: BFG10K
I've done so. Despite your flawed arguments I think people understand that SM 3.0 doesn't need to provide extra IQ or eclipse the Radeon's performance to be useful. Free potential loss of performance is always a good thing.
Fixed. :)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: munky
I'm drawing the conclusion that we (or I) don't know enough to make an informed decision. People who bought current-gen nvidia cards must think it's important to have SM3, but how the cards will run SM3 code in future games is yet to be determined.
Finally! Some sense!

That's part of the problem, I think: we simply don't have enough real-world data on SM3.0 usage (and thus performance) yet, and so we fill that void with opinions - and in this forum, those seem to be mostly fueled by brand preferences.

Originally posted by: munky
I personally don't have an sm3 card, and I'm not worried about it because sm3 is not important to me. (I'm too busy playing GT4 anyways, lol).
Again, amazing, some sense! Sure you didn't fall into this forum by mistake? :)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: apoppin
The only thing I can say to ease your concerns, VirtualLarry, is to say SM 3.0 will be implemented gradually, just as 2.0 was and 1.4 before it. Game designers know better than to bring out a game with gfx so advanced that no one can play it. ;)

But see, that's the thing. The real value of the SM3.0 spec is that it's presented as a sort of "new universal baseline standard." The idea is that coding for SM3.0 makes the devs' lives easier and lets them get their jobs done faster. If the devs still have to pander to prior specs, and to the whole ATI=24-bit FP precision vs. NV=16/32-bit thing, then nearly the entire value of SM3.0 (to the devs) is lost. Remember, there's nothing (AFAIK) inherent in SM3.0 that cannot be done in SM2.0, in terms of IQ or the eventual final effect - just a (potentially) faster or easier way for the devs to get there. IOW, full, true SM3.0 support is nearly an "SM3.0 or bust" thing.

Now, I don't see game devs ignoring the entire installed base of hardware out there, at least for the near-future crop of games, so I think they will adopt an "SM2.0++" approach: essentially SM2.0, but possibly using branching shader code (SM3.0 required) in certain places where it lets them do things like collapse multiple rendering passes. That would seem to be the most prudent and smartest thing to do. But it also implies that those games will not be using "SM3.0 from the ground up," just SM2.0 with some SM3.0 thrown in - IOW, they wouldn't really be considered "true" SM3.0 games. Once that happens, I feel they will use SM3.0 as the baseline standard and not look back. I see that happening in around 1 - 1.5 years, possibly slightly sooner, depending on how fast the installed base upgrades. The next-gen mid-range parts from both ATI and NV, and their availability and pricing, should be a big factor in the game devs' direction.
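To illustrate the pass-collapsing point: under SM2.0 a renderer typically redraws the geometry once per light and sums the results with additive blending, whereas a ps_3_0 shader can loop over the whole light list in a single pass. A rough D3D9-flavored sketch - the shader handles (g_psPerLight, g_psAllLights) and DrawScene are made-up names for the example:

// Hedged sketch of "one pass per light" (SM2.0 style) vs. "one pass total"
// (SM3.0 style) in D3D9. The extern names are hypothetical.
#include <d3d9.h>

extern IDirect3DPixelShader9* g_psPerLight;   // ps_2_0: shades ONE light
extern IDirect3DPixelShader9* g_psAllLights;  // ps_3_0: loops over lights
extern void DrawScene(IDirect3DDevice9* dev); // issues the scene's draw calls

// SM2.0-style path: one full geometry pass per light, accumulated in the
// framebuffer with additive blending.
void RenderLitSM2(IDirect3DDevice9* dev, const float (*lights)[4], int count)
{
    dev->SetPixelShader(g_psPerLight);
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    dev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE);
    dev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
    for (int i = 0; i < count; ++i) {
        dev->SetPixelShaderConstantF(0, lights[i], 1); // this pass's light
        DrawScene(dev);                                // redraw everything
    }
}

// SM3.0-style path: the ps_3_0 shader iterates over the light array with a
// dynamic loop, so the geometry only has to be drawn once.
void RenderLitSM3(IDirect3DDevice9* dev, const float (*lights)[4], int count)
{
    dev->SetPixelShader(g_psAllLights);
    dev->SetPixelShaderConstantF(0, lights[0], count); // whole light array
    const float numLights[4] = { (float)count, 0, 0, 0 };
    dev->SetPixelShaderConstantF(31, numLights, 1);    // loop bound
    DrawScene(dev);
}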


Originally posted by: apoppin
SM 3.0 is supposed to make shaders more efficient, not less efficient. Games are typically 1-1/2 to 3 years behind development. We just got SM 3.0 in the 6800 series . . . . we see that we already have an impressive number of titles and the card is only ONE year old . . . . now ADD ATI's support for SM 3.0 and I'd say 'case closed' for 3.0 being quickly adopted.
Most of the "efficiency" of SM3.0, is in terms of developers, not in terms of the low-level hardware execution. It makes it easier for the devs to write the code in the first place. They just leave it up to the hardware guys to "make it faster" in many cases, and that will likely require newer hardware than is currently out. The biggest, IMHO, advantage to ATI adopting SM3.0, is: 1) the devs no longer have to worry about the 16/24/32-bit FP thing, just code for 32-bit FP on all hardware, and 2) secondarily, ATI can "properly" expose their Geometry-instancing support in their DirectX drivers now.

Originally posted by: apoppin
the final 'kicker' to my case is that it does not "hurt" if your "gamble" is wrong and you have a slow or useless SM 3.0 feature . . . it does not affect the rest of the videocard in any way.
For the most part that's true, speaking of performance only... but you do pay extra up front for the additional cost, so the gamble isn't entirely "free." It's kind of like buying an S939 mobo instead of an S754 one because of rumors that AMD would introduce dual-core S939 CPUs in the future that would run on them. It could well be that, to get "proper" performance from a dual-core chip, you would need to move to DDR2 memory or something; so while it would "run" on your existing board, it wouldn't deliver the level of performance that one would ordinarily expect. I hypothesize that current SM3.0-capable hardware is much the same way.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Bar81
Because many of you obviously failed English, I'm going to go over this AGAIN:


This thread is simply trying to determine whether SM3.0 is a factor that should be legitimately considered when making a purchasing decision on this generation of cards. To wit, does having SM3.0 confer any sort of tangible advantage such that one could say that having an SM3.0 card is a "must" over a card without the feature? To determine that, I am gathering a list of SM3.0-enabled games currently available and forthcoming in the near future, to allow people to answer the question on their own.

For those with agendas: two cards, same price, same performance, one is SM2.0b, the other is SM3.0. Whether you know anything about SM2.0b and SM3.0 or not, why in the land of F**k would you buy the SM2.0b card?

That's like having two sport utility vehicles in front of you and you need to purchase one of them. They are identical for the most part, except one has 2-wheel drive and the other has all-wheel drive. You tell the car salesman that you will take the two-wheel-drive SUV because it's not snowing TODAY. The snow will fall eventually, Bar.

This is so simplistic it's child's play.

 

housecat

Banned
Oct 20, 2004
1,426
0
0
that's exactly what i said in the other thread, keys.. but i got dumb stares from the crowd.

when you say something as logical and purely true as that.. the agenda-carrying ones scatter.




btw did ppl read this entire article? link seriously, it doesn't say it supports SM3 yet. and i'm a huge CSS fan, I read every single update and there has been no SM3 update.
sometimes i wonder if people who choose to ARGUE over this even READ!
it says that they will likely support SM3 in Source when ATI gets SM3 hardware out.

i'm sorry, but if you read that entire article - I would feel entirely duped by ATI to sit here and defend a CORPORATION, a company(!!!!) that offers LESS today in the X800s.. that many bought due to "corporate allegiance" and not the facts, yet will find a nice surprise update to "ATI's game" HL2 when the R520 is out THIS YEAR.
but the same people are buying into this anti-advanced-technology argument that ATI has fed them.

i feel sorry for them, honestly.
my SLI 6800GTs will be gaining a boost in HL2; if they don't upgrade to their favorite company's newest card.. they won't experience it.
it's just blatant fanboyism to not admit this. it's ignorance (Bar81's favorite word he likes to toss around)
at least your favorite company will have you covered.. umm.... "when you need it"
you guys were obviously not around when ATI had PS1.4; we heard the same thing from the NV crowd.. "you don't need it yet"

Total and utter BS.
Now, one of the biggest and best game engines (one that actually matters) is getting SM3.. when ATI has it.
Well, enjoy buying a new video card when you could've paid the same, gotten the same performance, and had partial or full PVP in your current card in your rig right now.

of course ppl like the OP will somehow, somewhere find it in their hearts to defend a corporation who ultimately got beaten to SM3/DX9C,
whatever. Enjoy. :thumbsup:
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
It's the same argument that there was back when the 9700 Pro showed up, only now the shoe is on the other foot. ATi fans screamed about the 9700 Pro having full DX9 support a full year (or more) before any DX9 games were on the market. The nVidia fans screamed that it was useless because no games used DX9 yet, and that when they did, a 9700 Pro wouldn't have the balls to run them. Then a few months after DX9 games started showing up, a new generation of video cards came out that ran all the DX9 games a hell of a lot better than the 9700 Pro. Now SM 3.0 is supported, but there aren't any games that take advantage of what is, in my opinion, the biggest feature of SM3.0: displacement mapping. The ATi fans will say the 6800s aren't fast enough to run a game with AA and AF and displacement mapping and HDR... and the nVidia fans will say at least they have the option.
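For reference, the SM3.0 hook behind displacement mapping is vertex texture fetch: a vs_3_0 shader samples a height map and pushes each vertex out along its normal. A rough D3D9 sketch of the host-side check and setup - the R32F format and point filtering follow the common pattern on this generation of hardware, but details vary by card:

// Hedged sketch: checking for and binding a vertex texture (the SM3.0
// mechanism used for displacement mapping). Names are illustrative.
#include <d3d9.h>

// Can the device sample an R32F texture from the vertex shader at all?
bool SupportsVertexTextureR32F(IDirect3D9* d3d, D3DFORMAT adapterFmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, adapterFmt,
        D3DUSAGE_QUERY_VERTEXTEXTURE, D3DRTYPE_TEXTURE, D3DFMT_R32F));
}

// Bind the height map to vertex-texture sampler 0; a vs_3_0 shader then
// reads it with tex2Dlod and offsets each vertex along its normal.
void BindHeightMap(IDirect3DDevice9* dev, IDirect3DTexture9* heightMap)
{
    dev->SetTexture(D3DVERTEXTEXTURESAMPLER0, heightMap);
    // Vertex texture fetch on this generation generally means point
    // sampling, so request it explicitly.
    dev->SetSamplerState(D3DVERTEXTEXTURESAMPLER0,
                         D3DSAMP_MINFILTER, D3DTEXF_POINT);
    dev->SetSamplerState(D3DVERTEXTEXTURESAMPLER0,
                         D3DSAMP_MAGFILTER, D3DTEXF_POINT);
}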

So who's right? Nobody. Both "sides" bitch about things equally... both "sides" try to belittle the other by either touting or condemning unused features.

What's the solution? Shut up. :D
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Well, to add further to the issue at hand, here's an interesting quote from nvidia's financial report regarding the upcoming nvidia cards:

"Well, from an architecture standpoint we're just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you'll see is a little bit faster versions...

...I think you'll see the industry move up a little bit in performance. But I don't think you'll see any radical changes in architecture. I doubt you'll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we're still in the shader model three type era."

http://www.beyond3d.com/#news20937

I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next-generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next-generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Do you have a single properly functioning brain cell? I would try to explain what was stated in what you quoted, but it is in layman's terms already and I don't speak any more basic a language.

I'm one of the guys backing the "SM 3.0 isn't a big deal" sentiment (although I doubt you were saying that back when the R9700Pro launched, as I was about SM 2.0), but your last post implies you are borderline 'differently abled'.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Humor me, and elaborate. If what you are saying is that you interpret the statement to mean that there will be faster implementations by developers of SM3.0 then I think you're reading it incorrectly.

And second, how do you know what I was saying back in the 9700Pro days? Stop talking out of your *ss.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next-generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Only one thing is certain- owners of current SM3 hardware will be in a better position than owners of SM2 hardware. (ATI)
;)




 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Noob
Another reason people will buy an X800 over a 6800:

http://www.xbitlabs.com/articles/video/display/half-life.html

The 6800 Ultra can't even outperform an X800 Pro in the HL2 benchmarks

OMFG! You mean to tell me the one game ATI actually had some involvement with runs better on ATI cards?!?!?!?!
:roll:

LOL- OK, but the X800Pro got owned at everything else by 6800GTs, so life goes on.

Here's some breaking news for you Noob!
X800 beaten by 6800NU at Doom3!
:Q

LOL
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: Rollo
I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next-generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Only one thing is certain- owners of current SM3 hardware will be in a better position than owners of SM2 hardware. (ATI)
;)


True, but if by analogy the X800 users have no legs and the 6800 users have one leg, I fail to see how either of them is going to compete in the LA marathon.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: keysplayr2003
Originally posted by: Bar81
Because many of you obviously failed English, I'm going to go over this AGAIN:


This thread is simply trying to determine whether SM3.0 is a factor that should be legitimately considered when making a purchasing decision on this generation of cards. To wit, does having SM3.0 confer any sort of tangible advantage such that one could say that having an SM3.0 card is a "must" over a card without the feature? To determine that, I am gathering a list of SM3.0-enabled games currently available and forthcoming in the near future, to allow people to answer the question on their own.

For those with agendas: two cards, same price, same performance, one is SM2.0b, the other is SM3.0. Whether you know anything about SM2.0b and SM3.0 or not, why in the land of F**k would you buy the SM2.0b card?

That's like having two sport utility vehicles in front of you and you need to purchase one of them. They are identical for the most part, except one has 2-wheel drive and the other has all-wheel drive. You tell the car salesman that you will take the two-wheel-drive SUV because it's not snowing TODAY. The snow will fall eventually, Bar.

This is so simplistic it's child's play.

Because people know that the X800 whoops the 6800 in performance and image quality. And they also know SM 3.0 doesn't make an image quality difference, nor is there the power behind the cards to take advantage of it. So it's not as noobishly simple as you state it.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: Rollo
Originally posted by: Noob
Another reason people will buy an X800 over a 6800:

http://www.xbitlabs.com/articles/video/display/half-life.html

The 6800 Ultra can't even outperform an X800 Pro in the HL2 benchmarks

OMFG! You mean to tell me the one game ATI actually had some involvement with runs better on ATI cards?!?!?!?!
:roll:

LOL- OK, but the X800Pro got owned at everything else by 6800GTs, so life goes on.

Here's some breaking news for you Noob!
X800 beaten by 6800NU at Doom3!
:Q

LOL

That's Doom 3, noob. Plus, more recent benchmarks are showing that ATI's new drivers are closing the gap. Those benchmarks were showing the 6800 Ultra Extreme, which it was announced a long time ago had been cancelled. Not to mention ATI has better AA and AF in performance and image quality. So get your facts straight as to which card is better, noob. All the major PC mags and PC websites already show that the X800 is better. So don't debate that. Plus, the X800s run better on Far Cry (a game designed to run on 6800s). See for yourself:

http://www.pcstats.com/articleview.cfm?articleid=1578&page=8

And that was just the X800 XT. The X800 XT PE is 10% better. And there is no doubt that the X800s perform better on Doom 3 than the 6800s do on HL2.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Noob
Originally posted by: Rollo
Originally posted by: Noob
Another reason people will buy an X800 over a 6800:

http://www.xbitlabs.com/articles/video/display/half-life.html

The 6800 Ultra can't even outperform an X800 Pro in the HL2 benchmarks

OMFG! You mean to tell me the one game ATI actually had some involvement with runs better on ATI cards?!?!?!?!
:roll:

LOL- OK, but the X800Pro got owned at everything else by 6800GTs, so life goes on.

Here's some breaking news for you Noob!
X800 beaten by 6800NU at Doom3!
:Q

LOL

That's Doom 3, noob. Plus, more recent benchmarks are showing that ATI's new drivers are closing the gap. Those benchmarks were showing the 6800 Ultra Extreme, which it was announced a long time ago had been cancelled. Not to mention ATI has better AA and AF in performance and image quality. So get your facts straight as to which card is better, noob. All the major PC mags and PC websites already show that the X800 is better. So don't debate that. Plus, the X800s run better on Far Cry (a game designed to run on 6800s). See for yourself:

http://www.pcstats.com/articleview.cfm?articleid=1578&page=8

And that was just the X800 XT. The X800 XT PE is 10% better. And there is no doubt that the X800s perform better on Doom 3 than the 6800s do on HL2.


You don't need to tell me about X800XT PEs Noob. Unlike you, I actually own one.

For a little while longer anyway, it was in my five year old's box, and he mostly games in 256 colors on his "learn to read" and "learn math" games.
(so I sold it to Keysplayr2003!)

BTW- as "reknowned" as PCSTATs is :roll:, here are some very recent Far Cry benches on Firing Squad:
http://www.firingsquad.com/hardware/far_cry_1.3/page9.asp
12X10 4X16X Research
6800GT=60fps X800Pro=54fps D'oh.

http://www.firingsquad.com/hardware/far_cry_1.3/page12.asp
12X10 4X16X Regulator
6800GT=50fps X800Pro=42fps D'oh!!

http://www.firingsquad.com/hardware/far_cry_1.3/page15.asp
12X10 4X16X Training
6800GT=56fps X800Pro= 49fps Na na na na. Na na na na.

http://www.firingsquad.com/hardware/far_cry_1.3/page18.asp
6800GT=59fps X800Pro=52fps Hey heeeeyyyy...Goodbye.

 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
You are comparing a 12 piper to a 16 piper. So of course it will run better. And if you own an X800 XT PE then why did you go the SLI route?
 
Jun 14, 2003
10,442
0
0
Originally posted by: Noob
You are comparing a 12 piper to a 16 piper. So of course it will run better. And if you own an X800 XT PE then why did you go the SLI route?


because Rollo loves his technology... I don't think I know of anyone else who has owned and tested (and supplied benchies to AT forums) as many cards as he has. He has more right to comment on different cards' performance, as chances are he's actually owned them at some point.

and he's not comparing a 12 to a 16 - I'm sure he knows that. He's comparing similarly priced cards.

last I looked, the x800pro was in the same price bracket as the 6800GT
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Noob
You are comparing a 12 piper to a 16 piper. So of course it will run better.
I compared the 6800GT to the card you own because you said "X800s run Far Cry better".
So I pointed out not all of them do, just some of the newer ones.

And if you own an X800 XT PE then why does your sig say you have 2 6800's.
I have two computers, the one in my signature that I primarily use, and a A64 3000+/MSI K8TNEO/1GB Corsair PC3200/Asus X800XT PE that my five year old primarily uses.

Actually I currently own two PCIE 6800NU SLI, two PCIE 6600GT SLI, one AGP 6800NU, one AGP X800XT PE. My five year old will have to "make do" with the 6800NU, the X800XT PE was being underutilized at "Reader Rabbit" and "Sponge Bob Square Pants", so I sold it to Keysplayr2003 cheap because he wanted to try a X800.

 
Jun 14, 2003
10,442
0
0
Originally posted by: Rollo
Originally posted by: Noob
You are comparing a 12 piper to a 16 piper. So of course it will run better.
I compared the 6800GT to the card you own because you said "X800s run Far Cry better".
So I pointed out not all of them do, just some of the newer ones.

And if you own an X800 XT PE then why does your sig say you have 2 6800's.
I have two computers, the one in my signature that I primarily use, and a A64 3000+/MSI K8TNEO/1GB Corsair PC3200/Asus X800XT PE that my five year old primarily uses.

Actually I currently own two PCIE 6800NU SLI, two PCIE 6600GT SLI, one AGP 6800NU, one AGP X800XT PE. My five year old will have to "make do" with the 6800NU, the X800XT PE was being underutilized at "Reader Rabbit" and "Sponge Bob Square Pants", so I sold it to Keysplayr2003 cheap because he wanted to try a X800.


haha that has to be the worst use of an x800xt pe ever lol, i bet reader rabbit ran like an absolute champ!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: otispunkmeyer
Originally posted by: Rollo
Originally posted by: Noob
You are comparing a 12 piper to a 16 piper. So of course it will run better.
I compared the 6800GT to the card you own because you said "X800s run Far Cry better".
So I pointed out not all of them do, just some of the newer ones.

And if you own an X800 XT PE then why does your sig say you have 2 6800's.
I have two computers, the one in my signature that I primarily use, and a A64 3000+/MSI K8TNEO/1GB Corsair PC3200/Asus X800XT PE that my five year old primarily uses.

Actually I currently own two PCIE 6800NU SLI, two PCIE 6600GT SLI, one AGP 6800NU, one AGP X800XT PE. My five year old will have to "make do" with the 6800NU, the X800XT PE was being underutilized at "Reader Rabbit" and "Sponge Bob Square Pants", so I sold it to Keysplayr2003 cheap because he wanted to try a X800.


haha that has to be the worst use of an x800xt pe ever lol, i bet reader rabbit ran like an absolute champ!

:)
Oh yeah, Otis. You know that part where the 2D train goes by and Reader Rabbit has to catch the falling letters?
There were no hesitations or anomalies at all!

I thought about OCing the XT PE when I saw they had a dart gun game in "Veggie Tales Carnival" because I had heard those "first person shooter" games require a LOT of video card power, but I found the darts flew pretty smooth at the game's 640X480X256 native res, so I didn't risk it. ;)

I really wanted to keep that X800XT PE, as it's an Asus and has a Silencer mounted on it and is a pretty damn cool card, but PCIE is the way forward and the computer it's in is in the spare bedroom between mine and my son's. (i.e. no gaming in there after 8PM)

Keys will give it a good home. ;) (and an unlockable 6800NU won't inconvenience my 5 year old any)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: VirtualLarry
Originally posted by: apoppin
The only thing I can say to ease your concerns, VirtualLarry, is to say SM 3.0 will be implemented gradually, just as 2.0 was and 1.4 before it. Game designers know better than to bring out a game with gfx so advanced that no one can play it. ;)

But see, that's the thing. The real value of the SM3.0 spec is that it's presented as a sort of "new universal baseline standard." The idea is that coding for SM3.0 makes the devs' lives easier and lets them get their jobs done faster. If the devs still have to pander to prior specs, and to the whole ATI=24-bit FP precision vs. NV=16/32-bit thing, then nearly the entire value of SM3.0 (to the devs) is lost. Remember, there's nothing (AFAIK) inherent in SM3.0 that cannot be done in SM2.0, in terms of IQ or the eventual final effect - just a (potentially) faster or easier way for the devs to get there. IOW, full, true SM3.0 support is nearly an "SM3.0 or bust" thing.

Now, I don't see game devs ignoring the entire installed base of hardware out there, at least for the near-future crop of games, so I think they will adopt an "SM2.0++" approach: essentially SM2.0, but possibly using branching shader code (SM3.0 required) in certain places where it lets them do things like collapse multiple rendering passes. That would seem to be the most prudent and smartest thing to do. But it also implies that those games will not be using "SM3.0 from the ground up," just SM2.0 with some SM3.0 thrown in - IOW, they wouldn't really be considered "true" SM3.0 games. Once that happens, I feel they will use SM3.0 as the baseline standard and not look back. I see that happening in around 1 - 1.5 years, possibly slightly sooner, depending on how fast the installed base upgrades. The next-gen mid-range parts from both ATI and NV, and their availability and pricing, should be a big factor in the game devs' direction.


Originally posted by: apoppin
SM 3.0 is supposed to make shaders more efficient, not less efficient. Games are typically 1-1/2 to 3 years behind development. We just got SM 3.0 in the 6800 series . . . . we see that we already have an impressive number of titles and the card is only ONE year old . . . . now ADD ATI's support for SM 3.0 and I'd say 'case closed' for 3.0 being quickly adopted.
Most of the "efficiency" of SM3.0 is in terms of the developers, not the low-level hardware execution. It makes it easier for the devs to write the code in the first place; they just leave it up to the hardware guys to "make it faster" in many cases, and that will likely require newer hardware than is currently out. The biggest advantages, IMHO, of ATI adopting SM3.0 are: 1) the devs no longer have to worry about the 16/24/32-bit FP thing - they can just code for 32-bit FP on all hardware, and 2) secondarily, ATI can now "properly" expose their geometry-instancing support in their DirectX drivers.

Originally posted by: apoppin
the final 'kicker' to my case is that it does not "hurt" if your "gamble" is wrong and you have a slow or useless SM 3.0 feature . . . it does not affect the rest of the videocard in any way.
For the most part that's true, speaking of performance only... but you do pay extra up front for the additional cost, so the gamble isn't entirely "free." It's kind of like buying an S939 mobo instead of an S754 one because of rumors that AMD would introduce dual-core S939 CPUs in the future that would run on them. It could well be that, to get "proper" performance from a dual-core chip, you would need to move to DDR2 memory or something; so while it would "run" on your existing board, it wouldn't deliver the level of performance that one would ordinarily expect. I hypothesize that current SM3.0-capable hardware is much the same way.

good morning... i see this is still going, but the fanboys are now giving it a go. :p

Well, VL . . . in looking over your reply to me i see we have similar concerns and appear to agree more than disagree but you appear to be overly concerned that SM 3.0 might be poorly coded . . . and your timeline is just a bit longer than mine . . . . ;)

certainly the current 6800 series will not run SM 3.0 as well as the 6900 series . . . but it's still miles ahead of the x800 series. ;)
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
apoppin, I had posted the list of supported games a LONG time ago, and I even mentioned that HL2 supported SM3.0 (I think.... I know I put up a link to the list, though)

I guess no one read it..... but even if the 6800 doesn't do too well in next-gen games, it's a better choice than an x800 UNLESS you upgrade every year
 
Jun 14, 2003
10,442
0
0
Originally posted by: Rollo
Originally posted by: otispunkmeyer
Originally posted by: Rollo
Originally posted by: Noob
You are comparing a 12 piper to a 16 piper. So of course it will run better.
I compared the 6800GT to the card you own because you said "X800s run Far Cry better".
So I pointed out not all of them do, just some of the newer ones.

And if you own an X800 XT PE then why does your sig say you have 2 6800's.
I have two computers, the one in my signature that I primarily use, and a A64 3000+/MSI K8TNEO/1GB Corsair PC3200/Asus X800XT PE that my five year old primarily uses.

Actually I currently own two PCIE 6800NU SLI, two PCIE 6600GT SLI, one AGP 6800NU, one AGP X800XT PE. My five year old will have to "make do" with the 6800NU, the X800XT PE was being underutilized at "Reader Rabbit" and "Sponge Bob Square Pants", so I sold it to Keysplayr2003 cheap because he wanted to try a X800.


haha that has to be the worst use of an x800xt pe ever lol, i bet reader rabbit ran like an absolute champ!

:)
Oh yeah, Otis. You know that part where the 2D train goes by and Reader Rabbit has to catch the falling letters?
There were no hesitations or anomalies at all!

I thought about OCing the XT PE when I saw they had a dart gun game in "Veggie Tales Carnival" because I had heard those "first person shooter" games require a LOT of video card power, but I found the darts flew pretty smooth at the game's 640X480X256 native res, so I didn't risk it. ;)

I really wanted to keep that X800XT PE, as it's an Asus and has a Silencer mounted on it and is a pretty damn cool card, but PCIE is the way forward and the computer it's in is in the spare bedroom between mine and my son's. (i.e. no gaming in there after 8PM)

Keys will give it a good home. ;) (and an unlockable 6800NU won't inconvenience my 5 year old any)


yes, very cool card. my friend has one too - he's been hooked on ATi ever since he bought the 9700pro when it came out. he's the type of guy that has to have the best and spends daft amounts to get it. he's going SLI next time he upgrades, but for now his rig is AGP (jumped on the A64 bandwagon when they first came out) and it's soo sweet at every game.

640x480 x 256 colours!! beautiful!