Why go with SM3.0 today?


Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
SM3.0 is the FSAA of the GF2 era.

1) That obviously hasn't stopped Ubisoft. I will remind you that Splinter Cell is either SM1.1 (which even GF3s and Radeon 8500s can do) or SM3.0 (which only the GeForce 6 series can do). In this game, ATI owners have nothing special over much older hardware bar pure speed, that's it. It's just trumped-up old technology.

So one company gets paid by Nvidia and leaves out the standard, 2.0, and that's a win for who? Nvidiots? I guess you have to brag about something, even if it's not much.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Why don't people understand that it's NOT just ATi owners who get the shaft? The NV 5200-5950 owners get screwed as well. The 6x series doesn't have NEARLY as large a user base as the FX series does. Cards that can do PS/SM3.0 are in a very, very, very small number compared to cards that can do PS2.0, for now.

While I am happy they are programming for the future, screwing past card owners is just bad business. And before someone comes in and says "upgrade your old 2003 tech", tell me why they supported 2000 tech instead? I have yet to see a reason from Ubisoft about this. My guess (hope) is that they add PS2.0 in a patch later down the road. Not for me, as I don't play the game, but for the many, many users who would benefit from it.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Todd33
SM3.0 is the FSAA of the GF2 era.

1) That obviously hasn't stopped Ubisoft. I will remind you that Splinter Cell is either SM1.1 (which even GF3s and Radeon 8500s can do) or SM3.0 (which only the GeForce 6 series can do). In this game, ATI owners have nothing special over much older hardware bar pure speed, that's it. It's just trumped-up old technology.

So one company gets paid by Nvidia and leaves out the standard, 2.0, and that's a win for who? Nvidiots? I guess you have to brag about something, even if it's not much.

This post is typical of misinformed people getting the facts surrounding SM3.0 totally wrong.

SM3.0 is NOT "nVidia's model" or "not the standard". It has been the standard since DX9.0c was released; it is Microsoft's model, which nVidia has implemented and ATi has not.
 

imported_Reverend

Junior Member
Apr 26, 2005
17
0
0
Originally posted by: Pete
Rev, I'm surprised to see you registered here just to set the local wildlife straight. I thought you were supposed to be busy with other projects? :)

I am. But you know me... there are always those instances where I scold my 6-year-old son's friends. I have my "can't... resist... it... must... reply..." moments.

Anyway, I hadn't looked at the post that started this thread. Here's what it said:

Originally posted by: X
Beyond3D has a nice comparison of SM3.0 with SM1.1 here. What I find interesting is that 3.0 is touted as being more efficient. However, if you enable all of its additional features, you end up being unable to render them at an acceptable frame rate with today's technology.

So why get SM3.0 today when cards aren't fast enough to take advantage of its features? Not to say there aren't other reasons to choose Nvidia over ATI (I've had both and have no particular brand loyalty). I just don't understand why people advocate SM3.0 as an advantage of the 6800 series, given that today's cards can't take advantage of it. Am I missing something?

1) The word "efficient" was mentioned. And it is correct, up to a degree. It depends on what a programmer seeks. For SC:CT, a couple of examples of what is meant by "efficiency":

(a) In SM3 there are fewer draw calls than in 1.1, because more lights can be processed in a single pass (unrestricted by shader instruction count limits). That means it's more efficient from both a CPU and a GPU point of view (less redundant work). A rough sketch of this follows below.

(b) Static branching is used in SM3 to greatly simplify some of the game's shaders. The shader combinations of SC:CT's uber-shaders that are actually used are instantiated during the "Caching SM3 shaders" phase (what you see on screen whenever you load a level/savegame on the SM3 path), to avoid runtime stalls caused by the unified compiler's runtime compilation. That idea is sketched further below as well.
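To make (a) a bit more concrete, here is a rough CPU-side sketch in C++ (all names and numbers are invented for illustration; this is not SC:CT's or Ubisoft's actual code). The SM1.1-style path has to issue one additive pass per light, while an SM3-style path can loop over the lights inside the shader and draw each object once:

// Hypothetical, simplified render loop -- NOT the game's real code.
// Shows why an "all lights in one pass" SM3-style shader issues far fewer
// draw calls than an SM1.1-style "one pass per light" path.
#include <cstdio>
#include <vector>

struct Light { float x, y, z, radius; };

// Stand-in for a real graphics API draw call; here it just counts invocations.
static int g_drawCalls = 0;
void DrawMesh() { ++g_drawCalls; }

// SM1.1-style: instruction limits force one additive pass per light.
void DrawObjectSM11(const std::vector<Light>& lights) {
    for (size_t i = 0; i < lights.size(); ++i) {
        // bind this light's shader constants here...
        DrawMesh();
    }
}

// SM3-style: a single pass loops over all the lights inside the shader.
void DrawObjectSM3(const std::vector<Light>& lights) {
    (void)lights; // the whole light array would be uploaded as constants here
    DrawMesh();
}

int main() {
    std::vector<Light> lights(4);   // four lights touching each object
    const int objects = 250;        // objects visible this frame

    g_drawCalls = 0;
    for (int i = 0; i < objects; ++i) DrawObjectSM11(lights);
    std::printf("SM1.1-style path: %d draw calls\n", g_drawCalls);  // 1000

    g_drawCalls = 0;
    for (int i = 0; i < objects; ++i) DrawObjectSM3(lights);
    std::printf("SM3-style path:   %d draw calls\n", g_drawCalls);  // 250
    return 0;
}

Fewer draw calls means less CPU time spent in the driver and less redundant vertex work on the GPU, which is what "efficient" means here.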

Do these mean you're gonna get "acceptable framerates"? No, it means exactly what it means -- using SM3 is more efficient than SM1.1. Whether you get "acceptable framerates" is another matter altogether. "More efficient" does not mean "acceptable framerates". It means "less sh*tty framerates". There's a difference.
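And for (b), a similarly hypothetical sketch of what a "Caching SM3 shaders" phase amounts to (again, invented names, not the game's real code): every uber-shader feature combination a level actually uses gets compiled up front at load time, so during gameplay the variants are only looked up, never compiled:

// Hypothetical uber-shader variant cache -- illustration only.
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <vector>

// Feature bits selected via static branching in the uber-shader.
enum : std::uint32_t { kShadowMap = 1, kParallax = 2, kHDR = 4 };

struct CompiledShader { std::uint32_t flags; /* ...driver handle... */ };

class ShaderCache {
public:
    // In reality the "compile" step is the expensive driver/compiler call
    // that would stall the game if it happened mid-frame.
    const CompiledShader& Get(std::uint32_t flags) {
        auto it = cache_.find(flags);
        if (it == cache_.end()) {
            std::printf("compiling variant 0x%X\n", (unsigned)flags);
            it = cache_.emplace(flags, CompiledShader{flags}).first;
        }
        return it->second;
    }
private:
    std::unordered_map<std::uint32_t, CompiledShader> cache_;
};

int main() {
    ShaderCache cache;

    // "Caching SM3 shaders" phase at level load: warm every combination used.
    std::vector<std::uint32_t> usedByLevel = {
        kShadowMap, kShadowMap | kHDR, kShadowMap | kParallax | kHDR};
    for (std::uint32_t flags : usedByLevel) cache.Get(flags);

    // During gameplay the same variants are just looked up -- no stall.
    cache.Get(kShadowMap | kHDR);
    return 0;
}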

2) "Unable to render at reasonable framerate when all SM3 features are enabled" was mentioned. This is a complaint that deserves a big "DUH!". Every single next-gen hardware on its debut will not render anything at "acceptable framerates" (even if you used the fastest CPU... remember, we're talking TnL-shader-capable hardware here) if many (I'm not even saying all) of its next-gen features are enabled. Oh please, c'mon folks...

3) The important question mentioned by the poster: "So why get SM3.0 today when cards aren't fast enough to take advantage of its features?"

Because the currently available mid-to-high-end SM3.0 cards can run "SM2.0 games" competitively with other SM2.0-only offerings. IOW, in "SM2.0 apps/games", what has an SM3.0 card got to lose?

NOTE : Please remember point #2 above. Think logically. Know how developers do their business.

4) Quoted: "I just don't understand why people advocate SM3.0 as an advantage of the 6800 series, given that today's cards can't take advantage of it."

Please read points #1, 2 and 3 above. I'll understand the mistake of using the words "today's cards" instead of "today's games". Again, read points #1, 2 and 3.

If you still don't understand why, at this time, an SM3.0 card makes sense if you're comparing it to an SM2.0 card you already own (I assume the bolded words really are what's important, and not the general sense of "SM3.0 now or later?"), then I won't offer any recommendations.

No wait, haven't I already done that?

5) Finally, the fact that the B3D "article" is the basis for starting this thread.

Everything (well, I would say 90% of it, excepting the screenies, which should originate from B3D) in that "article" is basically a public version of what Ubisoft told B3D via emails. Ubisoft, seeing B3D as a media outlet, probably does not see any harm in "using" B3D to tell the public what SC:CT does graphically. The article itself tells it like it is, as informed by Ubisoft to B3D -- all (or 90%) of the bullet-point features of the game are provided by Ubisoft, the feature-specific explanations are provided by Ubisoft, and other comments in the article are either provided by Ubisoft or prompted by comments provided by Ubisoft. The article, basically, could've been written by Ubisoft; 90% of the article is facts about the game, as provided by Ubisoft to B3D. I should know... I did it before for B3D, and (no disrespect intended towards Nick) Nick simply isn't capable enough (3D-wise) nor knowledgeable enough about the game (without Ubi's contribution) to write that article without any sort of help. Anyone who says B3D is "biased" for/against any IHV based solely on the content of that article is using their preconceptions of B3D (rightly, wrongly... I don't really give a sh*t) to divert readers of this thread to something that belongs in another forum, and away from the subject matter raised by the starter of this thread.

End of my comments on the post that started this thread.

I think everyone should stop talking about which media outlet's head-honcho is biased. We will always have different opinions.

What you guys should attempt to find out is how the game development business is run in certain development houses and/or in certain publishing companies. Only then will you realize how this really determines which video card you should buy, and not by the way you perceive comments by a media outlet's reviewer in a review of a video card. In a media outlet's review of a video card, look at the settings used and the resultant framerates. That is basically all you need. Not the conclusion, not the odd comment about certain behavioural aspects of a video chip (the game developers know damned well better about this aspect than a media outlet video card reviewer!)... just the facts.

Unless of course, that reviewer happens to be me. :) :) :)
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Great post, Rev. If only this thread hewed closer to intelligent analysis rather than premature quips and half-baked retorts. Anyway, case closed for me. Hopefully X got something useful out of this.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: otispunkmeyer
Originally posted by: Drayvn
Originally posted by: munky
Originally posted by: Ackmed
Originally posted by: BenSkywalker
Rollo, care to explain to me how a 6800U 512MB is faster than an X850XT/PE 256MB?

Reading comprehension needs to be bumped to one of our top national priorities-

2- 512MB 6800 Ultras
2- 256MB 6800 Ultras
2- 256MB 6800 GTs
1- X850XTPE

The X850XTPE is fourth, which is what Rollo was saying - actually he quite clearly stated just that.

He did not specify SLI, just said, "Edit: Oops. I forgot that 512MB 6800Us are available now, so shift everything down one.
"4th best" is not the "best of everything"."

So perhaps you should take your own advice. As you can see, he did not specify SLI. In fact, SLI is nowhere in his edit with the 512 6800U comment.

About the whole memory thing, I noticed that cards are usually best when paired with their original amount of video RAM. For example, 64MB on the GF4 4200, 128MB on the 9800 Pro or 6600GT, and 256MB for the X800/6800 series. Adding more memory to the above cards usually doesn't help because they're not fast enough to use it effectively. So, applying the same pattern, a 512MB 6800 or X800 card would be a waste of money.

Also, Munky and everyone else, I don't know if you read up on the difficulties ATi had making their 512MB cards, but they had to make some quite major changes to their architecture to decrease the latency on those cards. The only way they could fit 512MB on the card was by using double-sided RAM, and the latency between each side was very high, so they had to do a lot of work to get it performing well.

To me, nVidia gives their vendors a lot of room to decide how they want to make their cards and what they put on them. The thing is, vendors don't have the means to change the architecture and memory controllers to work well with the double-sided RAM that I expect the nVidia 512MB cards have. Unless I'm totally wrong.

And also, if I'm right, nVidia hasn't released a 512MB card itself, only their vendors have.


Yeah, I remember reading this. They had to do some complicated work to bring the latencies back down; their card/PCB design just wasn't designed to accommodate 512MB of memory. Nvidia, on the other hand, had the traces for 512MB of VRAM in the design from the kickoff.

Aha!

Thanks for clearing that up, otis. I didn't know that nVidia already had room for 512MB on their cards and had plans for it for their vendors as well. Cheers for the heads up!

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Why don't people understand that it's NOT just ATi owners who get the shaft? The NV 5200-5950 owners get screwed as well. The 6x series doesn't have NEARLY as large a user base as the FX series does. Cards that can do PS/SM3.0 are in a very, very, very small number compared to cards that can do PS2.0, for now.

While I am happy they are programming for the future, screwing past card owners is just bad business. And before someone comes in and says "upgrade your old 2003 tech", tell me why they supported 2000 tech instead? I have yet to see a reason from Ubisoft about this. My guess (hope) is that they add PS2.0 in a patch later down the road. Not for me, as I don't play the game, but for the many, many users who would benefit from it.

Most of us understand that there are costs associated with investing in old tech, or refusing to update to new tech.

SM3 has been MS's standard for a year now. I'm guessing console sales of the game are larger than PC sales, and they did SM1.1 first.

Again guessing, they probably made a business decision and balanced increased programming time/time to market against the potential loss of sales to non-SM3 card owners. They probably assumed people with SM2 cards who really wanted the game would buy it anyway and deal with the banding, etc.

In any case, all this howling about "the injustice of it all" is ridiculous. Developers are free to code for the past/present as they see fit. Gamers are free to buy hardware for the past/present as they see fit.

Anyone who bought an X800 anything can say nothing about this, because you (hopefully) knew going into your purchase that the checklist of hardware features on those cards is sorely lacking and hasn't really changed since 2003. You all ignored this, and assumed developers would try to retro-code everything to work with DX9 SM2"B" because "it could be done".

Guess what? Like I (and others) have been telling you for a year, programming games is a business, nVidia has the developer relations, and there are going to be things you don't get to see because you thought 2003 was "good enough" for 2005.

Tired of saying "Vivendi has no soft shadows for me in Riddick?", "Far Cry has no HDR for me", and "SC:CT looks like crap, all banded"? Sell your X800 while it still has some value and buy a 6800Ultra.
$425 for a Leadtek 6800U
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ackmed
Originally posted by: BenSkywalker
He did not specify SLI, just said, "Edit: Oops. I forgot that 512MB 6800Us are available now, so shift everything down one.
"4th best" is not the "best of everything"."

So perhaps you should take your own advice. As you can see, he did not specify SLI. In fact, SLI is nowhere in his edit with the 512 6800U comment.

And I say yet again, reading comprehension has tanked-

Edit: Oops. I forgot that 512MB 6800Us are available

I bolded the relevant part. He clearly implied that he was talking about an SLI setup - it is clear the 's' wasn't accidental, as he followed it with 'are' instead of 'is'.

I don't like to assume what other people mean. It's a bad habit.

As I said, he didn't say 512 6800U SLI, so I didn't assume he meant it. Assuming can get you in trouble. Get over yourself, and move on.

Originally posted by: keysplayr2003


It would have been nice if you had the presence of mind to just "know" what he meant. We all did. You did not.

Read up.

Your hopeless dude. Not a nice bone in your body. Your tone is always malignant.

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: housecat
Actually I do remember the thread, that's why I brought it up.
But popular opinion is not always right, Creig. Just because you can get people to click "ATI" in a poll doesn't make it right.

I don't recall anybody putting a gun to people's heads and telling them to vote ATI. They did it on their own because the vast majority of people would take the X850XT PE over the 6800U. Seems pretty clearcut.


If you read the thread, you'll find the most rational, sound and logical posters lean towards the Ultra.

????? I can't even BEGIN to understand where you came up with THAT statement.


I lean more along the lines of the Ultra, but between those two... a trade isn't really necessary. I'd probably bother trading the X850 for the Ultra, but not the other way around.

And for every person who feels the way you do, there are TWO people who feel just the opposite.


Guys that use X850XT PEs are probably going to upgrade next gen anyway.

The X850XT PE has enough raw horsepower to last for a couple of years if need be.


But I would never get one myself; I'd rather have dual 6800GTs or Ultras.

That's your personal preference and there's nothing wrong with that.


I gotta get back to Warcraft, but your benchmarks are from TOMS, for jeebuz christ man.
Get that out of here.
edit - Rollo bothered to actually look at your TOMS benchmarks. I didn't; sounds like they were a bad choice... 1024x768? What's wrong with you?
No one buys that level of hardware to play at 1024.

I saw Rollo quoting benchmarks from THQ so I thought I'd do the same. I just included ALL of them and tallied them up, that's all. I didn't make them.


Creig, you show so much bias it's sick. I may like what I like, but I'm not going to turn into some deceptive creep to make NV look better than it really is. Knock that crap off.

Don't even THINK of accusing me of bias when you're so pathetically pro-NV/anti-ATI. When people ask what card to buy, I give them my honest recommendation which is currently an even mix of Nvidia and ATI products. Check my past posts if you don't believe me.

I base my opinions on reviews, benchmarks and people's personal experiences. Not by looking at the box for a "The Way It's Meant To Be Played" sticker the way you do.
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: Ackmed
Why don't people understand that it's NOT just ATi owners who get the shaft? The NV 5200-5950 owners get screwed as well. The 6x series doesn't have NEARLY as large a user base as the FX series does. Cards that can do PS/SM3.0 are in a very, very, very small number compared to cards that can do PS2.0, for now.

While I am happy they are programming for the future, screwing past card owners is just bad business. And before someone comes in and says "upgrade your old 2003 tech", tell me why they supported 2000 tech instead? I have yet to see a reason from Ubisoft about this. My guess (hope) is that they add PS2.0 in a patch later down the road. Not for me, as I don't play the game, but for the many, many users who would benefit from it.


To be honest, GeForce FX owners got the shaft on SM2 also.

Nvidia has Ubisoft in that TWIWMTBP scheme, so I presume Nvidia would have preferred it do things the way Nvidia wanted.

The FX series is terrible at SM2 and can't do SM3, so SM1.1 makes sense.
The 6 series is good at SM2, but can also run decently with the extra features provided by SM3, so given the performance is acceptable, SM3 is the choice (SM3 is also a selling point of the card, so Nvidia are just giving their consumers something with which they can exploit this selling point).

And to be honest, if you've got a big company giving you money, you don't turn around and tell them to cock off when they tell you what they want doing.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Creig, being that the 6800U and X800XTPE are very close to each other in performance, give or take depending on game, which card do you think is the better card?

Disclaimers:

Not considering price
Not considering speed
Not considering SLI
Not considering Bang for Buck

Objective: To see if Creig is honest with himself and us.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: otispunkmeyer
Originally posted by: Ackmed
Why don't people understand that it's NOT just ATi owners who get the shaft? The NV 5200-5950 owners get screwed as well. The 6x series doesn't have NEARLY as large a user base as the FX series does. Cards that can do PS/SM3.0 are in a very, very, very small number compared to cards that can do PS2.0, for now.

While I am happy they are programming for the future, screwing past card owners is just bad business. And before someone comes in and says "upgrade your old 2003 tech", tell me why they supported 2000 tech instead? I have yet to see a reason from Ubisoft about this. My guess (hope) is that they add PS2.0 in a patch later down the road. Not for me, as I dont play the game, but for the many, many users who would benefit from it.


To be honest, GeForce FX owners got the shaft on SM2 also.

Nvidia has Ubisoft in that TWIWMTBP scheme, so I presume Nvidia would have preferred it do things the way Nvidia wanted.

The FX series is terrible at SM2 and can't do SM3, so SM1.1 makes sense.
The 6 series is good at SM2, but can also run decently with the extra features provided by SM3, so given the performance is acceptable, SM3 is the choice (SM3 is also a selling point of the card, so Nvidia are just giving their consumers something with which they can exploit this selling point).

And to be honest, if you've got a big company giving you money, you don't turn around and tell them to cock off when they tell you what they want doing.

Just to let everyone know (not that you're saying this), the 6 series did NOT pull off an SM2 trick like the FX series did. The 6 series has full SM3.0 and can do everything with it that SM3 can do. The next-gen ATI cards will of course have optimized SM3.0, because it will have been around longer by then. Someone said that ATI is "correctly" using SM3.0 and that the 6xxx cards don't... that isn't true. The 6xxx cards use it as it should be used (not like the FX cards, as you might think), BUT the ATI cards will have it optimized. Oh, and no one said that on these forums (that I am aware of), so I'm not saying that any of you said that.

To be honest, X800 card owners will have to fork out money for next-gen games sooner than 6600GT-6800 Ultra owners. That does not mean the 6xxx cards will run UE3 games amazingly, just better than X800 cards. If it were up to me, I'd save up for the next-gen ATI/Nvidia cards because they'll be awesome (I hope), but my computer is limited and so is my money.

Time to actually do school work...:(

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: keysplayr2003
Creig, being that the 6800U and X800XTPE are very close to each other in performance, give or take depending on game, which card do you think is the better card?

Disclaimers:

Not considering price
Not considering speed
Not considering SLI
Not considering Bang for Buck

Objective: To see if Creig is honest with himself and us.

So if I can't take into consideration "Price", "Speed", "SLI", or "Bang for Buck", what exactly am I supposed to base my decision on? PCB color? Prettiest fan sticker? What?

As I said, I base my opinions on all relevant factors, not just selected highlights.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Creig
Originally posted by: keysplayr2003
Creig, being that the 6800U and X800XTPE are very close to each other in performance, give or take depending on game, which card do you think is the better card?

Disclaimers:

Not considering price
Not considering speed
Not considering SLI
Not considering Bang for Buck

Objective: To see if Creig is honest with himself and us.

So if I can't take into consideration "Price", "Speed", "SLI", or "Bang for Buck", what exactly am I supposed to base my decision on? PCB color? Prettiest fan sticker? What?

As I said, I base my opinions on all relevant factors, not just selected highlights.

Never mind. If you insist on playing dumb, we have nothing further to discuss. LOL, PCB color. What a joke. And yet again, you squirm to avoid answering a direct question with more questions. Most of us know, and are pretty convinced of, what your answer should have been. So it wasn't entirely necessary for you to answer at all. It was semi-rhetorical, I guess.

Because actually, speed is "equivalent" on a 6800U and an X800XTPE.
Because actually, price is "equivalent" on a 6800U and an X800XT (the X800XTPE is not offered in PCI-E; the X800XT PCI-E is 500/1000, the X800XTPE 520/1120, so the plain XT is slower yet the same price as the 6800U). Both PCI-E.
So there is no bang-for-buck issue to be had.
And SLI is in NV's favor, so I said don't bother factoring that in.

I didn't say to ignore these things, just don't factor them in because they have no real differences, besides SLI which you can omit completely.

So now that you know the "disclaimers" are, if anything, in ATI's favor, what say you now? Which is the better card? I'm not asking IF all other things are equal. I'm saying they ARE equal.

 

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
Originally posted by: Gstanfor
Originally posted by: Todd33
SM3.0 is the FSAA of the GF2 era.

1) That obviously hasn't stopped Ubisoft. I will remind you that Splinter Cell is either SM1.1 (which even GF3s and Radeon 8500s can do) or SM3.0 (which only the GeForce 6 series can do). In this game, ATI owners have nothing special over much older hardware bar pure speed, that's it. It's just trumped-up old technology.

So one company gets paid by Nvidia and leaves out the standard, 2.0, and that's a win for who? Nvidiots? I guess you have to brag about something, even if it's not much.

This post is typical of misinformed people getting the facts surrounding SM3.0 totally wrong.

SM3.0 is NOT "nVidia's model" or "not the standard". It has been the standard since DX9.0c was released; it is Microsoft's model, which nVidia has implemented and ATi has not.

I know, but who gets to gloat by paying Ubi for using SM3.0 and dropping 2.0, which is supported by 50x as many cards? Nvidia. So until next gen, SM3.0 is a) slow and b) marketing for Nvidia.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
PS3.0 isn't slow for me; there are benches here on a thread showing only a few FPS drop from PS2.0 to PS3.0, and then PS3.0 with HDR on. If you ain't got a card in front of you, you won't have a clue.

Here, download this movie. I'm not sure if you will see the effects on ATI cards:

http://www.ngohq.com/files.php?go=giveme&dwn_id=30

It's the new HL2 add-on called Lost Coast.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: keysplayr2003
Never mind. If you insist on playing dumb, we have nothing further to discuss. LOL, PCB color. What a joke. And yet again, you squirm to avoid answering a direct question with more questions. Most of us know, and are pretty convinced of, what your answer should have been. So it wasn't entirely necessary for you to answer at all. It was semi-rhetorical, I guess.

Because actually, speed is "equivalent" on a 6800U and an X800XTPE.
Because actually, price is "equivalent" on a 6800U and an X800XT (the X800XTPE is not offered in PCI-E; the X800XT PCI-E is 500/1000, the X800XTPE 520/1120, so the plain XT is slower yet the same price as the 6800U). Both PCI-E.
So there is no bang-for-buck issue to be had.
And SLI is in NV's favor, so I said don't bother factoring that in.

I didn't say to ignore these things, just don't factor them in because they have no real differences, besides SLI which you can omit completely.

So now that you know the "disclaimers" are, if anything, in ATI's favor, what say you now? Which is the better card? I'm not asking IF all other things are equal. I'm saying they ARE equal.

If everything else were equal, then OBVIOUSLY the best choice would be the card with the best/most features, i.e. the 6800U. If you had bothered looking at any of my previous recommendations, your rhetorical question would have been unnecessary.

$200 range = 6600GT
$300 range PCI-E = X800XL
$300 range AGP = 6800GT
$400+ range = either X850XT PE or 6800U.

What exactly do you find so biased about my recommendations?


Now how about YOU answer a question. Based on the majority of benchmarks, what is currently the fastest single video card available? You, Rollo, housecat and other Nv fans seem to either ignore this one or change the subject when it's brought up.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Creig
Originally posted by: keysplayr2003
Never mind. If you insist on playing dumb, we have nothing further to discuss. LOL, PCB color. What a joke. And yet again, you squirm to avoid answering a direct question with more questions. Most of us know, and are pretty convinced of, what your answer should have been. So it wasn't entirely necessary for you to answer at all. It was semi-rhetorical, I guess.

Because actually, speed is "equivalent" on a 6800U and an X800XTPE.
Because actually, price is "equivalent" on a 6800U and an X800XT (the X800XTPE is not offered in PCI-E; the X800XT PCI-E is 500/1000, the X800XTPE 520/1120, so the plain XT is slower yet the same price as the 6800U). Both PCI-E.
So there is no bang-for-buck issue to be had.
And SLI is in NV's favor, so I said don't bother factoring that in.

I didn't say to ignore these things, just don't factor them in because they have no real differences, besides SLI which you can omit completely.

So now that you know the "disclaimers" are, if anything, in ATI's favor, what say you now? Which is the better card? I'm not asking IF all other things are equal. I'm saying they ARE equal.

If everything else were equal, then OBVIOUSLY the best choice would be the card with the best/most features, i.e. the 6800U. If you had bothered looking at any of my previous recommendations, your rhetorical question would have been unnecessary.

$200 range = 6600GT
$300 range PCI-E = X800XL
$300 range AGP = 6800GT
$400+ range = either X850XT PE or 6800U.

What exactly do you find so biased about my recommendations?


Now how about YOU answer a question. Based on the majority of benchmarks, what is currently the fastest single video card available? You, Rollo, housecat and other Nv fans seem to either ignore this one or change the subject when it's brought up.

I found nothing wrong with your recommendations. I just asked you a question. Nothing to do with your recommendations. To answer your question:

The fastest single card solution out there right now is the ATI X850XTPE.
For the most part, and in the majority of games, it's the X850XTPE. I never said they were slow. ;)

 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: humey
PS3.0 isn't slow for me; there are benches here on a thread showing only a few FPS drop from PS2.0 to PS3.0, and then PS3.0 with HDR on. If you ain't got a card in front of you, you won't have a clue.

Here, download this movie. I'm not sure if you will see the effects on ATI cards:

http://www.ngohq.com/files.php?go=giveme&dwn_id=30

It's the new HL2 add-on called Lost Coast.


170MB for what amounts to a not very informative video? That was poor, man! I want to play the new level, not look at it! :)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Todd33
Originally posted by: Gstanfor
Originally posted by: Todd33
SM3.0 is the FSAA of the GF2 era.

1) That obviously hasn't stopped Ubisoft. I will remind you that Splinter Cell is either SM1.1 (which even GF3s and Radeon 8500s can do) or SM3.0 (which only the GeForce 6 series can do). In this game, ATI owners have nothing special over much older hardware bar pure speed, that's it. It's just trumped-up old technology.

So one company gets paid by Nvidia and leaves out the standard, 2.0, and that's a win for who? Nvidiots? I guess you have to brag about something, even if it's not much.

This post is typical of misinformed people getting the facts surrounding SM3.0 totally wrong.

SM3.0 is NOT "nVidia's model" or "not the standard". It has been the standard since DX9.0c was released; it is Microsoft's model, which nVidia has implemented and ATi has not.

I know, but who gets to gloat by paying Ubi for using SM3.0 and dropping 2.0, which is supported by 50x as many cards? Nvidia. So until next gen, SM3.0 is a) slow and b) marketing for Nvidia.

Why shouldn't nVidia gloat about SM3.0? They put in the work to support it, they helped developers support it and bring it to gamers. They have earned the right to market it and gloat about it. ATi could have done exactly the same, but they were too busy milking the R3xx architecture to worry about implementing the standard of the day.

And nVidia's SM3.0 isn't slow - it's the fastest currently on the market...
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: BenSkywalker
He did not specify SLI, just said, "Edit: Oops. I forgot that 512MB 6800Us are available now, so shift everything down one.
"4th best" is not the "best of everything"."

So perhaps you should take your own advice. As you can see, he did not specify SLI. In fact, SLI is nowhere in his edit with the 512 6800U comment.

And I say yet again, reading comprehension has tanked-

Edit: Oops. I forgot that 512MB 6800Us are available

I bolded the relevant part. He clearly implied that he was talking about an SLI setup - it is clear the 's' wasn't accidental, as he followed it with 'are' instead of 'is'.


Looking at Rollo's statement, it would make the most sense that he was simply referring to the fact that the card is available. I don't see any trace of an SLI implication in that sentence. The 's' was not accidental, true. Saying "6800s is available" is incorrect grammar, just like saying "that 6800 are available". He could have just as easily said that the 512MB 6800U is available, but people do refer to cards in the plural, increasingly so when speaking in the past tense. Imagine that.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: keysplayr2003
I found nothing wrong with your recommendations. I just asked you a question. Nothing to do with your recommendations.

So why did you ask me your question in the first place? As I've said before, I'm a "best bang for the buck" kinda guy. Doesn't matter whether it's an Nvidia card or an ATi card.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Gstanfor
Originally posted by: Todd33
Originally posted by: Gstanfor
Originally posted by: Todd33
SM3.0 is the FSAA of the GF2 era.

1) That obviously hasn't stopped Ubisoft. I will remind you that Splinter Cell is either SM1.1 (which even GF3s and Radeon 8500s can do) or SM3.0 (which only the GeForce 6 series can do). In this game, ATI owners have nothing special over much older hardware bar pure speed, that's it. It's just trumped-up old technology.

So one company gets paid by Nvidia and leaves out the standard, 2.0, and that's a win for who? Nvidiots? I guess you have to brag about something, even if it's not much.

This post is typical of misinformed people getting the facts surrounding SM3.0 totally wrong.

SM3.0 is NOT "nVidia's model" or "not the standard". It has been the standard since DX9.0c was released; it is Microsoft's model, which nVidia has implemented and ATi has not.

I know, but who gets to gloat by paying Ubi for using SM3.0 and dropping 2.0, which is supported by 50x as many cards? Nvidia. So until next gen, SM3.0 is a) slow and b) marketing for Nvidia.

Why shouldn't nVidia gloat about SM3.0? They put in the work to support it, they helped developers support it and bring it to gamers. They have earned the right to market it and gloat about it. ATi could have done exactly the same, but they were too busy milking the R3xx architecture to worry about implementing the standard of the day.

And nVidia's SM3.0 isn't slow - it's the fastest currently on the market...

It's actually the only one on the market. hehe. ;) At least that I know of.

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Avalon sees the light.

Originally posted by: keysplayr2003

Your hopeless dude. Not a nice bone in your body. Your tone is always malignant.


My response was far from malignant.

However, your post was.

Originally posted by: keysplayr2003

It would have been nice if you had the presence of mind to just "know" what he meant. We all did. You did not.

If I were going to be mean, I would tell you that you shouldn't be using words like that if you don't understand the simple difference between "your" and "you're". How's that? :)