The Ultimate SM3.0 Game Thread

Page 11

lotus503

Diamond Member
Feb 12, 2005
6,502
1
76
"That's Doom 3, noob. Plus more recent benchmarks are showing that ATI's new drivers are closing the gap. Those benchmarks were showing the 6800 Ultra Extreme, which it was announced a long time ago they cancelled. Not to mention ATI has better AA and AF in performance and image quality. So get your facts straight as to which card is better, noob. All the major PC mags and PC websites already show that the X800 is better. So don't debate that. Plus the X800's run better on Far Cry (a game designed to run on 6800's)"



^Fanboi.
 
Jun 14, 2003
10,442
0
0
Originally posted by: S0Lstice
"That's Doom 3, noob. Plus more recent benchmarks are showing that ATI's new drivers are closing the gap. Those benchmarks were showing the 6800 Ultra Extreme, which it was announced a long time ago they cancelled. Not to mention ATI has better AA and AF in performance and image quality. So get your facts straight as to which card is better, noob. All the major PC mags and PC websites already show that the X800 is better. So don't debate that. Plus the X800's run better on Far Cry (a game designed to run on 6800's)"



^Fanboi.


blatantly so!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: hans030390
apoppin, i had posted the list of supported games a LONG time ago, and i even mentioned that HL2 supported SM3.0 (i think....i know i put up a link to the list though)

i guess no one read it.....but even if the 6800 doesn't do too well in next gen games, it's a better choice than an x800 UNLESS you upgrade every year

i never saw your list . . . . i just got 'lucky' on an independent search . . . . .i am not surprised it was missed . . . . in this heated discussion, i had to repeat myself about a dozen times to be understood.

strange . . . we agree.

:)

and i DO believe the 6800 will do "great" with SM 3.0 enabled games UNTIL dx10 . . . IF 'history' is any indication of the 'future'. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Noob
Originally posted by: keysplayr2003
Originally posted by: Bar81
Because many of you obviously failed English, I'm going to go over this AGAIN:


This thread is simply trying to determine whether SM3.0 is a factor that should be legitimately considered when making a purchasing decision on this generation of cards. To wit, does having SM3.0 confer any sort of tangible advantage such that one could say that having a SM3.0 card is a "must" over a card without the feature. To determine that I am gathering a list of SM3.0 enabled games currently available and forthcoming in the near future to allow people to answer the question on their own.

For those with agendas. Two cards, same price, same performance, one is SM2.0b, the other is SM3.0. Whether you know anything about SM2.0b and SM3.0 or not, why in the land of F**k would you buy the SM2.0b card?

That's like having two sport utility vehicles in front of you and you need to purchase one of them. They are identical for the most part except one has 2-wheel drive and the other has all-wheel drive. You tell the car salesman that you will take the two-wheel drive SUV because it's not snowing TODAY. The snow will fall eventually, Bar.

This is so simplistic it's child's play.

Because people know that the X800 whoops the 6800 in performance and image quality. And they also know SM 3.0 doesn't make an image quality difference, nor is there the power behind the cards to take advantage of it. So it's not so noobishly simple as you state it.

I can't make it any easier for you. 6xxx series cards were designed for SM3.0+.
We aren't talking about buying a FX5200Ultra with 256MB of RAM here. That card barely had enough power to push 128. However, the 6800's are powerhouses and should be able to have lasting SM3.0 performance for years to come.

You try to make it sound like we are all umbrella salesmen in the Sahara desert.
Noob, what you don't realize is that we all can see your X800pro card there in your sig and understand why you desperately need to defend it. Just to make you feel better, I have an X800XTPE on its way to me. Should get it sometime next week. My mind is not as closed as yours, and I am willing to try something different. You, on the other hand, will use your last breath on your death bed to say an X800 is better than ANYthing in the world. Even sex.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Bar81
Well, to add further to the issue at hand, here's an interesting quote from nvidia's financial report regarding the upcoming nvidia cards:

"Well, from an architecture standpoint we're just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you'll see is a little bit faster versions...

...I think you'll see the industry move up a little bit in performance. But I don't think you'll see any radical changes in architecture. I doubt you'll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we're still in the shader model three type era."

http://www.beyond3d.com/#news20937

I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Or even more so, what does this say about ATI's SM2.0b?

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: hans030390
I have a 9200....i want to defend it.....it beats any card to date, and it only costs $30 :p

Aaaaiiiiiiieeeee!!!!!!!!!!!!!!!!!!!!!


9200 is the card to have?!?!

<scrambles for credit card and neweggs web page>

;)
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Yeah i get a whole 25fps on HL2 with medium settings at 800x600

wow


this thread is pointless
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: Bar81
Well, to add further to the issue at hand, here's an interesting quote from nvidia's financial report regarding the upcoming nvidia cards:

"Well, from an architecture standpoint we're just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you'll see is a little bit faster versions...

...I think you'll see the industry move up a little bit in performance. But I don't think you'll see any radical changes in architecture. I doubt you'll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we're still in the shader model three type era."

http://www.beyond3d.com/#news20937

I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Or even more so, what does this say about ATI's SM2.0b?
that's a pretty eXtreme conclusion about "3.0b" drawn from the 6900 series being a LITTLE BIT faster than the current 6800 series.

expect only ATI to make big changes to their r520 core . . . nVidia will rely on a core and memory speed bump and SLI for their top cards ;)

the 6800gt/ultra will be fine for the next year or two in SM 3.0 apps. . . .

 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: keysplayr2003
Originally posted by: Noob
Originally posted by: keysplayr2003
Originally posted by: Bar81
Because many of you obviously failed English, I'm going to go over this AGAIN:


This thread is simply trying to determine whether SM3.0 is a factor that should be legitimately considered when making a purchasing decision on this generation of cards. To wit, does having SM3.0 confer any sort of tangible advantage such that one could say that having a SM3.0 card is a "must" over a card without the feature. To determine that I am gathering a list of SM3.0 enabled games currently available and forthcoming in the near future to allow people to answer the question on their own.

For those with agendas. Two cards, same price, same performance, one is SM2.0b, the other is SM3.0. Whether you know anything about SM2.0b and SM3.0 or not, why in the land of F**k would you buy the SM2.0b card?

That's like having two sport utility vehicles in front of you and you need to purchase one of them. They are identical for the most part except one has 2-wheel drive and the other has all-wheel drive. You tell the car salesman that you will take the two-wheel drive SUV because it's not snowing TODAY. The snow will fall eventually, Bar.

This is so simplistic it's child's play.

Because people know that the X800 whoops the 6800 in performance and image quality. And they also know SM 3.0 doesn't make an image quality difference, nor is there the power behind the cards to take advantage of it. So it's not so noobishly simple as you state it.

I can't make it any easier for you. 6xxx series cards were designed for SM3.0+.
We aren't talking about buying a FX5200Ultra with 256MB of RAM here. That card barely had enough power to push 128. However, the 6800's are powerhouses and should be able to have lasting SM3.0 performance for years to come.

You try to make it sound like we are all umbrella salesmen in the Sahara desert.
Noob, what you don't realize is that we all can see your X800pro card there in your sig and understand why you desperately need to defend it. Just to make you feel better, I have an X800XTPE on its way to me. Should get it sometime next week. My mind is not as closed as yours, and I am willing to try something different. You, on the other hand, will use your last breath on your death bed to say an X800 is better than ANYthing in the world. Even sex.

All I do is just give the facts. I had posted 3 or 4 benchmarks showing the X800's superiority over the 6800's. And saying that I think the X800 is better than sex is just a stupid comment. I can't think of any kind of smart-ass comment to say to that.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Rollo
Originally posted by: hans030390
I have a 9200....i want to defend it.....it beats any card to date, and it only costs $30 :p

Aaaaiiiiiiieeeee!!!!!!!!!!!!!!!!!!!!!


9200 is the card to have?!?!

<scrambles for credit card and neweggs web page>

;)

SEND ME THE LINK!!!!!!!!!!!!!!!!!!!!!!! ;)

 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: hans030390
uhhhhh what, were those benchmarks in ALL SM 2.0 games?? huh??? that's right!!!!

Actually it was Far Cry and HL2. All SM 3.0 games. "That's right!"

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
At least you finally admitted it's not 2002 tech. Now it's "mostly" 2002 tech. Heh.

"Mostly 2002 tech" that beats new 2004 single card tech? What does it matter when the "tech" was made? As shown in the first post, 3.0 is useless to a lot of people.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: Ackmed
At least you finally admitted it's not 2002 tech. Now it's "mostly" 2002 tech. Heh.

"Mostly 2002 tech" that beats new 2004 single card tech? What does it matter when the "tech" was made? As shown in the first post, 3.0 is useless to a lot of people.

Amen!

 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: keysplayr2003
Originally posted by: Bar81
Well, to add further to the issue at hand, here's an interesting quote from nvidia's financial report regarding the upcoming nvidia cards:

"Well, from an architecture standpoint we're just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you'll see is a little bit faster versions...

...I think you'll see the industry move up a little bit in performance. But I don't think you'll see any radical changes in architecture. I doubt you'll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we're still in the shader model three type era."

http://www.beyond3d.com/#news20937

I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Or even more so, what does this say about ATI's SM2.0b?


It doesn't say anything about SM2.0b, as it doesn't work with SM3.0; no change there. The thing we're trying to figure out is whether SM2.0 effects enabled through SM3.0 will result in such a huge performance hit that they will for all intents and purposes be rendered useless to the 6800 crowd, such that having SM3.0 in essence becomes a non-feature. That's why when I hear stuff like the quote from nvidia's conference call it makes me think that maybe nvidia knows this, but even if that were the case they sure as heck aren't going to admit it. Then again, it could be the case that the 6800 cards will be fine. *However*, evidence from Riddick and, allegedly, Splinter Cell CT has me concerned that the former, rather than the latter, scenario may become the reality of 6800 SM3.0 support.

I think it may help to understand my point if I repeat my analogy from earlier:
If X800 users have no legs (no SM3.0 support) and 6800 users have one leg (SM3.0 support that is essentially useless because it exacts such a performance hit that the enabled effects can't actually be used by 6800 owners), I fail to see how either of them is going to compete in the LA marathon (i.e. enable and play a game with advanced and demanding SM2.0 effects implemented through SM3.0).
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: Ackmed
What does it matter when the "tech" was made?

I think that's a valid point. Whether the tech is "new" or not is irrelevant. What is relevant is whether the tech is useful and enhances the gameplay experience in today's and the near future's games.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
What good is 32-bit, when they use 16-bit most of the time?

You can make an argument for either side with the SM3.0 stance. For ATi it obviously was not cost-effective to add it, and their core was mainly just an update. So they just waited a year or so till more games supported it, and more are in development. It can be said that there are hardly any games that support it, and the ones that do take a huge hit when graphically enhanced over 2.0.

NV on the other hand added it, and has helped games to advance faster in that aspect. Pushing forward is hardly ever a bad thing. It can also be said that games can look better with 3.0 over 2.0, and giving the user the option (sometimes) to choose between the two gives them more control.

To me, both can be "right". I would rather have companies pushing forward, though, as NV did. I would think it was easier for NV to add it than ATi. ATi is not "copying" NV by adding 3.0 in their next card. Obviously they would have added it without NV anyway. They both took different approaches, and any one of you may very well have made the same choice behind the wheel of the company.
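[Editor's note] Ackmed's "32-bit vs 16-bit" jab refers to shader float precision: earlier NVIDIA hardware often ran pixel shaders at reduced 16-bit precision for speed. As a rough illustration of what reduced precision costs, here is a small Python sketch (not GPU code; it just uses the standard library's IEEE 754 half- and single-precision pack formats to show the round-trip error):

```python
import struct

def roundtrip(value, fmt):
    """Pack a float at the given precision, then unpack it back."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 1.0 / 3.0
half = roundtrip(x, 'e')    # 'e' = IEEE 754 half precision (16-bit)
single = roundtrip(x, 'f')  # 'f' = IEEE 754 single precision (32-bit)

# The 16-bit representation is orders of magnitude less accurate,
# which is why long shader math chains at 16-bit could show banding.
print(abs(x - half))
print(abs(x - single))
```

Half precision keeps only about three decimal digits of mantissa, so errors that are invisible in a single operation can accumulate visibly across a long shader.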
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
Originally posted by: Ackmed
What good is 32-bit, when they use 16-bit most of the time?

You can make an argument for either side with the SM3.0 stance. For ATi it obviously was not cost-effective to add it, and their core was mainly just an update. So they just waited a year or so till more games supported it, and more are in development. It can be said that there are hardly any games that support it, and the ones that do take a huge hit when graphically enhanced over 2.0.

NV on the other hand added it, and has helped games to advance faster in that aspect. Pushing forward is hardly ever a bad thing. It can also be said that games can look better with 3.0 over 2.0, and giving the user the option (sometimes) to choose between the two gives them more control.

To me, both can be "right". I would rather have companies pushing forward, though, as NV did. I would think it was easier for NV to add it than ATi. ATi is not "copying" NV by adding 3.0 in their next card. Obviously they would have added it without NV anyway. They both took different approaches, and any one of you may very well have made the same choice behind the wheel of the company.

The truth is, though, that 3.0 doesn't make an image quality difference. And even if the X800 core is just a small update, it still outperforms the 6800.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: apoppin
Originally posted by: keysplayr2003
Originally posted by: Bar81
Well, to add further to the issue at hand, here's an interesting quote from nvidia's financial report regarding the upcoming nvidia cards:

"Well, from an architecture standpoint we're just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you'll see is a little bit faster versions...

...I think you'll see the industry move up a little bit in performance. But I don't think you'll see any radical changes in architecture. I doubt you'll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we're still in the shader model three type era."

http://www.beyond3d.com/#news20937

I think this could be potentially devastating news for SM3.0 on current cards. If nvidia is making what amounts to an SM3.0B spec for their next generation cards, what does that tell us about today's implementation of SM3.0 and its usefulness in upcoming games played with a 6800 series card?

Or even more so, what does this say about ATI's SM2.0b?
that's a pretty eXtreme conclusion about "3.0b" drawn from the 6900 series being a LITTLE BIT faster than the current 6800 series.

expect only ATI to make big changes to their r520 core . . . nVidia will rely on a core and memory speed bump and SLI for their top cards ;)

the 6800gt/ultra will be fine for the next year or two in SM 3.0 apps. . . .


I can't read the guy's mind but that's what, to me, it seems he's saying. Maybe he's just being obtuse, I couldn't tell you, but the way I'm reading it, I wouldn't like what he was saying if I were a 6800 user.

If what he wanted to say was that by the time the refreshes come developers will have learned to code more efficiently in SM3.0, I would expect he would say that. Except he doesn't; he says you'll see faster versions. That makes me think a revision to the implementation or spec of SM3.0 is being contemplated. Of course, he may just not know how to properly word his thoughts, but going off what he's saying, I'm inclined to believe that revisions are coming.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Noob
Originally posted by: Ackmed
What good is 32-bit, when they use 16-bit most of the time?

You can make an argument for either side, with the SM3.0 stance. For ATi it obviously was not cost productive to add it, and their core was mainly just an update. So they just waited a year or so till more games supported it, and more are in development. It can be said that there are hardly any games that support it, and the ones that do, take a huge hit when graphically enhanced over 2.0.

NV on the other hand added it, and have helped games to advance faster in that aspect. Pushing forward is hardly ever a bad thing. It can also be said, that games can look better with 3.0, over 2.0, and giving the user the option (sometimes) to choose between the two gives them more control.

To me, both can be "right". I would rather have companies pushing forward though, as NV did. I would think it was easier for NV to add it, than ATi. ATi is not "copying" NV by adding 3.0 in their next card. Obviously they would add it without NV anyways. They both took different approaches, and any one of you may have very well made the same choice being behind the wheel of the company.

The truth is, though, that 3.0 doesn't make an image quality difference. And even if the X800 core is just a small update, it still outperforms the 6800.

ho ho ho

both cores are good. Nvidia had to radically change the way they think because there was no way they could have derived NV40 from NV3x, simply no way. NV40 was a completely new approach.

r420 was a doubling up, and optimization, of the tried and tested R300 architecture... nothing wrong in that, but it's not gonna have the life span.

SM3 doesn't make too much of an image quality difference, no, in fact probably zero, unless you count Far Cry's HDR. All SM3 does is make it supposedly "faster" to perform the same operations as SM2 can, and add a few things SM2 can't do. The only way IQ can benefit is when SM3 makes it possible to execute a shader at a fast enough speed to be playable, where SM2 would have been too slow to execute the same thing.

nvidia definitely has ATi on features, ATi definitely has Nvidia when it comes to AA and AF, and in my experience of having both ATi and Nvidia cards I can say that IQ is just the same.

Nvidia had to pull something out of the bag to make up for the NV3x disaster, which they have: they released a beast of a card, which is the most feature-rich too. If they're gonna start from scratch they may as well get in all the newest tech they can; that way they can gain more understanding of it, and in future revisions make it better.

ATi on the other hand had a great design (from ArtX?) with the R300. They had the best products all the way up to the 9800XT; there was no need for them to put a lot of time and effort into a new design incorporating all the new features... it wouldn't have made sense. For them it was definitely better to just take what they had, tune it up and make it faster. That way they could deliver all the performance needed, with minimal cost in R&D. It was a wise choice to leave SM3 out of R420; it's not really needed by any games out now.

nvidia were starting from a blank sheet, so while they were at it, they might as well cram it full of features, simple as that
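[Editor's note] The "faster to perform the same operations" claim is about SM3.0's dynamic flow control: SM2.0-class hardware typically evaluates both sides of a conditional for every pixel and masks the result, while SM3.0 can actually skip the untaken path. A toy Python model of that difference (this is not shader code, and the per-path op costs are invented numbers purely for illustration):

```python
def shade_sm2_style(pixels):
    """No dynamic branching: every pixel pays for the expensive path,
    and a select picks the result afterwards (predication)."""
    ops, out = 0, []
    for lit, base in pixels:
        expensive = base * 2 + 1   # always evaluated, lit or not
        ops += 10                  # pretend the lit path costs 10 ops
        out.append(expensive if lit else base)
    return out, ops

def shade_sm3_style(pixels):
    """Dynamic branching: unlit pixels skip the expensive path entirely."""
    ops, out = 0, []
    for lit, base in pixels:
        if lit:
            out.append(base * 2 + 1)
            ops += 10
        else:
            out.append(base)
            ops += 1               # just the cost of the branch test
    return out, ops

# 100 pixels, only a quarter of them take the expensive lit path.
pixels = [(i % 4 == 0, float(i)) for i in range(100)]
image_a, ops_sm2 = shade_sm2_style(pixels)
image_b, ops_sm3 = shade_sm3_style(pixels)
print(image_a == image_b)  # identical image either way
print(ops_sm2, ops_sm3)    # the branching version does far less work
```

Both versions produce the same pixels; the branching version just does a fraction of the arithmetic when most pixels take the cheap path, which is the sense in which SM3.0 could make an otherwise too-slow effect playable.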
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Unfortunately, I think that's the way he "speaks" English ;) I guess what I was trying to say is maybe he's being intentionally vague for some reason. I just don't like the way he said it, but then I might be getting all excited over nothing. I just wish nvidia would clear up the whole SM3.0 and game issue so we wouldn't have to speculate.
 

imported_Noob

Senior member
Dec 4, 2004
812
0
0
By that time the games will be so demanding that they will require much higher-end cards. Every year a card comes out that is 2x faster. But it feels like every 6 months because of the time it takes for the card to become available to the public, and not just the computer manufacturers.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
At least you finally admitted it's not 2002 tech. Now it's "mostly" 2002 tech. Heh.

"Mostly 2002 tech" that beats new 2004 single card tech? What does it matter when the "tech" was made? As shown in the first post, 3.0 is useless to a lot of people.

It matters because most people (not me) who buy $400+ cards keep them a year or more.

There are already a handful of SM3 games out, and some games that utilize nV40 only features. There will be more coming out during that year.

The framerate advantage the X800 cards enjoy in some games is not significant, i.e. there is no "you can play at 16X12 on a X800, but not on a 6800U" going on. If the difference in framerate doesn't allow for a significantly different gaming experience, I don't feel it outweighs the ability to use the new features game designers are putting in to new games. (or the depreciation that is going to happen when ATI launches their own SM3/SLI capable design, soon)

That's just me though. I'd think most people would want to avoid hardware that is becoming obsolete.