
[NVIDIA GeForce 6800 Series Updates]


Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: rbV5
Originally posted by: Jeff7181
Originally posted by: rbV5
Originally posted by: Jeff7181
Originally posted by: BFG10K
SM 3.0... displacement mapping could add A LOT of realism to games
Displacement mapping doesn't require SM 3.0.

Show me displacement mapping in SM 2.0... do it... now! ;) See my point?

Check this out
Hardware trilinear displacement mapping without tessellator and vertex texturing

So ATI plans on implementing this method of displacement mapping instead of using Vertex Shader 3.0 in the x800 series? Great... I'll look forward to seeing displacement mapping in upcoming games. Would you be so kind as to reply to a PM from me in 6 months and tell me which games ATI is able to do displacement mapping in?

Implementing? The demo ran on my 9700; it should run on a 5200 as well. You asked to see displacement mapping in SM 2.0, there you go... whoopdeedoo. If SM 3.0 is a big deal that I just have to have, I'll get a card that supports it; it's not like they won't be available.

Yeah... I edited my post. When ATI can showcase a game that uses displacement mapping without Vertex Shader 3.0, then I'll take that off my list of advantages the 6800 holds over the x800.
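
(Side note for anyone following along: displacement mapping actually moves vertices along their normals by a height read from a texture, unlike bump/normal mapping, which only fakes the lighting. Below is a minimal CPU-side sketch of the idea; the names are hypothetical and this is not code from either vendor's demo. On real hardware this runs per-vertex on the GPU, via vertex texture fetch in SM 3.0 or pre-tessellated geometry in SM 2.0-era approaches.)

#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

struct Vertex {
    Vec3 position;
    Vec3 normal;   // assumed to be unit length
    float u, v;    // texture coordinates into the height map
};

struct HeightMap {
    int width, height;
    std::vector<float> texels; // grayscale values in [0, 1]
    // Nearest-neighbor sample; a real sampler would filter (bilinear/trilinear).
    float sample(float u, float v) const {
        int x = static_cast<int>(u * (width - 1));
        int y = static_cast<int>(v * (height - 1));
        return texels[static_cast<std::size_t>(y) * width + x];
    }
};

// Push every vertex along its normal by the sampled height, scaled.
void displace(std::vector<Vertex>& mesh, const HeightMap& map, float scale) {
    for (Vertex& vtx : mesh) {
        float h = map.sample(vtx.u, vtx.v) * scale;
        vtx.position.x += vtx.normal.x * h;
        vtx.position.y += vtx.normal.y * h;
        vtx.position.z += vtx.normal.z * h;
    }
}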
 

SilverTrine

Senior member
May 27, 2003
312
0
0
So you support ATI's decision to support SM 2.0 two years ago when there were no games that used it until two years later... but you don't support nVidia's decision to support SM 3.0 now when there might not be any games that use it until two years from now? That's rational...

It would actually be more realistic if it was called SM 2.2 instead of SM 3.0; Nvidia zombies lucked out with the naming of it.

Now, SM 1.0 to SM 2.0, that was a legitimate 1.0 leap.
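
(One concrete difference behind the "2.2 vs. 3.0" naming argument: SM 2.0 pixel shaders have no true dynamic branching, so both sides of a condition get evaluated and masked, while SM 3.0 can branch per pixel and skip work. A rough C++ analogy of the two styles is sketched below; the lighting function and constants are made up for illustration, and real shaders would be HLSL running on the GPU.)

#include <cmath>

// Stand-in for an expensive per-pixel computation, e.g. a specular term.
float expensiveLighting(float nDotL) { return std::pow(nDotL, 32.0f); }

// SM 2.0 style: no dynamic branching, so the expensive term is always
// computed and then masked away where it isn't wanted.
float shadePS20(float nDotL, bool inShadow) {
    float spec = expensiveLighting(nDotL); // always paid for
    float mask = inShadow ? 0.0f : 1.0f;   // compare + select, not a jump
    return nDotL + spec * mask;
}

// SM 3.0 style: a real per-pixel branch can skip the expensive term
// entirely (on the GPU this only pays off when nearby pixels agree).
float shadePS30(float nDotL, bool inShadow) {
    if (inShadow)
        return nDotL;                      // early out, specular skipped
    return nDotL + expensiveLighting(nDotL);
}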
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Jeff7181
Personally, all I want from a video card is for it to be "fast". You want it to make coffee or something?
Seriously though... let's say you're buying an economy car... the dealer says he'll throw in a set of Z-rated tires, a dealer-installed body kit, and XM Radio. Are you going to say "no, take the Z-rated tires off 'cause I don't plan on going over 80 mph and put the stock tires on, take the body kit off 'cause I'll never go fast enough to take advantage of the better aerodynamics, and take the XM Radio out since I only listen to my CDs in the car?"

How logical is it to refuse purchasing something because it has too many additional features?

Very, if the extra features add significantly to the price, or hurt other aspects of performance, and I'm not going to use them. The situation here is not one of exactly equal price and performance, with NVIDIA having extra features. If it was, of COURSE I would want the card with more features.

Specifically, how logical is it for you to say "SM 3.0 isn't NECESSARY" ... hey, if it gives me a 2% increase in frame rates over SM 2.0, I'll take it.
How logical is it for you to say "I don't encode video, so the video encoding capabilities don't matter?" It's there... if you EVER need to use it, you'll be able to... that won't be the case with an x800.

My position is not "I don't want features." My position is "I don't want to trade useless features for a worse price and performance." How much cheaper/cooler/higher clocked do you think the 6800 would be without SM3.0 and its video encoder chip? These things are never "free".

How logical is it for you to say "We don't know what effect displacement mapping has on performrance, so I don't want it?" That's not even 100% true... there's a video of Far Cry being played with displacement mapping, and it looks pretty fluid to me... granted that was ONE scenario that might not be indicative of performance throughout the entire game... but the feature is there... if you ever wanted to use it...

I haven't seen any numbers on what kind of performance hit hardware displacement mapping causes. I've seen a video of an NVIDIA demo, but it's impossible to judge anything from that, and the examples they used were very simple. The feature being there is meaningless if turning it on makes the game unplayably slow -- the FX5200 supports FP32 SM2.0 effects, but do you honestly think any game that heavily uses SM2.0 is going to run acceptably on an FX5200? I'm not saying it's the same situation here, but until I see some numbers, I'm not going to just blindly accept it.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: SilverTrine
So you support ATI's decision to support SM 2.0 two years ago when there were no games that used it until two years later... but you don't support nVidia's decision to support SM 3.0 now when there might not be any games that use it until two years from now? That's rational...

It would actually be more realisitc if it was called Sm2.2 than 3.0, Nvidia zombies lucked out with the naming of it.

Now Sm1.0 to Sm2.0 that was a legitimate 1.0 leap.

I agree, but I don't think 5 AT Forum members are going to change the entire industry's mind about what the version number should be.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
When ATI can showcase a game that uses displacement mapping without Vertex Shader 3.0, then I'll take that off my list of advantages the 6800 holds over the x800.

Wouldn't the 6800 still have the same advantage of SM 3.0 support regardless of whether a game developer implements displacement mapping using a different shader model? You just talk in circles. ATI and NV both support SM 2.0... remember, if a developer implements displacement mapping in a game using a different approach, it benefits those cards also. Or are we just tossing out everything <6800?
 

SilverTrine

Senior member
May 27, 2003
312
0
0
I'd be interested to see the source for that information. The last HL2 benchmark I saw, they were right alongside each other.

That was a beta benchmark, and it's been speculated that the 6800 may even default to DX8 mode in it, because that's the default Nvidia mode; the GeForce FX line was so poor in pixel shaders that it couldn't handle the full DX9 mode.

Trust me, if you're buying the 6800 line because you think it'll perform well in pixel-shader-heavy games like Half-Life 2, you will be sorely disappointed when the 'lowly' x800pro clobbers the 6800u in it.
After all, you don't think Anand was kidding when he wrote this, did you?

The bottom line is that R420 has the potential to execute more PS 2.0 instructions per clock than NVIDIA in the pixel pipeline because of the way it handles texturing.

The high clock speeds of the x800 line, and the architectural advantages in shader-based games that Anand talks about, will mean that the x800 line absolutely dominates in pixel shader games.
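
(For what the clock-speed argument is worth on paper: peak shader throughput is roughly pipes × ops per pipe per clock × clock. A back-of-envelope sketch below, assuming the commonly cited launch specs of 16 pixel pipes on both parts, ~520 MHz for the X800 XT and ~400 MHz for the 6800 Ultra. The ops-per-pipe figure is a deliberate placeholder, since real per-clock throughput depends on co-issue and texturing behavior; that unknown is exactly why Anand hedged with "has the potential to".)

#include <cstdio>

// Peak shader throughput, very crudely: pipes * ops/clock/pipe * clock.
double peakShaderOps(int pipes, double opsPerPipePerClock, double clockMHz) {
    return pipes * opsPerPipePerClock * clockMHz * 1e6; // ops per second
}

int main() {
    const double opsPerPipe = 1.0; // placeholder; the contested quantity
    double x800xt  = peakShaderOps(16, opsPerPipe, 520.0); // commonly cited clock
    double gf6800u = peakShaderOps(16, opsPerPipe, 400.0); // commonly cited clock
    std::printf("X800 XT    : %.1f Gops/s\n", x800xt / 1e9);
    std::printf("6800 Ultra : %.1f Gops/s\n", gf6800u / 1e9);
    std::printf("clock advantage alone: %.0f%%\n",
                (x800xt / gf6800u - 1.0) * 100.0);
    return 0;
}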
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Matthias99, I think what we can agree on is that each person's decision should be based on their own individual needs and wants. What's good for the goose is not necessarily good for the gander. To each his own. One man's trash is another man's treasure. Ok... so that one doesn't fit as well :D

Don't know if you checked out that displacement mapping demo that was linked to... but that method takes quite a large performance hit when going from the lowest LOD setting to the highest. The question I find myself asking is how displacement mapping done in VS 3.0 compares in terms of quality and speed. In that demo, I switched to fullscreen and set the LOD high enough that I couldn't see the displacement mapping being performed before my eyes as I moved. I was getting 28 frames per second. In a demo that's ONLY demonstrating that... it doesn't look like that method of displacement mapping is very efficient. So maybe that's the reason it's never been used... maybe VS 3.0 is much more efficient at performing displacement mapping.
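
(The hit Jeff describes is what you'd expect from uniform subdivision: every LOD step splits each triangle into four, so the vertex workload grows geometrically. A quick sketch of the raw numbers; the base mesh size is made up.)

#include <cstdio>

int main() {
    long long triangles = 2000; // hypothetical base mesh size
    for (int lod = 0; lod <= 5; ++lod) {
        std::printf("LOD %d: %lld triangles\n", lod, triangles);
        triangles *= 4; // each triangle splits into 4 per subdivision level
    }
    return 0;
}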
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: rbV5
When ATI can showcase a game that uses displacement mapping without Vertex Shader 3.0, then I'll take that off my list of advantages the 6800 holds over the x800.

Wouldn't the 6800 still have the same advantage of SM 3.0 support regardless of whether a game developer implements displacement mapping using a different shader model? You just talk in circles. ATI and NV both support SM 2.0... remember, if a developer implements displacement mapping in a game using a different approach, it benefits those cards also. Or are we just tossing out everything <6800?

That = displacement mapping ... so to answer your question, no. I don't care about "SM 3.0," I care about the features included in it, displacement mapping being the most significant in my opinion.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: SilverTrine
That was a beta benchmark, and it's been speculated that the 6800 may even default to DX8 mode in it, because that's the default Nvidia mode; the GeForce FX line was so poor in pixel shaders that it couldn't handle the full DX9 mode.

That's odd... as far as I know, my GeForce FX5900 isn't running in "DX8 mode" at default settings.

Trust me, if you're buying the 6800 line because you think it'll perform well in pixel-shader-heavy games like Half-Life 2, you will be sorely disappointed when the 'lowly' x800pro clobbers the 6800u in it.
After all, you don't think Anand was kidding when he wrote this, did you?

Why do you think I'd buy a 6800 because I think it'll perform well in pixel shader heavy games like Half Life 2? I think now you're just assuming what you want to assume so you can join in the argument against me.

The high clock speeds of the x800 line, and the architectural advantages in shader-based games that Anand talks about, will mean that the x800 line absolutely dominates in pixel shader games.

Absolutely dominate, huh? Are you absolutely sure? So sure that you'd bet money on it?

Maybe you should re-read that quote...

The bottom line is that R420 has the potential to execute more PS 2.0 instructions per clock than NVIDIA in the pixel pipeline because of the way it handles texturing.

"Has the potential to" doesn't mean it will 100% of the time in every case. Even if it did, if it won't do displacement mapping in a particular game I play, and another card will, the fact that it does more pixel shader instructions doesn't mean anything.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Correct me if I'm wrong... but isn't ATI's Truform VERY similar to displacement mapping? Granted, it doesn't use texture detail to create depth, but it does create more polygons in order to smooth a curve... which is the same thing displacement mapping does... create more polygons and displace them to create depth in the model.
Again, correct me if I'm wrong, but doesn't Truform create a HUGE performance hit? That would lead one to believe that it takes quite a lot of power to perform... or... it takes specially designed hardware to perform without a performance hit.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: Jeff7181
Correct me if I'm wrong... but isn't ATI's Truform VERY similar to displacement mapping? Granted, it doesn't use texture detail to create depth, but it does create more polygons in order to smooth a curve... which is the same thing displacement mapping does... create more polygons and displace them to create depth in the model.
Again, correct me if I'm wrong, but doesn't Truform create a HUGE performance hit? That would lead one to believe that it takes quite a lot of power to perform... or... it takes specially designed hardware to perform without a performance hit.

Truform is a form of displacement mapping (N-patches), and it does take a hit when performed in software. ATI dropped hardware support after the R200.
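
(For reference, the N-patch idea rbV5 mentions: split each triangle into smaller ones and bow the new vertices outward using the corner normals, so flat geometry gets a rounded silhouette. Below is a simplified sketch of just the curved edge-midpoint step from PN triangles; real N-patches fit a full cubic Bezier patch per triangle, and the names here are illustrative.)

struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Curved midpoint of a triangle edge, PN-triangle style: start from the
// flat midpoint, then pull it off the plane using each endpoint's normal.
// p0, p1 are the edge endpoints; n0, n1 their unit normals.
Vec3 curvedEdgeMidpoint(Vec3 p0, Vec3 n0, Vec3 p1, Vec3 n1) {
    Vec3 mid = (p0 + p1) * 0.5f;
    // How far the flat edge leaves each endpoint's tangent plane:
    float w0 = dot(p1 - p0, n0);
    float w1 = dot(p0 - p1, n1);
    // The 1/8 factor is the cubic Bezier midpoint of a PN-triangle edge;
    // for a convex mesh this bows the edge outward.
    return mid - (n0 * w0 + n1 * w1) * (1.0f / 8.0f);
}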
 

SilverTrine

Senior member
May 27, 2003
312
0
0
"Has the potential to" doesn't mean it will 100% of the time in every case. Even if it did, if it won't do displacement mapping in a particular game I play, and another card will, the fact that it does more pixel shader instructions doesn't mean anything.

Anand has to play politics; this is about as definite a statement as I've ever seen from Anand's review team. Anyone with any sense who is capable of understanding the architectures of the x800 line and the 6800 line knows that the x800 will perform much better in pixel shader games.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Have we seen displacement mapping featured in any upcoming game? I'm curious now.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Pete
Have we seen displacement mapping featured in any upcoming game? I'm curious now.

Far Cry "can" support it with a patch I believe. There's a video of a person playing Far Cry at what I think is an "nVidia booth" at some kind of store... so it is possible... but I don't think if you buy a 6800 right now it'll do displacement mapping in Far Cry. Far Cry needs a patch I believe, and possibly a new driver set for the 6800.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: SilverTrine
"Has the potential to" doesn't mean it will 100% of the time in every case. Even if it did, if it won't do displacement mapping in a particular game I play, and another card will, the fact that it does more pixel shader instructions doesn't mean anything.

Anand has to play politics; this is about as definite a statement as I've ever seen from Anand's review team. Anyone with any sense who is capable of understanding the architectures of the x800 line and the 6800 line knows that the x800 will perform much better in pixel shader games.

If you read that much into that statement... then you could read into this statement...

This also indicates that performance gains due to compiler optimizations could be in NV40's future.

... and say Anand believes with driver optimizations the 6800 will gain a clear advantage over the x800.

*EDIT*
That was a beta benchmark

Yes, it was a benchmark of a game in beta (or maybe even pre-beta) using beta drivers with unreleased cards. So what? It's a valid test when it illustrates your point, but it's invalid when it doesn't show what you want it to?
 

SilverTrine

Senior member
May 27, 2003
312
0
0
... and say Anand believes with driver optimizations the 6800 will gain a clear advantage over the x800.

You just proved yourself a fanboi more intent on spreading FUD than debating facts. The x800xt will be much faster than the 6800u in Half-Life 2 at the same level of detail.
I'm willing to put my money where my mouth is and make a PayPal bet on that; you are not, because you know you're wrong. If you do decide to put your money where your mouth is, I'll be happy to PayPal money to an honest long-term member who will hold it until the release of Anand's Half-Life 2 benchmarks, and you can do the same.
 

413xram

Member
May 5, 2004
197
0
0
From what I read and understand, you can supposedly test the 6800 GT at one of the nVidia gaming centers. I'm just forwarding the message that was emailed to me. Do not kill the messenger.
 

DAPUNISHER

CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,056
32,578
146
Originally posted by: 413xram
From what I read and understand, you can supposedly test the 6800 GT at one of the nVidia gaming centers. I'm just forwarding the message that was emailed to me. Do not kill the messenger.
I got that too. Of course, the closest one to me is 5+ hours away.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: SilverTrine
... and say Anand believes with driver optimizations the 6800 will gain a clear advantage over the x800.

You just proved yourself a fanboi more intent on spreading FUD than debating facts. The x800xt will be much faster than the 6800u in Half-Life 2 at the same level of detail.
I'm willing to put my money where my mouth is and make a PayPal bet on that; you are not, because you know you're wrong. If you do decide to put your money where your mouth is, I'll be happy to PayPal money to an honest long-term member who will hold it until the release of Anand's Half-Life 2 benchmarks, and you can do the same.

I won't bet any money on it because I'm poor and would rather spend the little money I do have on more useful things. As for being wrong... you obviously misinterpreted that statement.

This whole debate started with me pointing out how silly it is for ATI owners to say how useless SM 3.0 is...

Facts:
  • There are no games that use SM 3.0 at the moment.
  • Current hardware is not fast enough to fully utilize ALL of SM 3.0's features to their full potential.
  • Two years ago, there were no games that used SM 2.0.
  • Two years ago, current hardware was not fast enough to utilize ALL of SM 2.0's features to their full potential.

Two years ago, it was a big deal that ATI supported a technology that was useless on the current hardware... yet somehow today, when it's nVidia that's supporting a technology that may prove to be useless on current hardware, it's no big deal.

Yes, the leap from SM 1.x to SM 2.x was bigger than the one from SM 2.x to SM 3.x. I'm not arguing that... I'm just pointing out the irony of it all... ATI was great for supporting SM 2.0... and now nVidia is teh suxxors for supporting SM 3.0. Make up your mind... do you want new technology or not?
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: 413xram
From what I read and understand, you can supposedly test the 6800 GT at one of the nVidia gaming centers. I'm just forwarding the message that was emailed to me. Do not kill the messenger.

You made it sound as if the e-mail was telling you "don't buy our products, just go to these places and use them."
 

413xram

Member
May 5, 2004
197
0
0
No, not at all. Just getting a little frustrated. Here I am, willing to pay top dollar to purchase the card, in excess of 600 dollars, and I get this email telling me where I can go play with it. This is not the early '80s, when I had to go to the arcade to play Galaga or Centipede; I'm a little too old for that now.

By the way, is the Voodoo 6000 ready for release? LOL