
[NVIDIA GeForce 6800 Series Updates]


413xram

Member
May 5, 2004
197
0
0
By the way, I agree totally with you on your stance on EA. They ruined my online game, Air Warrior III. Jerk offs!
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: 413xram
By the way, I agree totally with you on your stance on EA. They ruined my online game, Air Warrior III. Jerk offs!

We'll see if they redeem themselves with Battlefield 2... looks pretty nice in the July CGW Magazine.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Jeff7181
Originally posted by: Pete
Have we seen displacement mapping featured in any upcoming game? I'm curious now.

Far Cry "can" support it with a patch I believe. There's a video of a person playing Far Cry at what I think is an "nVidia booth" at some kind of store... so it is possible... but I don't think if you buy a 6800 right now it'll do displacement mapping in Far Cry. Far Cry needs a patch I believe, and possibly a new driver set for the 6800.

the video is from a demo setup at the nvidia booth at e3 which was playable by the public.
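For anyone wondering what displacement mapping actually does: unlike bump or normal mapping, it moves real geometry, offsetting each vertex along its normal by a height sampled from a texture (SM3.0 makes this practical in hardware via vertex texture fetch). Here's a minimal CPU-side sketch of the math in Python; the mesh, heightmap, and scale factor are invented for illustration:

```python
import numpy as np

def displace(vertices, normals, uvs, heightmap, scale=0.1):
    """Offset each vertex along its normal by a height sampled from a
    grayscale heightmap -- the core operation of displacement mapping.
    (SM3.0 hardware can do this per-vertex on the GPU via vertex
    texture fetch; this CPU version only illustrates the math.)"""
    h, w = heightmap.shape
    # Sample the heightmap at each vertex's UV coordinate
    # (nearest-neighbour for brevity; real shaders filter).
    px = np.clip((uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    py = np.clip((uvs[:, 1] * (h - 1)).astype(int), 0, h - 1)
    heights = heightmap[py, px]
    # Real geometry moves -- unlike bump/normal mapping, the
    # object's silhouette actually changes.
    return vertices + normals * heights[:, None] * scale

# Toy data: two vertices on a flat patch with upward normals.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
norms = np.array([[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]])
uvs   = np.array([[0.0, 0.0], [1.0, 1.0]])
hmap  = np.random.rand(64, 64).astype(np.float32)
print(displace(verts, norms, uvs, hmap))
```

Because the vertices themselves move, the effect needs engine-side support (hence the Far Cry patch mentioned above) rather than being a driver-only toggle.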
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: SilverTrine
Yeah, eat the PR hook, line and sinker, and ignore the facts of the situation: Anand, among others, has pointed out that the 6800u is the old-style GPU design and will be far inferior in pixel shader games.
See, what neophytes don't understand is that SM3.0 is just a featureset; the X800 architecture is designed from the ground up to do well in pixel shader games, the 6800u is not.

If people buy the 6800 because they think the SM3.0 featureset will give them greater performance in pixel shader games, they'll be sorely upset when Half-Life 2 comes out and the X800 gets 25-40% better performance.

you're funny :)
 

ChkSix

Member
May 5, 2004
192
0
0
Originally posted by: SilverTrine
The x800xt will be much faster than the 6800u in Half-Life 2 at the same level of detail.

Not according to Xbit Labs' early reviews of both the 6800 and X800, which show them neck and neck throughout all resolutions (3Dc aside) when running the HL2 benchmark. OK, it wasn't the XT, but then again you're speculating and so am I. And by the way, HL2 is one game that, in my own opinion, takes a beating from the more visually gorgeous game known as S.T.A.L.K.E.R.

Originally posted by: SilverTrine
See, what neophytes don't understand is that SM3.0 is just a featureset; the X800 architecture is designed from the ground up to do well in pixel shader games, the 6800u is not.

Where did you get this information? According to Anandtech's initial review, the ATi card is never more than 10 fps ahead of the 6800 in ANY test when it does take the lead. Xbit, which tests the cards in many more modern gaming benchmarks, has similar findings to Anandtech's, by the way. And after what has been uncovered recently regarding ATi's filtering techniques, I wouldn't even bet on this being a solid case, or on seeing any further 'significant' leads down the road. Some are still thinking Nvidia has an NV3x in its pocket, and that is a sad assumption to make.

And if you think you're going to notice a 5-10 fps difference when a game is already running at 32+ fps in 1600x1200, you're sadly mistaken.
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
All you ATI fans are very funny: just because the latest ATI video cards are STILL using SM2.0 (how old-fashioned!), you keep putting SM3.0 down. I bet that if the latest ATI video cards also had SM3.0, none of you would be so quick to call it pointless!

Don't you realise just how pathetic that makes all of you ATI fans look? Oh, we don't have SM3.0, so it must be useless. Yeah, right! :laugh:
 

ChkSix

Member
May 5, 2004
192
0
0
Again, if it were so useless, ATi would not be looking to implement it in the R500. Go figure. This might come out as a 'fanboy' comment, but it looks to me like ATI had to first see it implemented correctly so that they could go ahead and 'dissect' the competitor's card to figure out how to do it themselves. By the way, this is common practice for every business around the globe. Don't think there is an exception here.

The truth of the matter is that they couldn't implement SM3.0 in the R420 because it simply does not have the power to do it.

Six different clock speeds on the same card given to different reviewers says what to you? A coincidence, or an attempt to take a lead through a simple modification? I might agree with those who side with ATi so strongly if the performance were a landslide difference over the NV40. However, that isn't even the reality of the situation, and it isn't worth my hard-earned dollars, not this time around.

But whatever floats your boat is good enough for me. I never said the X800 wasn't a very good card, it just isn't the right card for me and certain others here. Hence the debate rages on.

Hehe
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Jeff7181
This whole debate started with me pointing out how silly it is for ATI owners to say how useless SM 3.0 is...

Facts:
  • There are no current games that use SM 3.0 at the moment.
  • Current hardware is not fast enough to fully utilize ALL of SM 3.0's features to their full potential.
  • Two years ago there were no games that used SM 2.0.

Okay. The problem comes in here:

  • Two years ago current hardware was not fast enough to utilize ALL of SM 2.0's features to their full potential.

I disagree with this statement. The RADEON 9700Pro/9800/9800Pro and NVIDIA 5900 cards, at least, have enough horsepower to run Far Cry (a game which uses SM2.0 shaders moderately), and are supposed to be good enough to run Doom3, HL2, and similar games making heavy use of SM2.0 (although perhaps not at super-high resolutions and with all the quality features cranked up). But this is always the case with new features -- you could say that the GeForce4 and R8500 weren't really fast enough to use all of DX8.1's features to their full potential, either.

Two years ago, it was a big deal that ATI supported a technology that was useless on the current hardware... yet somehow today, when it's nVidia that's supporting a technology that may prove to be useless on current hardware, it's no big deal.

IMO, ATI put out a usable implementation of SM2.0 with the R350, which was (and still is) very clearly the direction that the gaming industry is moving in. However, they got hosed by the fact that the two major titles that were supposed to be using it have been repeatedly delayed. NVIDIA, on the other hand, has put out a card which seems to focus on SM3.0 -- the benefits of which over SM2.0 still seem awfully unclear at this point -- and possibly done so at the expense of SM2.0 performance (which is now starting to become increasingly important, and which will continue to be important for some time).

Is it possible that NVIDIA has, in fact, made the right choice, and I'm completely wrong? Yes. But, in my opinion, they screwed up. We'll just have to see what happens.
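For what it's worth, the concrete SM3.0-over-SM2.0 benefit being debated here is mostly dynamic flow control in pixel shaders: a pixel can skip work it doesn't need, where an SM2.0 shader typically evaluates every path and masks out the unused results. A back-of-the-envelope sketch in Python, with invented cost numbers, shows why the win is conditional rather than automatic:

```python
# Hypothetical per-pixel work for shading under N lights. The cost
# numbers are invented for illustration, not measured on any GPU.
N_LIGHTS = 8
ACTIVE = 2          # lights that actually reach this pixel
COST_LIGHT = 10     # ALU ops to evaluate one light
COST_BRANCH = 6     # per-light overhead of a dynamic branch

# SM2.0-style shader: no dynamic branching, so every light is
# evaluated and the irrelevant results are masked out.
sm20_ops = N_LIGHTS * COST_LIGHT

# SM3.0-style shader: pay a branch per light, but only evaluate
# the lights that matter for this pixel.
sm30_ops = N_LIGHTS * COST_BRANCH + ACTIVE * COST_LIGHT

print(f"SM2.0 flattened: {sm20_ops} ops")   # 80
print(f"SM3.0 branching: {sm30_ops} ops")   # 68

# Whether branching wins depends on the branch cost and on how
# divergent neighbouring pixels are -- which is why "supports SM3.0"
# by itself says little about real-game performance.
```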
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: ChkSix
Again, if it were so useless, ATi would not be looking to implement it in the R500. Go figure. This might come out as a 'fanboy' comment, but it looks to me like ATI had to first see it implemented correctly so that they could go ahead and 'dissect' the competitor's card to figure out how to do it themselves. By the way, this is common practice for every business around the globe. Don't think there is an exception here.

The truth of the matter is that they couldn't implement SM3.0 in the R420 because it simply does not have the power to do it.

Again, it couldn't possibly be that they COULD have done it, but it would have increased heat and power consumption, and possibly compromised SM2.0 performance, for no appreciable gains in the next 12 months? It MUST be that they're inept (since they need to copy NVIDIA's design) and their hardware design sucks... not that they think they can better spend their R&D money on R500, which will actually be fast enough to make use of SM3.0.

I don't think SM3.0 is bad or 'useless'; I think NVIDIA's implementation of it is premature, and the current generation of cards won't really be able to reap its benefits. This is, again, just my opinion, and time will tell, I'm sure.

Originally posted by: ChkSix
Six different clock speeds on the same card given to different reviewers says what to you? A coincidence, or an attempt to take a lead through a simple modification?

Or could it just be prerelease hardware? And what about the 61.11 NVIDIA drivers (you know, the ones they sent certain sites for benchmarking) apparently forcing brilinear on even when you 'disable' it? Coincidence, or an attempt to make their card's performance look better?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: SilverTrine
... and say Anand believes with driver optimizations the 6800 will gain a clear advantage over the x800.

You just proved yourself a fanboi more intent on spreading FUD than debating facts. The x800xt will be much faster than the 6800u in Half-Life 2 at the same level of detail.
I'm willing to put my money where my mouth is and make a PayPal bet on that; you are not, because you know you're wrong. If you do decide to put your money where your mouth is, I'll be happy to PayPal the money to an honest long-term member who will hold it until the release of Anand's Half-Life 2 benchmarks, and you do the same.

To put SilverTrine's posts into perspective...

He posted this in a thread about drivers, in regard to ATi's OpenGL performance...

Originally posted by: SilverTrine
...Theres no problems with Ati cards in OpenGl games, games like RTCW actually ran better on Ati cards than Nvidia cards. So the person who wrote this is simply wrong unless he clarifies his statement intensively...
link to thread

Draw your own conclusions as to how knowledgeable and/or unbiased he is on this topic...
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
I won't bet any money on it because I'm poor and would rather spend the little money I do have on more useful things. As for being wrong... you obviously misinterpreted that statement.

Originally posted by: Jeff7181
This whole debate started with me pointing out how silly it is for ATI owners to say how useless SM 3.0 is...

Facts:

  • There are no current games that use SM 3.0 at the moment.
  • Current hardware is not fast enough to fully utilize ALL of SM 3.0's features to their full potential.
  • Two years ago there were no games that used SM 2.0.
  • Two years ago current hardware was not fast enough to utilize ALL of SM 2.0's features to their full potential.

Two years ago, it was a big deal that ATI supported a technology that was useless on the current hardware... yet somehow today, when it's nVidia that's supporting a technology that may prove to be useless on current hardware, it's no big deal.

One big difference: with or without PS2.0 support, the 9700 Pro crushed Nvidia anyway, even in older games. But this time it's the card with fewer "features" that is faster.

Originally posted by: ChkSix
This might come out as a 'fanboy' comment, but it looks to me like ATI had to first see it implemented correctly so that they could go ahead and 'dissect' the competitor's card to figure out how to do it themselves.

They figured out how to implement PS2.0 well before Nvidia did, didn't they? It's more like they chose not to, for cost/performance reasons.
 

ChkSix

Member
May 5, 2004
192
0
0
General, the card isn't that much faster in any benchmark across the board. Like I said earlier, if it were a landslide victory I would be in total agreement with you and others regarding the R420. However, a 10-15 fps advantage at high resolutions, when both cards are already producing 80+ fps, is hardly 'faster' or 'better' in my opinion. And when you think about what it lacks (at least I do), that is the only reason why I would choose Nvidia over ATi here. Other than that, I have no brand loyalty and would buy either, as long as it gave me blazing performance and support for future technology now (not in another card I have to spend an additional 500 dollars on).

LOL nice post Nitro. Something to think about.

Go Away Matthias!!!! HAHAHAH j/k! You always get me probing and thinking further, which is a good thing. :D
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Matthias99
I disagree with this statement. The RADEON 9700Pro/9800/9800Pro and NVIDIA 5900 cards, at least, have enough horsepower to run Far Cry (a game which uses SM2.0 shaders moderately), and are supposed to be good enough to run Doom3, HL2, and similar games making heavy use of SM2.0 (although perhaps not at super-high resolutions and with all the quality features cranked up). But this is always the case with new features -- you could say that the GeForce4 and R8500 weren't really fast enough to use all of DX8.1's features to their full potential, either.

And I disagree with that. :) I don't think the 9700 Pro or the GeForce FX cards have enough "horsepower" to run Far Cry... I have to run everything at medium (some low) to get the frame rates I'd like, and at times they still drop too low. Based on Anand's benchmarks, the 9700 Pro is only a few frames per second faster than my card... so I very highly doubt I'd find the 9700 Pro acceptable. The 9800 Pro and XT... MAYBE... but not the 9700 Pro.
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
Well, I did some math and compared the X800XT to the 6800 GT (the best price/performance cards), averaging out all the benchmarks.

On anandtech, the X800XT averaged 17% faster performance for a 20% cost premium.
On toms hardware the X800XT had a 28% lead.
On xbitlabs the X800XT had a 26% lead.

So it does seem that the X800XT offers slightly better price/performance, or slightly worse, depending on who you believe. The kicker that throws me in Nvidia's direction currently is that the GT can be overclocked and the XT supposedly cannot. The question is whether any potential benefit later on (SM 3.0, etc, etc), while only being potential at best now, is worth giving up 5-10% price/performance on. I agree with you: buying a GT and overclocking it gives enough performance now to be the better buy than the XT.

The thing is, the cards are so equal this time around that even the smallest change can sway me 1 way or another. But if the X800 XT comes down to $450 again, I'd buy it in a heartbeat.

You might want to clear up your example though. 15 fps out of 80 is pretty big: the gap in tests was far closer.
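Making the arithmetic above explicit: with an assumed ~$400 GT and a ~20% XT premium (both prices are assumptions, since the GT wasn't purchasable yet, as noted below), the per-site leads translate into performance-per-dollar ratios like so:

```python
def perf_per_dollar(relative_perf, price):
    """Relative performance units per dollar spent."""
    return relative_perf / price

# Assumed prices: ~$400 GT MSRP (see below) and a ~20% XT premium.
# Street prices varied, so treat these as placeholders.
gt_price, xt_price = 400.0, 480.0

# X800XT leads over the 6800 GT, per the per-site averages above.
for site, xt_lead in [("AnandTech", 0.17), ("Tom's", 0.28), ("X-bit", 0.26)]:
    gt = perf_per_dollar(1.0, gt_price)           # GT as the baseline
    xt = perf_per_dollar(1.0 + xt_lead, xt_price)
    print(f"{site:9s} XT/GT perf-per-dollar: {xt / gt:.2f}")

# The ratios land close to 1.0 either way -- consistent with
# "slightly better price/performance, or slightly worse,
# depending on who you believe."
```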
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: GeneralGrievous
Well, I did some math and compared the X800XT to the 6800 GT (the best price/performance cards), averaging out all the benchmarks.

On anandtech, the X800XT averaged 17% faster performance for a 20% cost premium.
On toms hardware the X800XT had a 28% lead.
On xbitlabs the X800XT had a 26% lead.

So it does seem that the X800XT offers slightly better price/performance, or slightly worse, depending on who you believe. The kicker that throws me in Nvidia's direction currently is that the GT can be overclocked and the XT supposedly cannot. The question is whether any potential benefit later on (SM 3.0, etc, etc), while only being potential at best now, is worth giving up 5-10% price/performance on. I agree with you: buying a GT and overclocking it gives enough performance now to be the better buy than the XT.

The thing is, the cards are so equal this time around that even the smallest change can sway me 1 way or another. But if the X800 XT comes down to $450 again, I'd buy it in a heartbeat.

You might want to clear up your example though. 15 fps out of 80 is pretty big: the gap in tests was far closer.

How much does a GT cost?
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: nitromullet
Originally posted by: GeneralGrievous
Well, I did some math and compared the X800XT to the 6800 GT (the best price/performance cards), averaging out all the benchmarks.

On anandtech, the X800XT averaged 17% faster performance for a 20% cost premium.
On toms hardware the X800XT had a 28% lead.
On xbitlabs the X800XT had a 26% lead.

So it does seem that the X800XT offers slightly better price/performance, or slightly worse, depending on who you believe. The kicker that throws me in Nvidia's direction currently is that the GT can be overclocked and the XT supposedly cannot. The question is whether any potential benefit later on (SM 3.0, etc, etc), while only being potential at best now, is worth giving up 5-10% price/performance on. I agree with you: buying a GT and overclocking it gives enough performance now to be the better buy than the XT.

The thing is, the cards are so equal this time around that even the smallest change can sway me 1 way or another. But if the X800 XT comes down to $450 again, I'd buy it in a heartbeat.

You might want to clear up your example though. 15 fps out of 80 is pretty big: the gap in tests was far closer.

How much does a GT cost?

We'll let you know when it's available for purchase :)
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Jeff7181
Originally posted by: nitromullet
Originally posted by: GeneralGrievous
Well, I did some math and compared the X800XT to the 6800 GT (the best price/performance cards), averaging out all the benchmarks.

On anandtech, the X800XT averaged 17% faster performance for a 20% cost premium.
On toms hardware the X800XT had a 28% lead.
On xbitlabs the X800XT had a 26% lead.

So it does seem that the X800XT offers slightly better price/performance, or slightly worse, depending on who you believe. The kicker that throws me in Nvidia's direction currently is that the GT can be overclocked and the XT supposedly cannot. The question is whether any potential benefit later on (SM 3.0, etc, etc), while only being potential at best now, is worth giving up 5-10% price/performance on. I agree with you: buying a GT and overclocking it gives enough performance now to be the better buy than the XT.

The thing is, the cards are so equal this time around that even the smallest change can sway me 1 way or another. But if the X800 XT comes down to $450 again, I'd buy it in a heartbeat.

You might want to clear up your example though. 15 fps out of 80 is pretty big: the gap in tests was far closer.

How much does a GT cost?

We'll let you know when it's available for purchase :)

That was my point exactly: you can't do a price/performance comparison if you don't know how much one of the items being compared costs.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: nitromullet
Originally posted by: Jeff7181
Originally posted by: nitromullet
Originally posted by: GeneralGrievous
Well, I did some math and compared the X800XT to the 6800 GT (the best price/performance cards), averaging out all the benchmarks.

On anandtech, the X800XT averaged 17% faster performance for a 20% cost premium.
On toms hardware the X800XT had a 28% lead.
On xbitlabs the X800XT had a 26% lead.

So it does seem that the X800XT offers slightly better price/performance, or slightly worse, depending on who you believe. The kicker that throws me in Nvidia's direction currently is that the GT can be overclocked and the XT supposedly cannot. The question is whether any potential benefit later on (SM 3.0, etc, etc), while only being potential at best now, is worth giving up 5-10% price/performance on. I agree with you: buying a GT and overclocking it gives enough performance now to be the better buy than the XT.

The thing is, the cards are so equal this time around that even the smallest change can sway me 1 way or another. But if the X800 XT comes down to $450 again, I'd buy it in a heartbeat.

You might want to clear up your example though. 15 fps out of 80 is pretty big: the gap in tests was far closer.

How much does a GT cost?

We'll let you know when it's available for purchase :)

That was my point exactly: you can't do a price/performance comparison if you don't know how much one of the items being compared costs.

MSRP is supposed to be $400 I guess... but look at the x800 Pro... the MSRP of that is $400 as well and only a few places are actually selling it for that.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
i bought an x800pro. it's the fastest card available, bar none.

when the XT is available, i doubt i'll get it. it offers no additional features, and to me the small performance advantage it holds over the PRO doesn't justify the $100 premium in price, especially considering the overclockability in the PRO and the lack of it in the XT. despite the extra quad, performance is pretty comparable.

instead, i'll probably get a 6800. it's comparable in performance, even if at this time it's a bit slower, and it offers more features, which gives it a bit more potential.

who knows who will wear the performance crown when all is revealed, but regardless, both offer impeccable performance, and while it remains to be realized, nv40 offers far more potential for performance and feature improvements than the r420.
 

Dean

Platinum Member
Oct 10, 1999
2,757
0
76
Deciding between ATI and Nvidia now is a tough thing to do for me. Here is the crux!

ATI is almost always slower at OGL games.
ATI is very fast at DX games.
ATI is very fast at PS2.0 games.
ATI is built on an extension of a proven, high-performance design.

Nvidia is always faster at OGL.
Nvidia is generally slower at DX games at high res/quality.
Nvidia is almost always slower at PS2.0 shader games.
Nvidia offers the more robust feature set.
Nvidia offers more future compatibility.

When I sum it all up, I have almost guaranteed good PS2.0 performance with ATI; with Nvidia it's not guaranteed, but I believe it will be quite acceptable in most situations. With Nvidia I am guaranteed future compatibility, but I'm not guaranteed future performance along with it.

So do I go with overall better performance now, or future compatibility? I cannot decide for sure yet, but I'm leaning towards ATI once again.
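Dean's "sum it all up" is effectively a weighted scorecard, and it's worth seeing how sensitive the outcome is to the weights. A toy Python version with entirely made-up weights and scores:

```python
# Toy decision scorecard; weights and 0-10 scores are invented,
# not measured -- the point is only the sensitivity to weighting.
criteria = {
    # criterion:            (weight, ATI, Nvidia)
    "OpenGL performance":    (0.10, 5, 9),
    "DirectX performance":   (0.30, 9, 7),
    "PS2.0 performance":     (0.30, 9, 6),
    "Feature set":           (0.10, 6, 9),
    "Future compatibility":  (0.20, 5, 9),
}

ati_score = sum(w * a for w, a, n in criteria.values())
nv_score = sum(w * n for w, a, n in criteria.values())
print(f"ATI: {ati_score:.2f}  Nvidia: {nv_score:.2f}")  # 7.50 vs 7.50

# With these weights it's a dead heat. Nudge the PS2.0 weight up and
# ATI wins; nudge future compatibility up and Nvidia does -- which is
# exactly the dilemma described above.
```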
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
When the only person in this thread who actually owns an x800 says they want a 6800, that pretty much puts to rest any questions I had about getting the x800... and frankly, until we see a real benchmark from the released game, please quit talking about HL2... I probably won't even get that game until the HL2/Duke Nukem Forever Double Pack is released anyway... :laugh:
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: GeneralGrievous
small performance advantage
Huh? The XT beats the Pro by at least 25% in almost every benchmark.

well, first, your statement is not accurate. it only approaches that kind of difference at the highest resolutions/aa/af.

secondly, read what i wrote: "and to me the small performance advantage it holds over the PRO doesn't justify the $100 premium in price, especially considering the overclockability in the PRO and the lack of it in the XT"

i run far cry with all settings max.. w/ 4xaa (arguably not needed - 2x works fine for texture aliasing at that res) and max af:

Average FPS: 44.02 Min FPS: 34.97 Max FPS: 52.69

most benchmarks show the xt running it around 48-50 avg fps with those settings.. why on earth would i want to spend $100+ for another 5-6fps? i'll pocket the $100 and put it towards an nv40. whichever i prefer, the other will end up in my second PC.
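Putting numbers on that last question: using the figures in the post (44 fps average on the Pro, roughly 49 on the XT) and an assumed ~$400 Pro street price, the XT's premium works out as follows:

```python
# Figures from the post above; card prices are assumptions for
# illustration, since street prices varied.
pro_fps, xt_fps = 44.02, 49.0    # Far Cry avg fps, max settings
pro_price = 400.0                # assumed X800 Pro street price
xt_price = pro_price + 100.0     # the $100 premium cited above

relative_gain = (xt_fps - pro_fps) / pro_fps
dollars_per_fps = (xt_price - pro_price) / (xt_fps - pro_fps)

print(f"XT gain over Pro: {relative_gain:.1%}")                   # ~11.3%
print(f"Marginal cost:    ${dollars_per_fps:.0f} per extra fps")  # ~$20
```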
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: ZobarStyl
When the only person in this thread who actually owns an x800 says they want a 6800, that pretty much puts to rest any questions I had about getting the x800...and frankly until we see a real benchmark from the released game, please quit talking about HL2...I probably won't even get that game until the HL2/Duke Nukem Forever Double Pack is released anyway...:laugh:

well, don't get me wrong.. the x800 is a wonderful card, and i have no regrets that i bought one. like i said, it's the fastest card you can get at the moment.

i'm not going to buy a 6800 because it's better per se, rather because it is comparable in performance and in my opinion it has more potential - but potential doesn't mean anything if it's not realized.