Originally posted by: 413xram
By the way, I agree totally with you on your stance with EA. They ruined my online game, Air Warrior III. Jerk offs!
Originally posted by: Jeff7181
Originally posted by: Pete
Have we seen displacement mapping featured in any upcoming game? I'm curious now.
Far Cry "can" support it with a patch I believe. There's a video of a person playing Far Cry at what I think is an "nVidia booth" at some kind of store... so it is possible... but I don't think if you buy a 6800 right now it'll do displacement mapping in Far Cry. Far Cry needs a patch I believe, and possibly a new driver set for the 6800.
Originally posted by: SilverTrine
Yeah, eat the PR hook, line and sinker, and ignore the facts of the situation: Anand, among others, has pointed out that the 6800U is the old-style GPU design and will be far inferior in pixel shader games.
See, what neophytes don't understand is that SM 3.0 is just a feature set; the X800 architecture is designed from the ground up to do well in pixel shader games, and the 6800U is not.
If people buy the 6800 because they think the SM 3.0 feature set will give them greater performance in pixel shader games, they'll be sorely upset when Half-Life 2 comes out and the X800 gets 25-40% better performance.
The X800 XT will be much faster than the 6800U in Half-Life 2 at the same level of detail.
Originally posted by: Jeff7181
This whole debate started with me pointing out how silly it is for ATI owners to say how useless SM 3.0 is...
Facts:
- There are currently no games that use SM 3.0.
- Current hardware is not fast enough to utilize ALL of SM 3.0's features to their full potential.
- Two years ago there were no games that used SM 2.0.
- Two years ago, hardware was not fast enough to utilize ALL of SM 2.0's features to their full potential.
Two years ago, it was a big deal that ATI supported a technology that was useless on the current hardware... yet somehow today, when it's nVidia that's supporting a technology that may prove to be useless on current hardware, it's no big deal.
Originally posted by: ChkSix
Again, if it were so useless, ATi would not be looking to implement it in the R500. Go figure. This might come out as a 'fanboy' comment, but it looks to me that ATI had to first see it implemented correctly so that they could go ahead and 'dissect' the competitor's card to figure out how to do it themselves. By the way, this is common practice for every business around the globe. Do not think there is an exception here.
Truth of the matter is that they couldn't implement 3.0 in the R420 because it simply does not have the power to do it.
Six different clock speeds on the same card given to different reviewers says what again to you? A coincidence or an attempt to take a lead through a simple modification?
Originally posted by: SilverTrine
... and say Anand believes with driver optimizations the 6800 will gain a clear advantage over the x800.
You just proved yourself a fanboi more intent on spreading FUD than debating facts. The X800 XT will be much faster than the 6800U in Half-Life 2 at the same level of detail.
I'm willing to put my money where my mouth is and make a PayPal bet regarding that; you are not, because you know you're wrong. If you do decide to put your money where your mouth is, I'll be happy to PayPal money to an honest long-term member (and you do the same) who will hold it until the release of Anand's Half-Life 2 benchmarks.
link to thread
Originally posted by: SilverTrine
...There's no problem with ATi cards in OpenGL games; games like RTCW actually ran better on ATi cards than Nvidia cards. So the person who wrote this is simply wrong unless he clarifies his statement intensively...
I won't bet any money on it cause I'm poor and would rather spend the little money I do have on more useful things. As for being wrong... you obviously misinterpreted that statement.
This whole debate started with me pointing out how silly it is for ATI owners to say how useless SM 3.0 is...
Facts:
- There are currently no games that use SM 3.0.
- Current hardware is not fast enough to utilize ALL of SM 3.0's features to their full potential.
- Two years ago there were no games that used SM 2.0.
- Two years ago, hardware was not fast enough to utilize ALL of SM 2.0's features to their full potential.
Two years ago, it was a big deal that ATI supported a technology that was useless on the current hardware... yet somehow today, when it's nVidia that's supporting a technology that may prove to be useless on current hardware, it's no big deal.
This might come out as a 'fanboy' comment, but it looks to me that ATI had to first see it implemented correctly so that they can go ahead and 'dissect' the competitor's card to figure out how to do it themselves.
They figured out how to implement PS 2.0 well before Nvidia did, didn't they? It's more like they chose not to for cost/performance reasons.
I disagree with this statement. The RADEON 9700Pro/9800/9800Pro and NVIDIA 5900 cards, at least, have enough horsepower to run Far Cry (a game which uses SM2.0 shaders moderately), and are supposed to be good enough to run Doom3, HL2, and similar games making heavy use of SM2.0 (although perhaps not at super-high resolutions and with all the quality features cranked up). But this is always the case with new features -- you could say that the GeForce4 and R8500 weren't really fast enough to use all of DX8.1's features to their full potential, either.
Originally posted by: GeneralGrievous
Well, I did some math and compared the X800XT to the 6800 GT (the best price/performance cards), averaging out all the benchmarks.
On anandtech, the X800XT averaged 17% faster performance for a 20% cost premium.
On toms hardware the X800XT had a 28% lead.
On xbitlabs the X800XT had a 26% lead.
So it does seem that the X800 XT offers slightly better price/performance, or slightly worse, depending on who you believe. The kicker that throws me in Nvidia's direction currently is that the GT can be overclocked while the XT supposedly cannot. The question is whether any potential benefit later on (SM 3.0, etc.), while only being potential at best now, is worth giving up 5-10% price/performance for. I agree with you: buying a GT and overclocking it gives enough performance now to be the better buy than the XT.
The thing is, the cards are so equal this time around that even the smallest change can sway me 1 way or another. But if the X800 XT comes down to $450 again, I'd buy it in a heartbeat.
You might want to clear up your example though. 15 fps out of 80 is pretty big: the gap in tests was far closer.
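The averaging argument above (a 17% performance lead against a roughly 20% price premium) can be sketched as a quick performance-per-dollar comparison. The prices below are hypothetical placeholders chosen only to match the cited ~20% premium, not quoted figures:

```python
# Rough price/performance comparison between two hypothetical card prices.
# The 17% performance lead is the averaged AnandTech figure cited above;
# the dollar amounts are placeholder assumptions, not real street prices.

def perf_per_dollar(relative_perf, price):
    """Performance units delivered per dollar spent."""
    return relative_perf / price

gt_price, xt_price = 400.0, 480.0   # assumed: XT carries a 20% price premium
gt_perf, xt_perf = 1.00, 1.17       # XT averaged 17% faster in the benchmarks

gt_value = perf_per_dollar(gt_perf, gt_price)
xt_value = perf_per_dollar(xt_perf, xt_price)

# A ratio below 1.0 means the XT delivers less performance per dollar.
print(round(xt_value / gt_value, 3))  # 1.17 / 1.20 = 0.975
```

In other words, on these numbers the XT delivers about 2.5% less performance per dollar, which is why the post calls the price/performance difference a coin flip.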
Originally posted by: nitromullet
Originally posted by: GeneralGrievous
Well, I did some math and compared the X800XT to the 6800 GT (the best price/performance cards), averaging out all the benchmarks.
On anandtech, the X800XT averaged 17% faster performance for a 20% cost premium.
On toms hardware the X800XT had a 28% lead.
On xbitlabs the X800XT had a 26% lead.
So it does seem that the X800 XT offers slightly better price/performance, or slightly worse, depending on who you believe. The kicker that throws me in Nvidia's direction currently is that the GT can be overclocked while the XT supposedly cannot. The question is whether any potential benefit later on (SM 3.0, etc.), while only being potential at best now, is worth giving up 5-10% price/performance for. I agree with you: buying a GT and overclocking it gives enough performance now to be the better buy than the XT.
The thing is, the cards are so equal this time around that even the smallest change can sway me 1 way or another. But if the X800 XT comes down to $450 again, I'd buy it in a heartbeat.
You might want to clear up your example though. 15 fps out of 80 is pretty big: the gap in tests was far closer.
How much does a GT cost?
Originally posted by: Jeff7181
Originally posted by: nitromullet
Originally posted by: GeneralGrievous
<snip>
How much does a GT cost?
We'll let you know when it's available for purchase!
Originally posted by: nitromullet
Originally posted by: Jeff7181
Originally posted by: nitromullet
Originally posted by: GeneralGrievous
<snip>
How much does a GT cost?
We'll let you know when it's available for purchase!
That was my point exactly, you can't do a price/performance comparison if you don't know how much one of the items being compared costs.
small performance advantage
Huh? The XT beats the Pro by at least 25% in almost every benchmark.
Originally posted by: GeneralGrievous
small performance advantage
Huh? The XT beats the Pro by at least 25% in almost every benchmark.
Originally posted by: ZobarStyl
When the only person in this thread who actually owns an x800 says they want a 6800, that pretty much puts to rest any questions I had about getting the x800...and frankly until we see a real benchmark from the released game, please quit talking about HL2...I probably won't even get that game until the HL2/Duke Nukem Forever Double Pack is released anyway...:laugh:
