Originally posted by: DAPUNISHER
As to the 6800, only rabid idiotic fanbois don't realize that it's a good value. You buy it for $283 shipped, sell Farcry and make $20 after shipping it off and have a $263 investment.
Hi. Rabid idiotic fanboy here.
Now, a 6800 including Far Cry for $283 shipped is a significantly better deal than a 6800 *not* including Far Cry at $300. But I'd still rather buy a 6800GT for $350 than a 6800NU for $260 (both 15% under MSRP). Of course, the 6800NU has the big advantage of being available. And nobody buys computer hardware as an 'investment' -- it's worthless within just a few years.
Then there is the programmable video processor that hasn't even begun to be used yet
Useful -- maybe -- only if you do a lot of video encoding. I suppose it might lower CPU loads for video decoding as well, but this is rarely an issue these days (except possibly with DVD-HD content like those super-high-res WMV vids you can download, but I think we're quite a ways away from that being mainstream).
new games supporting the 6800's feature set coming,
A new feature set of dubious value on cards of this generation. Considering that 90+% of the market won't even HAVE an SM3.0 video card, I find it unlikely that it will turn out to have a dramatic impact in the next 6-12 months. And even then, its performance benefits on the sorts of shaders being used today are pretty limited.
and very likely more performance to be found through new drivers.
Yes, but how much more? It's unlikely to even reach the levels that a 6800GT is at today.
On the other hand a 128mb 9800p gets older every day,
While I suppose this statement is trivially true, the 9800Pro still has a very acceptable level of performance (even in Doom3), and will likely be an excellent card for HL2. And it's not impossible for its performance (especially in OpenGL, if ATI ever gets around to rewriting their drivers) to improve via driver updates as well.
doesn't get the sweet brilinear feature,
? I don't normally see reduced filtering quality touted as a "sweet feature".
Yes. It also has comparable or better price/performance than the 6800NU, although at $283 and with Far Cry bundled it's getting close to matching the p/p of the 9800Pro.
For me the final analysis is that the 9800p is well past it's prime while the 6800 hasn't even reached it's.
You're looking for "its", not "it's". "It's" is short for "it is"; "its" is the possessive. The problem with the 6800, though, is that it's pretty badly crippled compared to the 6800GT (although it has the advantage of actually being available). And there are a *lot* of people out there with far worse cards than even a 9800Pro, and for many of them, $200-250 is a lot to spend on a graphics card.
And as far as your actual (original) thread topic:
It's interesting. It's similar to the way some motherboard makers intentionally OC their FSB by a few percent, and it raises the same sort of issues -- is that truthful? How do you bench it? Should you compare clock-for-clock, or both cards at their stock settings (since that's how most users will run them)?
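One way to make the clock-for-clock comparison concrete: scale the measured score by the ratio of reference clock to actual clock. This assumes performance scales linearly with core clock, which is only a rough approximation (real scaling is usually sub-linear), and the clocks and fps numbers below are made up purely for illustration:

```python
def normalize_score(score, actual_clock_mhz, reference_clock_mhz):
    """Estimate a benchmark score at the reference clock, assuming
    performance scales linearly with core clock (a rough approximation --
    real-world scaling is usually sub-linear)."""
    return score * (reference_clock_mhz / actual_clock_mhz)

# Hypothetical example: a card ships factory-overclocked to 412 MHz
# instead of a 400 MHz reference clock and scores 60 fps in some game.
print(round(normalize_score(60.0, 412, 400), 1))  # ~58.3 fps at stock clocks
```

Of course, this only isolates the clock difference; it doesn't settle the question of which comparison is fairer, since most buyers will run the card exactly as it ships.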