Originally posted by: AnnoyedGrunt
Originally posted by: 413xram
The real measuring stick in performance seems to be the game Farcry. There is a huge difference in high-resolution playability: 1600x1200 for ATI vs. 1024x768 for Nvidia. This to me is a huge kick in the groin for Nvidia. I do not know about anybody else in here, but being able to play a game at max resolution and settings has been a dream forever it seems, and now it is finally here.
Do not even attempt to call me a fanboy; my last 3 video cards have been Nvidia chipsets. My fanboy crap ended with 3dfx (RIP). Nvidia better get in gear very quickly. I'm sorry, but I'm not going to pay approx. $800 for some watercooled video card to play games with all the bells and whistles turned on when I can do it for half that price with no water and no extra heat.
Well, here is one test that should ease your concerns.
http://www.anandtech.com/video/showdoc.html?i=2044&p=11
The difference between the X800 and 6800 is much, much smaller than you imply. Unless you don't consider 74 FPS "playable" (the framerate of the 6800GT @ 1280x1024, which is higher than your indicated 1024x768 above). In fact, the 6800GT is just barely faster than the X800pro @ 1280x1024 with AA and AF enabled (52 vs. 50).
Anyhow, since you aren't a fanboy, I'm sure you'll be happy to hear that you will have many choices for video cards in the next several months, all of which perform very well in the newest games. I agree that the purchasing decision will come down to price for many of us, since the cards are so close in performance.
-D'oh!