*sigh*
Poor guys. You do realize that games won't be able to take advantage of DirectX 9.x until July of next year at the earliest, right? Hell, games that take advantage of DX8.x are only just starting to trickle out. There's really no point in spending ~$400 on a video card whose technology won't be used by actual games for another year.
The only point I can see to picking up one of these two cards is bragging rights, and that includes benchmarks. I mean, come on, do people really care that much that you can break 20,000 3DMarks in 3DMark 2k1?? Are the other players in that CS pickup game really going to be impressed, when they can't even tell what framerate (anything over 100) that poor DX5/6 game is running at???
Hell, UT2k3 is basically a DX7 game with DX8.x effects thrown in here and there to make it look more attractive. Look at America's Army: it looks amazing, it was just released, and it's built on the Unreal Engine (the same version used in UT2k3), yet it only uses DX7 effects and only requires a GeForce2 to run at playable framerates (~30fps).
In the end, it's not worth the money unless you run 3DMark 24/7. Hell, the human eye can barely tell the difference between 60 and 100fps... do you really think your eyes are going to care if the game is running at 300fps??
Also: Doom3 (which is really just a remake of Doom1 with a slightly different storyline, not a continuation of the 'saga') is only a DX8.x-generation game, and it won't be released until Xmas, if that. WarCraft 3, possibly the BEST RTS yet, only uses DX7 'technology,' and maybe not even that... it may well use DX6 as its baseline.
And just to reiterate one of my many points: until at least this time next year, the only "games" that you'll be able to play that use DX9 effects (Vertex/Pixel Shader 2.0) will be 3DMark2k2/3 and whichever tech demos ATI and nVidia release.
But until that time comes, I'm going to buy a Radeon 9700 and go render Final Fantasy: The Spirits Within in real time at 2fps... ;P
Oh yes, and from what I gather, the NV30 will only be able to render it at ~3fps in real time.
Unfortunately, the NV30 is not the Holy Grail that nVidia is making it out to be.
The only things that will differentiate the NV30 from the Radeon 9700 (in terms of hardware capabilities) are the core/memory frequencies and the NV30's ability to execute an ungodly number of shader instructions. But answer me this: what good is that if you can only run a 65k-instruction vertex shader at 3fps???
Not exactly worth the 3-month delay, IMO. You're better off going with a Radeon 9700 or something DX8-ish: a Ti 4x00, a Radeon 8500, a Trident XP4, etc.
-Tarmax
And please, no flames. I didn't type all that just so some 12-year-old who knows nothing about this industry can tell me I'm an idiot. Talk about the pot calling the kettle black... =P
If you disagree with my opinion, well, that's fine. I respect your view, and I expect the same in return.
We're all entitled to our own opinions.
And lastly, thank you for taking the time to read this ungodly long post. I'm sorry if I've wasted your time.