I'm currently beta testing a new game due out sometime between June and September, and I'm surprised by the difference in performance between systems like yours and mine. It's a first-person shooter, but I'm under an NDA, so that's all I'll say.
The specs below are copied and pasted from the signature system specs on the beta forum (which users cannot edit), and the fps figures are taken from a thread on performance:
Windows XP Pro SP2 AMD Athlon 64 X2 Dual Core Processor 4400+ 2048MB RAM DirectX 9.0c
NVIDIA GeForce 7900 GTX GeForce 7900 GTX 512.0 MB 6.14.0010.9371
5.12.0001.1187
User running at 1280x1024, medium settings, dynamic lights off
Frames per second: 20-50 (in-battle on a semi-full server up to an empty server)
Meanwhile, here are my frames per second, with my specs copied from the same thread:
Windows XP Pro SP2 Intel Core2 CPU 6600 @ 2.40GHz (2 CPUs) 2046MB RAM DirectX 9.0c
NVIDIA GeForce 8800 GTS GeForce 8800 GTS 320.0 MB 6.14.0010.9792
5.10.0000.5286
Running at 1600x1200, all ultra settings with dynamic lighting, but no AA
Frames per second: 50-200+ (full-server major battle to empty server)
As you can see, I get much better performance even at higher settings. This probably has a lot to do with my 8800 versus the 7900 series used by the other player, but exactly how much is hard to say. I tried to find an X2 user with an 8800 in that thread, but there wasn't one. However, I'll list this last spec:
Windows XP Pro SP2 AMD Athlon 64 Processor 4000+ 2048MB RAM DirectX 9.0c
NVIDIA GeForce 8800 GTS GeForce 8800 GTS 320.0 MB 6.14.0011.5819
5.12.0001.1196
User running at 1024x768, medium settings, no dynamic lights
Frames per second: 30-55 (half-full server to empty)
I'm just bored, so I thought I'd post this for reference. The game is in beta and they haven't started optimizing much yet, so the results may be quite different later.