I have been looking at some benchmarks around the net, and it looks like the GTX is a good deal faster than the GTS. However, I haven't been able to find any benchmarks that are really close to my setup:
C2D e6400 (~3GHz o/c)
eVga GeForce 8800 GTS 640MB (660/1800 o/c)
2GB DDR2-800 RAM (Corsair XMS 5-5-5-15)
Gigabyte GA-965P-S3
Windows XP Pro / Vista Ultimate
I'm only interested in playing @ 1680x1050 resolution. The games I want to play are:
Unreal Tournament 3
Crysis
Assassin's Creed (supposed to come out early next year)
I've noticed that anti-aliasing hurts my performance too much. I hate jaggies, and even at 1680x1050 I need some AA in certain places. But if I turn AA on (even 2x), I can't hit my goal of 60+ fps at all times (in the BioShock and UT3 demos). I realize I probably can't MAX OUT all settings in these games unless I want to upgrade everything, but I want to play at reasonably high detail. So I'm wondering this:
Given my CPU, the resolution, and the fact that I want to use AA, will selling my GTS and getting a GTX let me reach my goal of 60+ fps at all times (particularly in UT3)? The reason I ask is that I'm thinking my CPU might limit the GTX to the point that I won't see much of a boost. I just don't know, because I haven't really seen a comparison between the C2D e6400 and those two cards (GTS/GTX). I believe my e6400 @ ~3GHz is comparable to an e6550, since AT says 20% more clock makes up for a 2MB L2 cache deficit. But I still haven't even seen benchmarks with the e6550/1680x1050 combination. I was hoping maybe some of you could shed some light on this for me.
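For what it's worth, here's the napkin math behind that equivalence claim. The 20% figure is just AT's rule of thumb, so treat it as a rough approximation rather than a real benchmark:

[code]
# Rough sanity check on the e6400-vs-e6550 comparison.
# Assumption: AnandTech's rule of thumb that ~20% extra clock
# offsets the 2MB -> 4MB L2 cache deficit.
e6400_oc_clock = 3.0      # GHz, my overclock (2MB L2)
e6550_stock_clock = 2.33  # GHz, stock (4MB L2)
cache_penalty = 1.20      # 2MB part needs ~20% more clock to match 4MB

# Effective "4MB-equivalent" clock of the overclocked e6400
effective_clock = e6400_oc_clock / cache_penalty
print(f"e6400 @ 3.0GHz ~ e6550-class 4MB chip @ {effective_clock:.2f}GHz")
# -> ~2.50GHz, comfortably above the e6550's stock 2.33GHz
[/code]

So if the rule of thumb holds, my chip should be at least e6550-class, maybe a bit better.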
I realize the 8800GT is coming out, and leaked benchies make it appear to be quite a monster card for the price. I just have a problem with the fact that it is only 512MB. Maybe 512MB would be more than I'll need if I don't go higher than 1680x1050. What do you think?
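Here's the back-of-envelope math I've been doing on the 512MB question. These are hypothetical numbers (real drivers allocate memory differently, and textures are the real wildcard), but they suggest the render targets themselves aren't the problem at this resolution:

[code]
# Back-of-envelope VRAM usage for render targets at 1680x1050.
# Assumptions: 32-bit color, 32-bit depth/stencil, triple buffering,
# and MSAA multiplying the color and depth sample counts.
width, height = 1680, 1050
bytes_per_pixel = 4   # 32-bit RGBA
msaa = 4              # 4x multisampling
buffers = 3           # triple buffering

color = width * height * bytes_per_pixel * msaa * buffers
depth = width * height * bytes_per_pixel * msaa

total_mb = (color + depth) / (1024 ** 2)
print(f"Render targets: ~{total_mb:.0f} MB")  # -> ~108 MB
# Leaves roughly 400MB of a 512MB card for textures and geometry.
[/code]

If that's anywhere near right, it's high-res texture packs and heavy AA beyond 4x that would squeeze 512MB, not 1680x1050 itself.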
Anyway, I might be rambling a bit so I'll leave it at that. Thanks in advance for the help.