Originally posted by: keysplayr2003
Originally posted by: apoppin
Originally posted by: lavaheadache
I've never heard of anybody using their GTX at 1024x768!!! Tell me you have a 14 inch LCD or something. If you do, save that money you wanted to spend on the CPU and get a monitor.
look at the benches . . . CoD2 IS DEMANDING at 10x7 with EVERYthing on and MAXed and with AA/AF . . . to run at HIGH res, you NEED SLI [period]
ask
Rollo
:laugh:
seriously
:Q
Yeah, 1024x768 is really as high as I can go comfortably with max settings. If I go up one res to 1280x1024, I have to start lowering quality settings, and I really do not wish to do that. Even though my framerate drops to 37fps, it is only momentary and zips right back up again. Problem is, that is usually the point at which I get fragged in multiplayer. Not all the time, but that is usually when it happens. That's why I was wondering about the AMD platform. Even if it only increased my minimum framerate to 50, that would be sufficient.
I always hear AMD fans saying how poor Intel CPUs are for gaming and how vastly superior the AMD equivalents are. Judging from your comments, apoppin, are they just greatly exaggerating? I am not very experienced in overclocking, although I do believe I have the RAM for it. GeIL DDR500. Not sure. I tried a small 225MHz o/c (increased the FSB from 200 to 215), but upon reboot it would default back to the 200MHz FSB. I don't know if the o/c failed and the BIOS automatically reverted to default.
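For what it's worth, the numbers in that o/c attempt line up if your chip runs a 15x multiplier. That multiplier is an assumption on my part (it is what a 3.0GHz P4 on a 200MHz/800MT/s FSB uses), but here is a quick sanity check of the arithmetic:

    # Rough sanity check on the overclock arithmetic.
    # ASSUMPTION (not stated in the thread): a 15x CPU multiplier,
    # which is what a 3.0GHz P4 on a 200MHz (800MT/s) FSB uses.
    MULTIPLIER = 15

    def cpu_clock_mhz(fsb_mhz, multiplier=MULTIPLIER):
        # Core clock on these CPUs is simply FSB x multiplier.
        return fsb_mhz * multiplier

    stock = cpu_clock_mhz(200)  # 3000 MHz
    oced = cpu_clock_mhz(215)   # 3225 MHz
    print(oced - stock)         # 225 MHz -- matches the "225MHz o/c"

So 215 FSB is only about a 7.5% bump. If the board silently falls back to 200 on the next boot, that usually does mean the o/c failed to POST and the BIOS recovered with safe defaults.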
As for SLI, well, I didn't want to go there (although I would love to) because I would still have to change my platform to an AMD/SLI mobo PLUS buy another GTX. Ouch. $$$$.
So I guess switching to AMD, as others have suggested, would not provide the performance increase I am looking for?
And thanks for all the responses gents!