Can I expect a MUCH better framerate in CoD2?

Page 2 - AnandTech Forums

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
Keys, the only game I play is COD2. And for the people who have never played the game in multiplayer mode, you don't know what damage the game can do to your card. The game runs fine in DX9 mode for single player, which I don't play. I only play multiplayer, and I absolutely cannot play in DX9 mode at any resolution. Try playing a 32-man server with smoke grenades and watch the slideshow ensue. I run DX7 at 1280x1024 w/2xAA and high details, and it runs silky smooth. Sacrificing some image quality for frags is worth it!

 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: apoppin
Originally posted by: lavaheadache
I've never heard of anybody using their GTX at 1024x768! Tell me you have a 14-inch LCD or something; if you do, save that money you wanted to spend on the CPU and get a monitor.

look at the benchs . . . CoD2 IS DEMANDING at 10x7 with EVERYthing on and MAX and with AA/AF . . . to run at HIGH res, you NEED sli [period]

ask Rollo ;)

:laugh:

seriously
:Q

Yeah, 1024x768 is really as high as I can comfortably go with max settings. If I go up one res to 1280x1024, I have to start lowering quality settings, and I really do not wish to do that. Even though my framerate drops to 37fps, it is only momentary and zips right back up again. Problem is, that is usually the point at which I get fragged in multiplayer. Not all the time, but that is usually when it happens. That's why I was wondering about the AMD platform. Even if it increased my minimum framerate to 50, that would be sufficient.

I always hear AMD fans saying how poor Intel CPUs are for gaming and how vastly superior the AMD equivalents are. Judging from your comments, apoppin, are they just greatly exaggerating? I am not very experienced in overclocking, although I do believe I have the RAM for it. Geil DDR500. Not sure. I tried to do a small 225MHz o/c (increased the FSB to 215), but upon reboot it would default back to the 200MHz FSB. I don't know if the o/c failed and the BIOS automatically reverts to default.

As for SLI, well, I didn't want to go there (although I would love it) because I would still have to change my platform to AMD/sli mobo PLUS another GTX. "ouch" $$$$$.

So I guess switching to AMD as others have suggested would not provide the performance increase I am looking for?

And thanks for all the responses gents!

sorry i'm late . . . i have to work :p

;)

yeah, as RussianSensation showed, there is relatively little difference [compared to getting SLI] between CPUs after a certain point [~3GHz for Intel].

TRY O/C'ing . . . if you can get your CPU up to 3.5GHz it will be more-or-less equivalent to an A64 3400+ . . . except in gaming, where it's easily the "equal" of a stock A64 3200.

i can easily O/C my CPU from 2.8 to 3.0 to 3.3GHz with a very modest voltage increase, and to 3.5GHz with a solid one. Most of the time i just leave it at 3.0GHz because there isn't much practical difference - yet.
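For anyone following the FSB numbers above: on these locked-multiplier chips, the core clock is simply FSB x multiplier, so raising the FSB is the whole overclock. A rough sketch of the arithmetic (the 14x multiplier is my assumption for a 2.8GHz P4 on a 200MHz bus, not something stated in the thread):

```python
# Back-of-the-envelope FSB overclock math.
# Assumption: a locked-multiplier CPU such as a 2.8GHz Pentium 4,
# i.e. 2800MHz / 200MHz FSB = 14x multiplier.

def core_clock_mhz(fsb_mhz, multiplier):
    """Core clock = front-side bus speed x CPU multiplier."""
    return fsb_mhz * multiplier

MULT = 14  # assumed locked multiplier

print(core_clock_mhz(200, MULT))  # 2800 -> stock 2.8GHz
print(core_clock_mhz(215, MULT))  # 3010 -> roughly the 3.0GHz step
print(core_clock_mhz(250, MULT))  # 3500 -> the 3.5GHz ceiling mentioned
```

This is also why the failed "FSB to 215" attempt earlier in the thread would only have been a ~200MHz bump: the multiplier is fixed, so small FSB changes give proportionally small clock gains.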

i'd say hold on till g71 ;)
[drop that AA down a bit]

i play at identical settings to yours EXCEPT 0xAA/4xAF and it is smooth - except for that very occasional annoying 'hitch' . . . the analysis shows 33FPS as the minimum but i can see the occasional hesitation. Damned annoying.

i tried 11x8 but have to lower a few settings to make it playable . . . will stay at 10x7

btw, i HAD to get FEAR today . . . i can't bring myself to play an old game over again now that i have a new videocard . . . PM'd CC to BB and got it for $32.50
:thumbsup:

nice gfx and atmosphere