Can I expect a MUCH better framerate in CoD2?

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I was just wondering if it would be worth it at this point to switch from my P4 3.0E to an AMD 3500+. Would I notice a substantial increase in framerates, or would the improvement not be worth the cost of the upgrade?

When I say worth it, I mean will I see a 25-30 fps increase across the board in CoD2?


Current setup:
Pentium 4 3.0E @ stock
Albatron PX915P4C (skt478 -w- PCI-e. Intel 915 Chipset)
1GB Geil DDR500 in DC 2x512
SB Live
eVGA 7800GTX
Windows XP SP1
Forceware 81.85 drivers
Patch 1.01 for Intel HT/DC

My settings for CoD2:
1024x768 4xAA 8xAF (in game settings)
DX9 mode
Everything set to highest
Textures on Auto
Averaging 85-120 fps just walking around with no action
As low as 37 fps when in heavy action.

If anyone needs more specific info from me, just lemme know.

Any suggestions, helpful benchies, etc. are much appreciated.

Thanks,
Keys
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
I was just wondering if it would be worth it at this point to switch from my P4 3.0E to an AMD 3500+. Would I notice a substantial increase in framerates, or would the improvement not be worth the cost of the upgrade?

When I say worth it, I mean will I see a 25-30 fps increase across the board in CoD2?

a 25-30 fps increase across the board in CoD2 just from changing CPUs?
:Q

you're kidding, right?

No

For that kind of result, just get a 2nd 7800GTX

or . . .

you can try lowering your settings to 2x AA ;)
otoh, you might get a "substantial" increase from upgrading your CPU, but not "double" :p

since you are considering a change, why not O/C the hell outta your CPU? a 3.6GHz Intel CPU is pretty powerful . . . if you fry it [unlikely unless you volt mod it], then change it :p

edited over and over
:confused:
[just like you like it . . . done]
:D
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I've never heard of anybody using their GTX at 1024x768!!! Tell me you have a 14-inch LCD or something. If you do, save that money you wanted to spend on the CPU and get a monitor.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: lavaheadache
I've never heard of anybody using their GTX at 1024x768!!! Tell me you have a 14-inch LCD or something. If you do, save that money you wanted to spend on the CPU and get a monitor.

look at the benches . . . CoD2 IS DEMANDING at 10x7 with EVERYthing MAXed and with AA/AF . . . to run at HIGH res, you NEED SLI [period]

ask Rollo ;)

:laugh:

seriously
:Q
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: apoppin
Originally posted by: lavaheadache
I've never heard of anybody using their GTX at 1024x768!!! Tell me you have a 14-inch LCD or something. If you do, save that money you wanted to spend on the CPU and get a monitor.

look at the benches . . . CoD2 IS DEMANDING at 10x7 with EVERYthing MAXed and with AA/AF . . . to run at HIGH res, you NEED SLI [period]

ask Rollo ;)

:laugh:

seriously
:Q

Yeah, 1024x768 is really as high as I can comfortably go with max settings. If I go up one res to 1280x1024, I have to start lowering quality settings, and I really do not wish to do that. Even though my framerate drops to 37fps, it is only momentary and zips right back up again. Problem is, that is usually the point at which I get fragged in multiplayer. Not all the time, but that is usually when it happens. That's why I was wondering about the AMD platform. Even if it only increased my minimum framerate to 50, that would be sufficient.

I always hear AMD fans saying how poor Intel CPUs are for gaming and how vastly superior the AMD equivalents are. Judging from your comments, apoppin, are they just greatly exaggerating? I am not very experienced in overclocking, although I do believe I have the RAM for it. Geil DDR500. Not sure. I tried to do a small 225MHz o/c (increased the FSB to 215), but upon reboot it would default back to the 200MHz FSB. I don't know if the o/c failed and the BIOS automatically reverted to defaults.
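(To put numbers on that, if I have the multiplier right: the 3.0E runs 15 x 200MHz = 3.0GHz stock, so 15 x 215MHz = 3225MHz, which is where the 225MHz figure comes from. And DDR500 is rated for 250MHz, so at a 215MHz FSB the RAM should have headroom to spare if it's running 1:1.)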

As for SLI, well, I didn't want to go there (although I would love it) because I would still have to change my platform to AMD/sli mobo PLUS another GTX. "ouch" $$$$$.

So I guess switching to AMD as others have suggested would not provide the performance increase I am looking for?

And thanks for all the responses gents!
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I get very good frame rates at 1680x1050 with everything high, no AA though. I would guess that I see an average around 50+.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
If there were an in-game demo I would run it and let you know. My system is pretty comparable to the one you mention building, and it runs beautifully.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: lavaheadache
If there were an in-game demo I would run it and let you know. My system is pretty comparable to the one you mention building, and it runs beautifully.

Lava, if it isn't too much trouble, I can send you a demo and you could play it at the same settings I use? Just to compare my P4 against your 3700+ at similar graphics settings.

The demo would be very small, less than 5MB. You would just need to place the demo file in your CoD2/Main/demo directory. Then launch CoD2 and get into the main menu, drop down your console with the ~ key, type /demo <demoname>, and it will run the demo I made.

Use Fraps to record min/avg/max, etc.
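For reference, the whole sequence from the console looks roughly like this (the /record and /stoprecord names are from memory, so double-check them; and I believe cg_drawFPS carried over from CoD1, if you'd rather have a built-in counter than Fraps):

/record mydemo // start capturing a run; stop with /stoprecord
/demo mydemo // play back mydemo from the CoD2/Main/demo folder
cg_drawFPS 1 // in-game fps counter, as a sanity check against Fraps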

I would owe you one if you would do this for me.

Just lemme know. PM me.

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No way. COD2 has zero CPU dependency whatsoever, because it crushes graphics cards for fun before the CPU even becomes a factor.

Graphics Dependency #1 - with ATI cards (http://www.firingsquad.com/hardware/call_of_duty_2_dual-core/page4.asp) - going from a 4200+ at 2.0GHz to a 2.4GHz 4800+ literally doesn't gain even 1 frame per second. A 4200+ with even one step higher graphics card pummels a 4800+ with a slower model.
Graphics Dependency #2 - with a 7800GT - going from a Sempron 2800+ to an FX-57 or even a P4 at 4.2GHz gains a whopping 4 frames per second, and that's without 8xAF enabled.
Graphics Dependency #3 - varying A64 speed with a 7800GTX at 1024x768 with no AA/AF, going from a 1.6GHz A64 to a 2.8GHz A64 gained 2.6 frames per second.

Basically, no current cards can handle Call of Duty 2. You can either get an Xbox 360, wait for R580/G71, or pick up a second 7800GTX (highly unrecommended).
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: RussianSensation
No way. COD2 has zero CPU dependency whatsoever, because it crushes graphics cards for fun before the CPU even becomes a factor.

Graphics Dependency #1 - with ATI cards (http://www.firingsquad.com/hardware/call_of_duty_2_dual-core/page4.asp) - going from a 4200+ at 2.0GHz to a 2.4GHz 4800+ literally doesn't gain even 1 frame per second. A 4200+ with even one step higher graphics card pummels a 4800+ with a slower model.
Graphics Dependency #2 - with a 7800GT - going from a Sempron 2800+ to an FX-57 or even a P4 at 4.2GHz gains a whopping 4 frames per second, and that's without 8xAF enabled.
Graphics Dependency #3 - varying A64 speed with a 7800GTX at 1024x768 with no AA/AF, going from a 1.6GHz A64 to a 2.8GHz A64 gained 2.6 frames per second.

Basically, no current cards can handle Call of Duty 2. You can either get an Xbox 360, wait for R580/G71, or pick up a second 7800GTX (highly unrecommended).

Damn!! This is exactly what I was searching for. Thanks a ton, Russian!! Lavaheadache, looks like I won't be needing that demo test after all. Unless you want to try it anyway. Looks like I'll have to wait for G71/X1900. Fortunately, they are not too far away.

EDIT: LOL!!! Just for fun, I switched to DX7 mode to see what it was like. It looks just like CoD1 but running damn near 240fps average. LMAO.

 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: RussianSensation
No way. COD2 has zero CPU dependency whatsoever, because it crushes graphics cards for fun before the CPU even becomes a factor.

Graphics Dependency #1 - with ATI cards (http://www.firingsquad.com/hardware/call_of_duty_2_dual-core/page4.asp) - going from a 4200+ at 2.0GHz to a 2.4GHz 4800+ literally doesn't gain even 1 frame per second. A 4200+ with even one step higher graphics card pummels a 4800+ with a slower model.
Graphics Dependency #2 - with a 7800GT - going from a Sempron 2800+ to an FX-57 or even a P4 at 4.2GHz gains a whopping 4 frames per second, and that's without 8xAF enabled.
Graphics Dependency #3 - varying A64 speed with a 7800GTX at 1024x768 with no AA/AF, going from a 1.6GHz A64 to a 2.8GHz A64 gained 2.6 frames per second.

Basically, no current cards can handle Call of Duty 2. You can either get an Xbox 360, wait for R580/G71, or pick up a second 7800GTX (highly unrecommended).

What do you mean, no current cards can handle COD2? My system handles it absolutely fine. I played through the entire thing with no hiccups. I haven't tried multiplayer, but I'm not into mindless deathmatches.
 
killershroom1985

Dec 3, 2005
143
0
0
Originally posted by: lavaheadache
Originally posted by: RussianSensation
Basically, no current cards can handle Call of Duty 2. You can either get an Xbox 360, wait for R580/G71, or pick up a second 7800GTX (highly unrecommended).

What do you mean, no current cards can handle COD2? My system handles it absolutely fine. I played through the entire thing with no hiccups. I haven't tried multiplayer, but I'm not into mindless deathmatches.

Yeah, that guy is nuts. I have a 7800GT at stock speeds (470/1100). I run CoD2 at 12x10 with everything turned up, AA, EVERYTHING, and I haven't had any sort of lag or stuttering.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Hey guys, I am not trying to argue with you; I am just going by what the benchmarks from three very respectable websites show. You cannot play COD2 on any single graphics card with DX9 mode and all graphics settings enabled and get 50-60 frames per second at the resolutions high-end cards are meant for, i.e. 1600x1200. And FEAR, let's not even start talking about that game. The point is that today's top cards cannot play the latest games at 1600x1200 4xAA/16xAF with soft shadows and all the highest settings enabled; that's why we need G71/R580. With all settings on, the performance hit is enormous - LINK
 
killershroom1985

Dec 3, 2005
143
0
0
Originally posted by: RussianSensation
You cannot play COD2 on any single graphics card with DX9 mode and all graphics settings enabled and get 50-60 frames per second.

Well then, anything higher than what I'm getting (whether that be 50, 60, etc. fps) is pointless in this game, because whatever I'm getting is damn good :p
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: killershroom1985
Originally posted by: RussianSensation
You cannot play COD2 on any single graphics card with DX9 mode and all graphics settings enabled and get 50-60 frames per second.

Well then, anything higher than what I'm getting (whether that be 50, 60, etc. fps) is pointless in this game, because whatever I'm getting is damn good :p

I agree that gaming is subjective. I enjoyed GoldenEye on the N64 even when it dipped into the 20s and below during explosions. But Keys probably loves smoothness; that is why he is playing at only 1024x768 with 4xAA/8xAF, and I answered his question with that in mind.
 

JRW

Senior member
Jun 29, 2005
569
0
76
I'm running COD2 on an X2 4800+ with a single 7800GTX 512MB at 1280x800@100Hz (16:10 aspect CRT monitor) with 4xAA, anisotropic filtering enabled, and max graphics settings. As long as I leave Vsync disabled it runs very nicely (60fps and higher most of the time); with Vsync enabled I get severe framerate drops due to the lack of triple buffering in this game (with only double buffering, a synced framerate snaps down to 50, 33, etc. whenever the card can't hold the full 100Hz).
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
I just play this game in DX7 mode. The DX9 mode runs so badly that it's not worth dropping the resolution that much for the extra eye candy. I would rather play at 2048x1536 in DX7 and get the same framerates as around 800x600 in DX9.
 

JRW

Senior member
Jun 29, 2005
569
0
76
Originally posted by: CP5670
I just play this game in DX7 mode. The DX9 mode runs so badly that it's not worth dropping the resolution that much for the extra eye candy. I would rather play at 2048x1536 in DX7 and get the same framerates as around 800x600 in DX9.

Yes, but at the expense of visual quality, detailed smoke effects, etc. There's a CoD2 DX7 vs. DX9 thread somewhere (might not be this forum) which shows the differences.
 

Fern

Elite Member
Sep 30, 2003
26,907
174
106
Hi keys,

Do what Apoppin (Mark) says and try 2xAA. Also, set "corpses" (or whatever the setting is called that adjusts the number of dead bodies visible) to medium, if that's relevant to MP - I just play the SP missions. With my "lowly" rig, that lets me play at 10x7 with everything else at the same settings you use. I don't run the FPS counter in game, but I haven't noticed any slowdowns. 'Course, 37 fps might not be noticeable to me.

But AA just kills performance in this game, AFAIK. Seen this tweak guide HERE?

Fern

BTW: I agree with the above posters that a CPU change ain't worth it.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Hey Fern,

Yes, I will try 2xAA and 2xQAA and see if there is an fps difference as well as a jaggie difference. (Thanks, Apoppin.)

Yes, I only played the single-player campaign once, all the way through. Once I was done with that and had a feel for the game, I went strictly MP. 37 fps does indeed sound very playable, but for some reason it is not smooth at 37, and I can easily notice when it drops there. There must be a "sweet spot" for this game somewhere around 45fps where the game becomes smooth and the mouse lag disappears. By mouse lag, I mean the very slight delay between moving the mouse and the character changing position. It's only a few ms, I suppose, but I notice it and it really is annoying to me. It's nothing to do with the network either, because I run my own server on my own LAN at home, and the same thing happens when the fps drops into the 30s.

CPU impact does not seem to be a factor with this game, so I will stick with what I have until G71/X1900. Meanwhile, I will try what you all have suggested. I really appreciate the suggestions, and I will check out that tweak guide link. Thanks!!
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Keys, just got home, and that demo file is only 500K. Anyway, to further the discussion: somebody mentioned FEAR. I run FEAR all maxed, no AA or soft shadows, 2xAF, and get a 37 fps minimum and an average of 55 at 1680x1050 in the benchmark. I call this damn good, because it looks great and runs very, very well.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Yeah, the demo is only a couple of minutes long at most. I would like to try making another one on a highly populated server running a snow map; the falling snow seems to cause a real performance hit. I'll make one in a bit.
FEAR actually runs very nicely on my rig. I've forgotten the settings, because I haven't played it for a few months, or at least since CoD2 came out. But I do remember 1280x960 worked nicely with, I believe, 2xAA. I'll have to go back and check. CoD2 is definitely harder on the hardware than FEAR is, sans FEAR's soft shadows of course.
 

Fern

Elite Member
Sep 30, 2003
26,907
174
106
Hey lava,

If you're gonna run the regular demo to compare with Keys' system, you prolly shouldn't bother. I guess to keep the demo file a "manageable" d/l, it only comes with "medium" gfx, IIRC. I.e., you can't play at his settings.

Fern
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Just wondering, do you guys know about a certain console command that fixes a little bug and gives about a 5-20% increase with no visual difference? Let me be the one to hook you up. Try this: load a map and get to a spot where there is no action going on (this is so you can achieve a steady fps), open the console, and type r_multigpu 1. Basically, there is an issue where the engine doesn't optimize for single-card systems. Once you hit enter, you will see your fps jump up more than a few points with no IQ difference.
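If you don't want to retype it every session, the usual Quake-engine trick is to persist it in your config with seta (I haven't verified that CoD2 actually keeps this particular cvar between runs, and the config path is from memory, so treat both as guesses):

// in main/players/<profile>/config_mp.cfg, IIRC - assumed location
seta r_multigpu "1" // apply the single-card workaround at startup
bind F10 "r_multigpu 1" // or reapply it with one key press after a map load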