
5 year old game engine, and frames/sec occasionally in the 30s. How is this even possible?

oLLie

Diamond Member
Computer A:
KT333 chipset
AMD XP2000+
WinXP Pro w/ SP1a
40GB 7200rpm hdd
1024 MB PC2700 DDR
Geforce4 Ti4200 (53.03 drivers)

Computer B:
KT333 chipset
AMD XP1800+
WinXP Pro w/ SP1a
40GB 7200rpm hdd
512 MB PC2700 DDR
Radeon 9500 non-pro softmodded (but not overclocked) to enable 4 extra pipelines (Cat 3.10 drivers)

If I'm playing Day of Defeat or Natural Selection at 1024x768/32bpp color, in scenes with lots of action, my frames/sec will dip down into the 30s on either computer. At any other time the frames/sec will generally be stuck at whatever fps_max is set to. How can this be? Both mods are based on a game that is more than five years old (Half-Life, in case you didn't know).
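
For reference, this is roughly how I'm reading the framerate — a couple of console settings I believe are standard GoldSrc cvars, with the 100 just being an example cap:

fps_max 100 // framerate cap; outside heavy scenes the counter sits right at this number
net_graph 3 // overlay in the corner that shows the current fps along with net stats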
 
What video mode are you running it in?

Edit: If running in Direct3D, switch to OpenGL. If running in OpenGL, switch to Direct3D.
 
what processor are you using? half-life and its mods are notorious processor hogs so if you are running something under 2 GHz you will experience dips, even with radeon 9800s and gffx5900s.
 
Originally posted by: hdeck
what processor are you using? half-life and its mods are notorious processor hogs so if you are running something under 2 GHz you will experience dips, even with radeon 9800s and gffx5900s.

Dips down in the 30's? I don't know about that.

Edit: Well, I suppose if the cpu is really old then he'll go that low. But if they are fairly new/fast, he shouldn't be experiencing those kinds of fps.

Edit: And if his cpu's aren't pretty danged fast, he wasted his money on the rest of the system.
 
vsync? if that is enabled, wont it lock to the refresh rate of your monitor or less? try disabling that. i do NOT play halflife, so i dont know if it is in the game. try the properties of the video card somewhere.
 
Originally posted by: CubicZirconia
What video mode are you running it in?

Edit: If running in Direct3D, switch to OpenGL. If running in OpenGL, switch to Direct3D.

As I understand it, Half-Life was meant to be run in OpenGL. Hence, I'm running it in OpenGL.


Originally posted by: hdeck
what processor are you using?

Gah! I knew I forgot something in the original post! I had the CPU speeds, then I lost the post and had to type it back up. Guess I forgot the CPU the second time around.

Originally posted by: rainypickles
vsync? if that is enabled, wont it lock to the refresh rate of your monitor or less? try disabling that. i do NOT play halflife, so i dont know if it is in the game. try the properties of the video card somewhere.

I don't care that there is a ceiling on my fps (Vsync is on); I do care that my fps are dipping down so low.

Originally posted by: clicknext
How is your performance in comparison in some of the prettier new 3D games?

I don't really have any other games to test it on. I've tried Warcraft III on Computer A, and at 1024x768 it's smooth unless I am playing a large map with 8+ armies on the screen. On Computer B, I tried Worms3D recently and didn't notice any real slowdown (I think resolution was at 1024x768 again).
 
Ok, so the games are based on Half-Life, a 5 year old game. They're more recent than Half-Life, though, so it's reasonable to assume the development team didn't design them to run on the same hardware Half-Life was designed for; they designed them for more modern hardware. The point is, don't expect performance in a more recent game based on Half-Life to be exactly like, or even similar to, the performance you get in Half-Life.

Originally posted by: oLLie

As I understand it, Half-Life was meant to be run in OpenGL. Hence, I'm running it in OpenGL.
From what i've heard, Cats are better for D3D than OGL so maybe running it in D3D will yield better performance?
 
Originally posted by: Ulukai
Ok, so the games are based on Half-Life, a 5 year old game. They're more recent than Half-Life, though, so it's reasonable to assume the development team didn't design them to run on the same hardware Half-Life was designed for; they designed them for more modern hardware. The point is, don't expect performance in a more recent game based on Half-Life to be exactly like, or even similar to, the performance you get in Half-Life.

Originally posted by: oLLie

As I understand it, Half-Life was meant to be run in OpenGL. Hence, I'm running it in OpenGL.
From what i've heard, Cats are better for D3D than OGL so maybe running it in D3D will yield better performance?

Well, I probably should not have said based on Half-Life; they are running on the Half-Life engine. Not a modified engine, the Half-Life engine.
 
Even using the old HL engine, the models they use probably have 10x the complexity (more polygons, more / bigger textures) and the mix of models in the levels was designed for modern CPUs. So the fact that the engine is 5 years old doesn't mean too much unless you benchmark it using the 5 year old install of HL (skipping even the hires texture pack).

Actually, a 5 year old engine might run less well than a modern engine given high-polygon / high-texture models, since that isn't what the engine was optimized for.

Also, do you have AA/AF enabled? Those cause a huge performance hit on the gf4.
 
Originally posted by: DaveSimmons
Even using the old HL engine, the models they use probably have 10x the complexity (more polygons, more / bigger textures) and the mix of models in the levels was designed for modern CPUs. So the fact that the engine is 5 years old doesn't mean too much unless you benchmark it using the 5 year old install of HL (skipping even the hires texture pack).

Actually, a 5 year old engine might run less well than a modern engine given high-polygon / high-texture models, since that isn't what the engine was optimized for.

Also, do you have AA/AF enabled? Those cause a huge performance hit on the gf4.

Nope, AA/AF are both set to application preference on both computers. I considered that both the mods might use more complex models (higher polygons) than Half-Life itself, but I sort of thought the number of "usable" polygons is something restricted by the engine.

Does everyone then feel that the fps I'm getting is for the most part normal, or not normal? It's not a constant 3X.X fps by any means, but it does dip down that low in complex scenes.

Originally posted by: Ulukai
Originally posted by: oLLie

As I understand it, Half-Life was meant to be run in OpenGL. Hence, I'm running it in OpenGL.
From what i've heard, Cats are better for D3D than OGL so maybe running it in D3D will yield better performance?

I tried it in Direct3D mode on both computers and it only seems worse.
 
I just don't think the HL engine is that great. I'm running on an Athlon 2.05GHz and Radeon 9500 Pro at 1280x1024 and I still get huge dips (down to like 30fps) when I see things like smoke and fog.
 
I don't care that there is a ceiling on my fps (Vsync is on); I do care that my fps are dipping down so low.

What is your refresh rate at? If you had it set to 60Hz (as an example) and your rig could push 50FPS in those instances the framerate would drop to 30FPS due to Vsync being enabled.
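
Quick arithmetic, assuming standard double buffering: with Vsync on, the card can only swap buffers on a vertical refresh, so the framerate snaps to whole divisors of the refresh rate — at 60Hz that's 60, 30, 20, 15 and so on. A frame that takes even a hair longer than 16.7ms misses the refresh and stays on screen for 33.3ms, which shows up as 30FPS.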
 
Originally posted by: BenSkywalker
I don't care that there is a ceiling on my fps (Vsync is on); I do care that my fps are dipping down so low.

What is your refresh rate at? If you had it set to 60Hz (as an example) and your rig could push 50FPS in those instances the framerate would drop to 30FPS due to Vsync being enabled.

Would it drop to exactly half of the refresh rate? Because my fps is never really exactly at 30 or half my refresh rate, rather it dips down into the mid-thirties and then goes back up to fps_max.

To answer your question though, I'm not sure what my refresh rate is set at. That is, I have it set to 85 Hz on both machines inside Windows, but it seems there's some discrepancy. Once I'm actually inside a game and playing, if I bring up the OSD on the monitor, it doesn't say 85 (I'm not sure, but I think it might say 60).

More Info:

Computer A (NEC MultiSync 97F)
Windows refresh rate: 85 Hz (1280x960)
Monitor OSD reports: Hor-sync: 85 Hz Ver-sync: 85 Hz
Hide modes that this monitor cannot support is checked.
Inside game: Hor-sync: 48 Hz Ver-sync: 60 Hz

Computer B (Samsung Syncmaster 955df)
Windows refresh rate: 85 Hz (1152x864)
Monitor OSD reports: 67.4 kHz 75 Hz PP (Don't know what the kHz number is, nor what PP is)
Hide modes that this monitor cannot support is checked.
Inside game: 48.3 kHz 60 Hz NN (Don't know what the kHz number is, nor what NN is)
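
Best guess on the kHz figure: it looks like the horizontal scan rate, i.e. scanlines drawn per second. The arithmetic lines up with a 60 Hz vertical rate at 1024x768 — roughly 800 lines per frame including blanking, times 60 frames/sec, is about 48,000 lines/sec, i.e. ~48 kHz, which is what both monitors show in-game. So the games do seem to be running at 60 Hz no matter what the Windows desktop is set to.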

 
Turn off Truform in your control panel; some HL mods use it and it eats performance badly.

I don't care that there is a ceiling on my fps (Vsync is on); I do care that my fps are dipping down so low.
Vsync will influence your minimum framerate as well as your average and maximum framerate. Turn it off in your driver control panel.
 
Originally posted by: BFG10K
Turn off Truform in your control panel; some HL mods use it and it eats performance badly.

I don't care that there is a ceiling on my fps (Vsync is on); I do care that my fps are dipping down so low.
Vsync will influence your minimum framerate as well as your average and maximum framerate. Turn it off in your driver control panel.

Truform is only for ATI cards.

I don't think it's abnormal for the game to dip down to around 30 fps occasionally, where you're looking at a large area with a lot of models. It should be at least 60fps usually, though.
 
again, the blame falls mainly on the processors, though the system with the ATi card is at a disadvantage. When there is lots of action (especially in DoD) or smoke grenades around your FPS can and will drop that low. i have a barton 2500 and radeon 9500 and my game drops during firefights and on certain maps (like aztec for cs).
 
Originally posted by: BFG10K
Turn off Truform in your control panel; some HL mods use it and it eats performance badly.

I don't care that there is a ceiling on my fps (Vsync is on); I do care that my fps are dipping down so low.
Vsync will influence your minimum framerate as well as your average and maximum framerate. Turn it off in your driver control panel.

Yep, Truform is set to Always Off. I was not aware that V-sync affected those things as well. Thanks for pointing it out, I will turn it off.

Originally posted by: clicknext
I don't think it's abnormal for the game to dip down to around 30 fps occasionally, where you're looking at a large area with a lot of models. It should be at least 60fps usually, though.

Yep, this is pretty much the situation.

Originally posted by: hdeck
again, the blame falls mainly on the processors, though the system with the ATi card is at a disadvantage. When there is lots of action (especially in DoD) or smoke grenades around your FPS can and will drop that low. i have a barton 2500 and radeon 9500 and my game drops during firefights and on certain maps (like aztec for cs).

I agree; I find that it happens more often when I'm playing Day of Defeat and Natural Selection than when I play CS. I think Day of Defeat and Natural Selection make heavier use of textures than CS. Also, I almost never see any smoke grenades when I play CS.

Anyway, it seems that I'm not the only one having this happen. It sure does piss me off though... everyone is always saying how fast technology is improving and how powerful all this crap is, but I still hit 30s in a 5 year old game engine? Bah!

At least I finally found out why Computer A always looked crisper/sharper than Computer B. The maximum texture size for Computer B (gl_max_size) was set to 256 and it was at 512 on Computer A. It's a small but noticeable difference, but it was a very annoying problem to fix. I would set it to 512 in console, only to find that it reverted to 256 once I joined a game. I would add it to config.cfg, only to find that config.cfg removed it once I joined a game. Finally, I had to add an autoexec.cfg file inside the main directory of each mod with just a single line: gl_max_size 512.
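
In case anyone else hits this, autoexec.cfg is just a plain text file; something like the following is all it needs (the dod folder is just an example — one copy goes in each mod's directory):

// e.g. Half-Life\dod\autoexec.cfg
gl_max_size 512 // as I understand the load order, autoexec.cfg is executed after config.cfg, so the value sticks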

I could not, and still can not, figure out why Computer B was set to 256 and Computer A was set to 512 because I don't remember ever changing gl_max_size on either machine until the past few days. I thought that the "texture quality" settings on Computer B might be set low, and that Half-Life read these low settings and decided to set gl_max_size to 256. However, the actual settings (Texture Preference and Mipmap Detail Level) are/were both at High Quality. Oh well.

Thanks everyone for your replies 😎
 
In the ATI Control Panel, under the SMARTGART tab, the Fast Write setting is set to Off. Is that right?
 
Originally posted by: oLLie
At least I finally found out why Computer A always looked crisper/sharper than Computer B. The maximum texture size for Computer B (gl_max_size) was set to 256 and it was at 512 on Computer A. It's a small but noticeable difference, but it was a very annoying problem to fix. I would set it to 512 in console, only to find that it reverted to 256 once I joined a game. I would add it to config.cfg, only to find that config.cfg removed it once I joined a game. Finally, I had to add an autoexec.cfg file inside the main directory of each mod with just a single line: gl_max_size 512.

Is your config set to read only? That may be why it wasn't saving the changes from console.
 
Originally posted by: WobbleWobble
Originally posted by: oLLie
At least I finally found out why Computer A always looked crisper/sharper than Computer B. The maximum texture size for Computer B (gl_max_size) was set to 256 and it was at 512 on Computer A. It's a small but noticeable difference, but it was a very annoying problem to fix. I would set it to 512 in console, only to find that it reverted to 256 once I joined a game. I would add it to config.cfg, only to find that config.cfg removed it once I joined a game. Finally, I had to add an autoexec.cfg file inside the main directory of each mod with just a single line: gl_max_size 512.

Is your config set to read only? That may be why it wasn't saving the changes from console.

No it's not. Also, I believe if it was, it wouldn't even let me click Save after altering the file.
 
I have no idea why, but my Radeon 8500 performed better than my current 9600 Pro in CS. On my 8500 I would be maxed out at 100fps normally and usually dont go below 50 at busy spots or with some smoke. With my 9600 Pro I dip down to around 40, and fps usually stays about 70-80.
 