Successful Ti4200 overclock...no results???

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Hmmm...I went from the default core/memory clock of 250/446MHz to 325/560MHz. winXP Pro starts up fine, and games play without artifacts. The interesting thing is that i'm getting the same framerates now as i was before the OC. WTF???
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
if Vsync is vertical sync, then yes, it's on by default (the option to disable it is in the OpenGL settings of the detonator 44.03 drivers). am i looking at the correct item?
 

modedepe

Diamond Member
May 11, 2003
3,474
0
0
You might be bound by your CPU. A P4 1.6 @ 2.1 isn't going to be very fast, since I believe a 1.6 would be a Willamette core.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Disable Vsync... you might also consider downloading the coolbits registry hack... so you can be sure Vsync is disabled for DirectX as well.
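
For anyone trying this later, here's a sketch of the CoolBits tweak as a .reg file. The key path and value are from memory and may differ between Detonator driver versions, so treat them as assumptions and back up the key first:

```ini
Windows Registry Editor Version 5.00

; Hypothetical sketch: key path and CoolBits value are from memory --
; verify against your driver version before merging. CoolBits mainly
; unlocks the clock sliders; the Vsync override for Direct3D/OpenGL is
; still set separately in the driver control panel tabs.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```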
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
modedepe:
it's a 1.6A Northwood...478 pins. i don't know if it's still possibly a limitation.

Jeff7181:
so i should try disabling Vsync now? i've already run coolbits. how does that ensure that Vsync is off? all i thought it did was unlock the OCing capabilities in the detonator drivers.
 

modedepe

Diamond Member
May 11, 2003
3,474
0
0
Ok, a 1.6A should be much faster than one of those old Willamettes. So as the others have said, make sure Vsync is off. Just open up Display Properties, go to Settings, click Advanced and then select the tab that says Ti4200 or whatever on it. Then just click on OpenGL settings and make sure it's set to "always off."
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
ok i've done just that. i don't know if it requires a restart or not...it didn't say. but i'm going to restart anyways and play a few minutes of UT and take some framerate measurements. in the meantime, considering i was originally getting roughly 60-65 frames, what kind of gains should i be seeing from a 250/446MHz to 325/560MHz OC?
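
In the meantime, a rough upper-bound sketch of the arithmetic, assuming the game were purely GPU-bound (which, per the replies above, UT likely isn't):

```python
# Back-of-the-envelope upper bound on the overclock gain. Assumes a purely
# GPU-limited load; a CPU-limited game like UT will show far smaller gains.
def pct_gain(stock_mhz, oc_mhz):
    """Percentage clock increase from stock to overclocked speed."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

core_gain = pct_gain(250, 325)  # core: 250 -> 325 MHz
mem_gain = pct_gain(446, 560)   # memory: 446 -> 560 MHz

print(f"core +{core_gain:.0f}%, memory +{mem_gain:.0f}%")  # core +30%, memory +26%
```

So roughly 25-30% at the very best; a gain of exactly 0% strongly suggests a CPU or Vsync bottleneck rather than the card.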
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
well i went ahead and played some more UT...the frames are no better and no worse with Vsync off than they were with Vsync on. any more suggestions?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Sunny129
well i went ahead and played some more UT...the frames are no better and no worse with Vsync off than they were with Vsync on. any more suggestions?

Yeah, benchmark on a newer game. UT is not a good judge of speed - it runs over 100 fps on my Ti4200, overclocked or not. It seems Vsync is still enabled for you; make sure it's set to "always off". Even so, it's a poor game for comparing different settings.

If you want a better benchmark, try the UT2k3 demo, 3dMark 2001SE or even Quake 3. 3DMark 2k1 SE will give you a much better indication of how much faster your game performance is (in %) if you benchmark both overclocked and stock speeds on it.
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Originally posted by: jiffylube1024
Originally posted by: Sunny129
well i went ahead and played some more UT...the frames are no better and no worse with Vsync off than they were with Vsync on. any more suggestions?

Yeah, benchmark on a newer game. UT is not a good judge of speed - it runs over 100 fps on my Ti4200, overclocked or not. It seems Vsync is still enabled for you; make sure it's set to "always off". Even so, it's a poor game for comparing different settings.

If you want a better benchmark, try the UT2k3 demo, 3dMark 2001SE or even Quake 3. 3DMark 2k1 SE will give you a much better indication of how much faster your game performance is (in %) if you benchmark both overclocked and stock speeds on it.

no, Vsync has been set to "always off." it's interesting that you get 100+ framerates in UT, b/c our systems are extremely similar. although you have an AMD CPU, my P4 1.6 is @ 2141MHz, about the same as yours. i also have 512MB of Samsung PC2700. the only difference in our video cards is that mine is a 128MB version, while yours is only 64MB. i don't know...my system just sucks i guess...

By the way, you're using Direct3D support, correct?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Sunny129
Originally posted by: jiffylube1024
Originally posted by: Sunny129
well i went ahead and played some more UT...the frames are no better and no worse with Vsync off than they were with Vsync on. any more suggestions?

Yeah, benchmark on a newer game. UT is not a good judge of speed - it runs over 100 fps on my Ti4200, overclocked or not. It seems Vsync is still enabled for you; make sure it's set to "always off". Even so, it's a poor game for comparing different settings.

If you want a better benchmark, try the UT2k3 demo, 3dMark 2001SE or even Quake 3. 3DMark 2k1 SE will give you a much better indication of how much faster your game performance is (in %) if you benchmark both overclocked and stock speeds on it.

no, Vsync has been set to "always off." it's interesting that you get 100+ framerates in UT, b/c our systems are extremely similar. although you have an AMD CPU, my P4 1.6 is @ 2141MHz, about the same as yours. i also have 512MB of Samsung PC2700. the only difference in our video cards is that mine is a 128MB version, while yours is only 64MB. i don't know...my system just sucks i guess...

By the way, you're using Direct3D support, correct?

Aha! Nope, I'm using OpenGL with the high resolution textures and the proper settings. Direct3D is crap for UT and performs poorly. Install the high res textures from CD2 of UT, and install the 4.36 patch. Then, in UnrealTournament.ini (it's in the System folder in the UT directory) delete all of the settings under "[OpenGLDrv.OpenGLRenderDevice]" and add these settings:

[OpenGLDrv.OpenGLRenderDevice]
RefreshRate=75
DetailTextures=1
UseTrilinear=1
UseS3TC=1
UseTNT=0
LODBias=0
UseMultiTexture=1
UsePalette=1
UseAlphaPalette=0
Translucency=1
VolumetricLighting=1
ShinySurfaces=1
Coronas=1
HighDetailActors=1
MaxAnisotropy=0
AlwaysMipmap=0
UsePrecache=0
SupportsLazyTextures=0


 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Also bear in mind that a Pentium 4 at 2.1 GHz is way slower than an AMD Athlon at 2.1 GHz. 2.2 GHz is a 2800+ for AMD (equivalent to a 2.8 GHz P4).
 

Capster

Senior member
Jan 31, 2000
309
0
0
Since you mentioned UT I'm going to assume you're more interested in getting better framerates to improve your gaming experience rather than just having higher framerate numbers. With that in mind, remember that your monitor's refresh rate is important in this equation. If it's set to 75, your monitor will not be able to display more than 75 frames per second to your eyes.

BTW, Sunny, I run a similar video card and CPU. I'll be interested in your results. I played with OC'ing the 4200 many, many months ago and didn't see any real benefit so I went back to the default speeds. I never tested it with UT but have played a lot of UT2k3 with it. You'll find me online when playing by the nick of Techslacker.
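
The cap described above can be sketched simply. This is a simplification (real Vsync locks the output to integer divisors of the refresh rate), but it shows why rendered fps above the refresh rate never reaches your eyes:

```python
# Simplified model of the Vsync refresh-rate cap: with Vsync on, the frames
# actually displayed per second can't exceed the monitor's refresh rate.
# (Real Vsync is stricter: it quantizes to integer divisors of the refresh.)
def displayed_fps(rendered_fps, refresh_hz, vsync=True):
    return min(rendered_fps, refresh_hz) if vsync else rendered_fps

print(displayed_fps(100, 75))               # Vsync on at 75Hz: capped at 75
print(displayed_fps(100, 75, vsync=False))  # Vsync off: all 100 rendered frames
```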
 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
Set the slider to high performance, turn off texture sharpening, AF and FSAA. Put your resolution to 1600 x 1200. You won't need FSAA at that setting and your image quality will still look fabulous. You should be pushing 100+ fps with a 128MB GF4 at those speeds on your 2.1GHz Northwood. If you can't get the option of 1600 x 1200 in the game, then go into the UT folder under System and open up the Unreal Tournament file that looks like a letter with a yellow splotch on it. Scroll down until you see your current resolution and then change it manually to 1600 x 1200, save and exit.
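
For reference, a sketch of the lines to change — the section and key names here are from memory and may vary by patch version, so verify them against your own UnrealTournament.ini rather than pasting blindly:

```ini
; Hypothetical sketch -- section/key names from memory; check your own file.
[WinDrv.WindowsClient]
FullscreenViewportX=1600
FullscreenViewportY=1200
FullscreenColorBits=32
```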
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Originally posted by: jiffylube1024
Aha! Nope, I'm using OpenGL with the high resolution textures and the proper settings. Direct3D is crap for UT and performs poorly. Install the high res textures from CD2 of UT, and install the 4.36 patch. Then, in UnrealTournament.ini (it's in the System folder in the UT directory) delete all of the settings under "[OpenGLDrv.OpenGLRenderDevice]" and add these settings:

[OpenGLDrv.OpenGLRenderDevice]
RefreshRate=75
DetailTextures=1
UseTrilinear=1
UseS3TC=1
UseTNT=0
LODBias=0
UseMultiTexture=1
UsePalette=1
UseAlphaPalette=0
Translucency=1
VolumetricLighting=1
ShinySurfaces=1
Coronas=1
HighDetailActors=1
MaxAnisotropy=0
AlwaysMipmap=0
UsePrecache=0
SupportsLazyTextures=0

Might you recall what version of the OpenGLDrv.dll file you are using? the original file that came with the game sucks for me...everything is extremely dark. but now i am using version 2.3, and both image quality and framerates are better than they were with Direct3D. still, i'm only averaging about 80 fps, and my unrealtournament.ini is almost identical to yours in the OpenGL section.

oh yeah, sorry it took me so long to get back to you all. i just moved into a new apartment and our wireless broadband isn't up yet, so i'm relying on my old telephone modem to get online. and because of the domain restriction on aol.com, i cannot post while on my modem. so i'll post again when i'm on my brother's LAN or when my broadband goes up.
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Originally posted by: Capster
Since you mentioned UT I'm going to assume you're more interested in getting better framerates to improve your gaming experience rather than just having higher framerate numbers. With that in mind, remember that your monitor's refresh rate is important in this equation. If it's set to 75, your monitor will not be able to display more than 75 frames per second to your eyes.

BTW, Sunny, I run a similar video card and CPU. I'll be interested in your results. I played with OC'ing the 4200 many, many months ago and didn't see any real benefit so I went back to the default speeds. I never tested it with UT but have played a lot of UT2k3 with it. You'll find me online when playing by the nick of Techslacker.

i hadn't really thought of that, but it makes a lot of sense now that you mention it. i'm running UT at 1280 x 1024 @ 75Hz, so i'll only be seeing 75 fps at the most. however, the software still benches what fps the CPU/video card can produce, regardless of the monitor's refresh rate. it doesn't bother me now that i woke up to the fact that my current fps is just above my monitor's highest capable refresh rate at the current resolution of 1280 x 1024. it just puzzles me as to why my fps doesn't change at all as i OC the video card...especially from 250/446 to 325/560 or whatever it was i OCed it to before. despite the fact that UT is more CPU intensive than video card intensive, i should still see some increase, not nothing at all.
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Originally posted by: rogue1979
Set the slider to high performance, turn off texture sharpening, AF and FSAA. Put your resolution to 1600 x 1200. You won't need FSAA at that setting and your image quality will still look fabulous. You should be pushing 100+ fps with a 128MB GF4 at those speeds on your 2.1GHz Northwood. If you can't get the option of 1600 x 1200 in the game, then go into the UT folder under System and open up the Unreal Tournament file that looks like a letter with a yellow splotch on it. Scroll down until you see your current resolution and then change it manually to 1600 x 1200, save and exit.

i'm not sure where 2.1GHz came from, but i have a 1.6GHz Northwood. anyways, i know i can force a 1600 x 1200 resolution in UnrealTournament.ini, but my monitor only supports a 65Hz refresh rate at that high a resolution, and it is just unbearable on my eyes, so that is not an option for me.
 

Intelman07

Senior member
Jul 18, 2002
969
0
0
This seems very odd. Make sure you don't have Unreal Tournament 2003 and Unreal Tournament mixed up, lol...
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Sunny129
Originally posted by: rogue1979
Set the slider to high performance, turn off texture sharpening, AF and FSAA. Put your resolution to 1600 x 1200. You won't need FSAA at that setting and your image quality will still look fabulous. You should be pushing 100+ fps with a 128MB GF4 at those speeds on your 2.1GHz Northwood. If you can't get the option of 1600 x 1200 in the game, then go into the UT folder under System and open up the Unreal Tournament file that looks like a letter with a yellow splotch on it. Scroll down until you see your current resolution and then change it manually to 1600 x 1200, save and exit.

i'm not sure where 2.1GHz came from, but i have a 1.6GHz Northwood. anyways, i know i can force a 1600 x 1200 resolution in UnrealTournament.ini, but my monitor only supports a 65Hz refresh rate at that high a resolution, and it is just unbearable on my eyes, so that is not an option for me.

Isn't your 1.6 northie overclocked to 2.164 GHz? That's where I got the 2.1 from.

Try running the game at 1024X768 and see what framerates you get.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
There have been countless discussions about vsync/max framerates here and across USENET groups. The reason you want higher-than-vsync frame rates is because of how you perceive movement in the game (panning with your mouse). It always appears smoother with higher framerates. You can tell the difference between 200 and 100 fps in Quake 3 very easily. Disable vsync and try it.
 

Sunny129

Diamond Member
Nov 14, 2000
4,823
6
81
Originally posted by: Intelman07
This seems very odd. Make sure you don't have Unreal Tournament 2003 and Unreal Tournament mixed up, lol...

yeah, i'm talking about the old UT...haven't had a chance to install UT2003 with school and work in the picture.

Originally posted by: policy11
Games at 65fps are unbearable to your eyes?

no, i'm not saying that 65 fps hurts my eyes. i'm saying that a refresh rate of 65Hz hurts my eyes. you see, while you cannot physically see framerates higher than the monitor's refresh rate, a monitor still refreshes at the same rate no matter the actual fps in a game. for the case where fps is lower than the refresh rate, i'll use a mathematical example: fps in UT = 60, refresh rate = 120. although the most frames you'll see is 60 per second, the monitor still refreshes 120 times per second (twice per frame). so although you might notice a slight "slide show" while panning quickly across the screen, you most definitely will NOT see the screen flickering that is normally associated with low refresh rates (i.e. 65Hz and below).

Originally posted by: jiffylube1024
Isn't your 1.6 northie overclocked to 2.164 GHz? That's where I got the 2.1 from.

Try running the game at 1024X768 and see what framerates you get.

I completely forgot i had it OCed to 2141MHz...lol, my mistake. anyways, i've been running the game @ 1280 x 1024 lately, so i'll switch back to 1024 x 768 and see what happens, although i doubt it will make a significant difference.