I highly doubt that you get 210 FPS...in anything. 120 I believe. Unless you changed something (and I'm not a UT expert), UT gives frames per second, not frames per millisecond. I get 80 FPS at 1024x768x32 @ 100Hz refresh. Someone with a 64MB vidcard probably does 100-120 at the same settings.
It could very well be true if you have all of the details maxed and such. 13-21 FPS would look pretty choppy if that is really what you're running at. I usually run UT at 800x600@32 with medium level detail on a Hercules GF2 64 and I get around 60 or 70 FPS. It most likely depends on your graphics settings. I know UT runs better in OpenGL than in D3D, so you might want to try using that. I believe you need a special set of OGL drivers though, so you might want to do a search in the forums for them because I know they've been posted before. Hope this helps a little.
<< Someone with a 64mb vidcard probably does 100-120 at the same settings. >>
With my rig and a 64MB GTS, I get about 125-130 FPS at 1280x1024x32 @ 85Hz. Also, UT is measured in FPS, not FPMS. If it's showing FPMS, I have no idea how you got it that way.
There are 1000 milliseconds in a second. If you want to run a timedemo, press ~ while the flyby intro is running and type "timedemo 1" (without the quotes).
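If the numbers really are frame times in milliseconds rather than FPS, the conversion is just 1000 divided by the frame time. Here's a quick sketch of the math (the function name is mine, not anything from UT):

```python
def frame_time_ms_to_fps(frame_time_ms: float) -> float:
    """Convert a per-frame render time in milliseconds to frames per second.

    There are 1000 ms in a second, so FPS = 1000 / frame_time_ms.
    """
    return 1000.0 / frame_time_ms

# An 8 ms frame time is 125 FPS, in the range quoted for a 64MB GTS.
print(frame_time_ms_to_fps(8.0))              # 125.0
# And "13-21" read as milliseconds would actually be ~48-77 FPS,
# which is perfectly playable, not choppy.
print(round(frame_time_ms_to_fps(13.0), 1))   # 76.9
print(round(frame_time_ms_to_fps(21.0), 1))   # 47.6
```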
To enable Timedemo: at the startup screen, where you have the menu bar across the top, pick "Options" and check Timedemo. Play any game. In the middle of the screen at the extreme right edge it shows Average FPS, and right underneath that it shows Last Second. Simple.
Ummm, OK, if you know that the command is "SHOW FRAME TIME IN MILLISECONDS", then why not try something like "show frame time in seconds"? I mean, come on, play around a bit, look in your .ini files, and see what you find.