
my FX5200 is sweet

ShadowBlade

Diamond Member
Not kidding. I was playing HL2, UT2K4, and some other less impressive games at 1024x768, medium to high settings, and they ran smoothly and looked great.

128MB onboard RAM, 250MHz core / 332MHz RAM
It might be OC'ed by a couple MHz (no more than 5) because I was messing around with the clock frequency settings the other day.
 
Hahahahaha, check out the latest science on what the human eye can actually see. If I remember right it's closer to 80 fps; it was on Discovery, but it was pretty convincing. Anyway, as I understand it, fps on a TV screen and fps in a game are totally different. If you think 20 fps will do it in a high-intensity game, I think you're kidding.
At 20 fps you are only going to see a choppy mess, at 40 it's nice, and at 80 it's awesome. "Can't tell the difference with the human eye"? I reckon you can. And refresh rate on your screen: you can tell the difference between 65 and 100, so stuff me, maybe I'm on drugs?
I drink a lot, but drugs...
 
This thread is funny.

I can notice up to about 115fps. Anything below 60 (minimum) is too choppy as far as I am concerned. It's different for everyone.
 
As long as the minimums are not much lower than 35 and the average is mid-30s or above, I am quite happy for D3/HL2/Far Cry... for Sims 2 anything above 20 is fine...

But for UT I like at least a 60fps average... it's just so much quicker.
 
Originally posted by: nick1985
this thread HAS to be a joke

I'll go as far as to say a brutally overclocked FX 5200 is quite OK for UT and HL2, providing you keep the eye candy down at 1024 🙂
 
....5200 is teh su><orz

Though since we're talking about it, I find hardly any difference between 30fps and 60fps. I guess everyone is different.
 
Originally posted by: ShadowBlade
^damn straight

lol. Twenty frames per second is playable IMO, but how could you not tell the difference? As the others are saying, try changing your monitor from 60Hz to pretty much anything higher and there will be a very noticeable difference.
 
Hahahahaha, this is getting quite funny, but...
I can't play BF2 or BFV or HL at 25-odd fps, that's a joke. I can record in-game action at 10, 20, 30, 40, 50, or 60 fps with FRAPS, and straight up you can tell the difference between 20, 30, 40, 50, and 60 fps quite easily. At 20 it's a touch choppy, at 40 it's real nice, and at 60 it EATS my hard drive! 3.96 gig files...
You can't really compare old games in the same sense. Try it on HL2 or BF2 and see if you can tell the difference between 20 and 50. Bet you can.
Old games aren't coded with as many moving objects on screen or as many polygons, so they're a bad example. Test it on a high-intensity game and truly see the difference.
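A quick back-of-the-envelope on why 60 fps capture eats the hard drive. This is a minimal sketch assuming uncompressed 24-bit RGB frames at 1024x768 (FRAPS actually writes a lightly compressed codec, so real files come out smaller, but the data rate still scales linearly with fps):

```python
def capture_rate_mb_s(width, height, fps, bytes_per_pixel=3):
    """Raw data rate in MB/s for uncompressed video frames."""
    return width * height * bytes_per_pixel * fps / (1024 ** 2)

for fps in (20, 30, 60):
    # e.g. 20 fps -> 45.0 MB/s, 60 fps -> 135.0 MB/s
    print(f"{fps:3d} fps -> {capture_rate_mb_s(1024, 768, fps):.1f} MB/s")
```

At roughly 135 MB/s uncompressed, a 3.96 GB file would hold only about 30 seconds of footage, which is why capture tools compress; either way, tripling the fps triples the file size.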
 
My GeForce 3 plays CSS at 67.34fps on medium quality with all the asterisk thingies on in the video tab.

I'm happy, but I want NFSU2 to play at 1280x1024 WITH light trails on.
 
My 9000 IGP on my laptop plays Half-Life 2 fine. That's because Half-Life 2 scales itself down from DX9 to DX8 or DX7 so the card can run it smoothly. I've even run it at 16xAF with high quality on my 9000 IGP. But my 6600GT looks MUCH better.
 
Originally posted by: MX2times
I remember my 5200 that I had like three years ago could barely play Halo😕

Yeah, Halo hurt my FX5200 (13fps).. oddly it wasn't horrible in Doom 3 or Half-Life 2 at 800x600 on medium.. it really wasn't that bad.. Haven't tried either Doom 3 or Half-Life 2 on my 7800GTX yet, but I'm assuming it'll be playable.. haha
 
I'm not kidding at all.
It plays those games great, and the speeds I posted are what it shows in the "Clock Frequency Settings".

I can also play Halo, but I haven't played it recently.
 