Originally posted by: SickBeast
My Asus 8800GTS 320mb does 625/1000.
I just got the card yesterday so I'm not sure if it's 100% stable at those speeds. I'm hoping to get a little more out of the core though.
Originally posted by: secretanchitman
Originally posted by: SickBeast
My Asus 8800GTS 320mb does 625/1000.
wow... 1GHz (2GHz effective) on the memory? niiice. now you're past GTX speeds.
hopefully my 8800GTS 320MB coming in will do that...
Originally posted by: SickBeast
My Asus 8800GTS 320mb does 625/1000.
So far it's been 57C under load and at idle somehow. I'll have to test it more thoroughly, but the temps never seem to change on this card!
Originally posted by: chizow
Originally posted by: SickBeast
My Asus 8800GTS 320mb does 625/1000.
That's about right for most 640MB GTS cards. I'd expect the core to OC the same on the 320MB; the only difference is in the memory ICs. They seem to hit the same speeds though, since they're just 32MB versions of the same-speed RAM. I'm running 630/1000 now, and you can probably push yours to 650 or so on air; just keep an eye on your temps. If you go over 80C under load, it might be a good idea to notch it down a little.
Originally posted by: CaiNaM
i see a few of these #'s in this thread, but the GTS core doesn't run at 620, 630, or 660MHz.
if you set the clock frequency anywhere from 618MHz to 634MHz, the core runs at 621MHz with a 1458MHz shader.
from 635-641 it's 648MHz core, 1458MHz shader.
from 642-661 it's 648MHz core, 1512MHz shader.
that's why some people will find it runs perfectly stable when set to 634, but artifacts & crashes at 635 (because in reality it's jumping from 621 to 648MHz). in fact, i'd bet zizo, who stated his was running @ 660, would have major issues @ 662 (because @ 660 the core is actually running 648MHz, and @ 662 it would try to jump all the way to 675MHz).
the core increments i know of are 594, 612, 621, 648, 675, and 684. there's a chart floating around somewhere which shows set frequency vs. actual frequency from around 500MHz to 759MHz, but i don't recall it off the top of my head.
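The strap behaviour described above can be sketched as a small lookup table. This is a hypothetical illustration that encodes only the requested-vs-actual ranges quoted in this thread, not NVIDIA's actual driver logic; the function name and table are made up for the example:

```python
# Sketch of the "clock strap" snapping described for the G80 GTS core:
# the driver rounds whatever core frequency you request down to a
# discrete (core, shader) pair. Only the ranges explicitly listed in
# the thread are encoded here; anything else returns None.

STRAPS = [
    # (min_set, max_set, actual_core, actual_shader) in MHz
    (618, 634, 621, 1458),
    (635, 641, 648, 1458),
    (642, 661, 648, 1512),
]

def actual_clocks(requested_mhz):
    """Return the (core, shader) pair the card really runs at, or None
    if the request falls outside the ranges quoted in the thread."""
    for lo, hi, core, shader in STRAPS:
        if lo <= requested_mhz <= hi:
            return (core, shader)
    return None

# A card "set to 660" is really running 648/1512, and a card set to 625
# is really at 621/1458 -- which is why 634 -> 635 is the risky step.
print(actual_clocks(660))  # (648, 1512)
print(actual_clocks(625))  # (621, 1458)
```

This also makes the stability cliff easy to see: 634 and 635 differ by one digit in the slider but by a whole 27MHz strap on the core.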
Originally posted by: secretanchitman
well, instead of using nTune, would Coolbits work? i was reading on some other forums (i think XtremeSystems) that people have found a way to re-enable the classic control panel in the 97.92 WinXP ForceWare drivers, as well as Coolbits itself (OCing panel, temp monitoring panel).
Originally posted by: MyStupidMouth
Originally posted by: secretanchitman
well, instead of using ntune, would coolbits work? i was reading on some other forums (i think xtremesystems) that people have found a way to re-enable the classic control panel in the 97.92 winxp forceware drivers as well as coolbits itself (ocing panel, temp monitoring panel).
You should use riva tuner.
Originally posted by: CaiNaM
i see a few of these #'s in this thread, but the GTS core doesn't run at 620, 630, or 660Mhz.
Um, yes it does. You're going to have to emulate the driver by going to the power settings tab and putting in one of the old driver names.
Originally posted by: chizow
Originally posted by: MyStupidMouth
Originally posted by: secretanchitman
well, instead of using ntune, would coolbits work? i was reading on some other forums (i think xtremesystems) that people have found a way to re-enable the classic control panel in the 97.92 winxp forceware drivers as well as coolbits itself (ocing panel, temp monitoring panel).
You should use riva tuner.
RivaTuner doesn't work for OCs with the beta 100-series drivers. I got ATITool to work, though.
And yeah, good point on the clock straps. I'm at 621 core/1458 shader. The core idles at 55-60C but can heat up to 65-75C under load, depending on the game. Good case cooling is really important, but the stock HSF is excellent; it's easily the best stock HSF I've had on any card.
Originally posted by: CaiNaM
i see a few of these #'s in this thread, but the GTS core doesn't run at 620, 630, or 660Mhz.
I bookmarked the thread for my own use. Enjoy:
Originally posted by: zizo
I used ntune in windows xp. I read someone fixed the issue with ntune and vista but couldn't find the link again!
That's interesting. My card actually crashed when I had it set at 625/1000, so I've backed it down to 615/980 and it's fine so far. Is it really running at 615? Does the core run at a certain increment?
Originally posted by: CaiNaM
i see a few of these #'s in this thread, but the GTS core doesn't run at 620, 630, or 660Mhz.
Originally posted by: nullpointerus
I bookmarked the thread for my own use. Enjoy:
Originally posted by: zizo
I used ntune in windows xp. I read someone fixed the issue with ntune and vista but couldn't find the link again!
http://forums.anandtech.com/messageview...atid=31&threadid=2018395&enterthread=y
Actually, I did not try it. nTune was apparently causing BSODs on startup. It's never really worked well with my system board anyway, but other people really seem to like nTune, so why don't you give it a try?
Originally posted by: zizo
Thanks for the link. So did it work?
Originally posted by: nullpointerus
I bookmarked the thread for my own use. Enjoy:
Originally posted by: zizo
I used ntune in windows xp. I read someone fixed the issue with ntune and vista but couldn't find the link again!
http://forums.anandtech.com/messageview...atid=31&threadid=2018395&enterthread=y