So, finally getting around to OC'ing my 8800GT/9800GT

Dec 30, 2004
12,553
2
76
1). Which should I favor, core or shader? Decreasing one lets me increase the other a bit.

2). In googling for this, I ran into a couple of people saying that running the memory at 1000 MHz (even though it is rated for 1000) can eventually break it (perhaps because there are no ramsinks). Is this so? Any way to tell my memory temps?
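(A side note on the numbers thrown around in these threads: the "1000 MHz" rating and the "2000 MHz" people argue about are the same clock counted two ways, since GDDR3 transfers data on both clock edges. A quick sketch, assuming the 8800GT's 256-bit memory bus:

```python
def gddr3_effective_mhz(real_clock_mhz):
    # GDDR3 is double data rate: two transfers per clock cycle,
    # so a 1000 MHz real clock is quoted as "2000 MHz effective"
    return real_clock_mhz * 2

def bandwidth_gbps(real_clock_mhz, bus_width_bits=256):
    # bytes/s = effective transfers/s * bus width in bytes
    return gddr3_effective_mhz(real_clock_mhz) * 1e6 * (bus_width_bits // 8) / 1e9

print(gddr3_effective_mhz(1000))  # 2000 -- the figure in the JonnyGURU debate
print(bandwidth_gbps(1000))       # 64.0 GB/s on a 256-bit bus
```

So "memory at 1000" and "memory going over 2000" describe the same overclock.)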

Thanks in advance.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I believe JonnyGURU was the one who started that topic. It wasn't disproven, but it wasn't proven either. Because it became such a heated argument, JonnyGURU recanted and basically said he washes his hands of it if anyone fries their memory controller as a result of running the memory over 2000 MHz.

Anyway, I haven't really seen any reports of 8800GTs failing, and I know that many people have clocked well past 2000 MHz on the memory, but again, there really isn't proof one way or the other.

If you are worried about it, then I wouldn't clock it over 2000 MHz... If you are not worried, go ahead and have at it. I have my 8800GTS 512MB clocked at 2150; though it is a different card, the memory clocks were very similar at stock.

In general, increasing both the core and shader is helpful... I wouldn't favor one over the other, to be honest, because they are linked. However, if you find that one is holding the other back, you can unlink them and increase them individually until you find the maximum stable clock with no artifacting or lockups.
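The "step it up until it artifacts, then back off" procedure above can be sketched as a simple loop. This is just an illustration of the manual process, not a real overclocking tool; `is_stable` is a hypothetical stand-in for whatever test you run at each clock (e.g. an ATITool artifact scan plus some actual gaming):

```python
def max_stable_clock(start_mhz, step_mhz, is_stable):
    """Raise a clock in fixed steps until the stability check fails,
    then return the last clock that passed."""
    clock = start_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Toy stand-in: pretend artifacts appear above 740 MHz on the core
print(max_stable_clock(600, 10, lambda mhz: mhz <= 740))  # 740
```

With core and shader unlinked, you'd run this once per clock domain, holding the other at stock so you know which one is actually failing.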
 
Dec 30, 2004
12,553
2
76
Originally posted by: ArchAngel777
I believe JonnyGURU was the one who started that topic. It wasn't disproven, but it wasn't proven either. Because it became such a heated argument, JonnyGURU recanted and basically said he washes his hands of it if anyone fries their memory controller as a result of running the memory over 2000 MHz.

Anyway, I haven't really seen any reports of 8800GTs failing, and I know that many people have clocked well past 2000 MHz on the memory, but again, there really isn't proof one way or the other.

If you are worried about it, then I wouldn't clock it over 2000 MHz... If you are not worried, go ahead and have at it. I have my 8800GTS 512MB clocked at 2150; though it is a different card, the memory clocks were very similar at stock.

In general, increasing both the core and shader is helpful... I wouldn't favor one over the other, to be honest, because they are linked. However, if you find that one is holding the other back, you can unlink them and increase them individually until you find the maximum stable clock with no artifacting or lockups.

OK, thanks. Yeah, those posts about 2000 MHz being bad were way back from Dec '07. Haven't heard much about 8800s failing.

BTW - how is that Q6600 doing on your IP35-E? Any vdroop problems (vdroop getting worse)? Gillbot swears the IP35-E's 4-phase power can't handle quads (especially a 65nm one).
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
I had a Gigabyte 8800 GT, bought back in November 2007, and it behaved weirdly when the memory was overclocked. For example, I managed to pass some 300 minutes with the ATITool artifact scanner with the memory at 2100 MHz, but after 10 minutes of playing any game the screen would turn green or red and the computer would freeze. This happened at anything down to 1970 MHz or so; keeping it a bit lower than that, and the problem was gone. For me, what JonnyGURU said was real. But I have the feeling this applied only to GTs with Qimonda RAM chips, the first cards released on the market.

Anyway, you'll get more performance out of your core and shader than out of the VRAM. Memory frequency only starts to become more important once you exceed 800 MHz on the core, but that happens only with a volt mod. ;)