9600GT Doesn't Run 3DMark06's CPU Test?

geokilla

Platinum Member
Oct 14, 2006
2,012
3
81
Hi guys. I'm trying to find my max overclock for my 9600GT and I'm using FurMark and 3DMark06. I have two questions.

I thought that the new NVIDIA drivers would let my 9600GT run the CPU test in 3DMark06 with little effort thanks to CUDA and PhysX. I'm using the 177.92 drivers right now.

How can I tell whether my graphics card is FurMark stable? The only way I can tell right now is if I see black lines on the screen or the test stops by itself.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
The CPU test in 3DMark06 runs only on the CPU; it will not be processed by the video card, with or without PhysX.

For stability testing, it's better to use ATITool. FurMark is more intensive and can give you crashes or hard locks instead of simple artifacts. It's best to avoid it until you've reached some sort of stability with ATITool, say an hour or so of the artifact test.
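One thing that can help with either tool is keeping a simple log of temperature and clocks for the length of the run, so a crash or hard lock can at least be matched against what the card was doing. Below is a minimal sketch of that idea in Python; it assumes a system whose driver ships nvidia-smi with --query-gpu support (era-appropriate alternatives would be GPU-Z or RivaTuner logging), and the file name is just a placeholder.

# Minimal sketch: log GPU temperature and clocks once a second while a
# stress test (ATITool artifact scan, FurMark, etc.) runs in another window.
# Assumes nvidia-smi with --query-gpu support; the CSV path is arbitrary.
import csv
import subprocess
import time

def sample():
    # temperature.gpu, clocks.gr and clocks.mem are standard nvidia-smi query fields
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,clocks.gr,clocks.mem",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return [int(x) for x in out.split(", ")]

def log_run(minutes=60, path="stress_log.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "temp_c", "core_mhz", "mem_mhz"])
        start = time.time()
        while time.time() - start < minutes * 60:
            writer.writerow([round(time.time() - start)] + sample())
            f.flush()
            time.sleep(1)

if __name__ == "__main__":
    log_run()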
 

VirtualLarry

No Lifer
Aug 25, 2001
56,583
10,224
126
Then again, FurMark can run just fine even when the overclock is unstable. I had several video freezes running 3DMark01, while FurMark kept running continuously without a problem.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
My card artifacts at around 840-850 on the core. Factory OC is 750, and my clocks are 800/std/std.
 

geokilla

Platinum Member
Oct 14, 2006
2,012
3
81
In terms of folding, the shaders are the most critical part of the graphics card. What about gaming? Before, I would just overclock the core and memory clocks, but now that the shader clock can be adjusted separately, would the order of importance be something like this?

Core > Shaders > Memory
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: geokilla
In terms of folding, the shaders are the most critical part of the graphics card. What about gaming? Before, I would just overclock the core and memory clocks, but now that the shader clock can be adjusted separately, would the order of importance be something like this?

Core > Shaders > Memory

Yeah. The core and shader clocks are more important than the memory clock. Although you'll reach a critical point on the core/shaders where the memory frequency becomes very important; that's around 800 MHz on an 8800 GT, and probably about the same for a 9600 GT.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Memory clocks are just as important as shader clocks right now, if not more so.

Just look at the 9600 GSO, with 96 SPs and a 192-bit bus, getting beaten by the 9600 GT with 64 SPs and a 256-bit bus.

It's more like Core > Memory > Shader.

You need memory bandwidth to utilize the core properly.
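A quick back-of-the-envelope bandwidth comparison illustrates the point; the memory clocks below are the reference values as I recall them, so treat the exact figures as approximate.

# Rough memory-bandwidth comparison (reference clocks assumed).
# GDDR3 bandwidth in GB/s = (bus width in bits / 8) * effective clock in GHz
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000

# 9600 GSO: 96 SPs but a 192-bit bus at ~1600 MHz effective
# 9600 GT:  64 SPs with a 256-bit bus at ~1800 MHz effective
print(bandwidth_gb_s(192, 1600))  # ~38.4 GB/s
print(bandwidth_gb_s(256, 1800))  # ~57.6 GB/s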
 

geokilla

Platinum Member
Oct 14, 2006
2,012
3
81
Sorry for the dead thread revival, but I couldn't find this other thread I posted in before.

I'm using ATITool 0.26 to scan for artifacts, and it can't seem to detect much about my 9600GT other than the core and shader speeds and the temperature. Is it supposed to be like this, since ATITool hasn't been updated in over a year?

Edit: I'm seeing yellow dots on the scanner, which look like artifacts. However, it's still saying that there are no errors...
 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: geokilla
I thought that the new NVIDIA drivers would let my 9600GT run the CPU test in 3DMark06 with little effort thanks to CUDA and PhysX. I'm using the 177.92 drivers right now.

Latest version is 178.24.

3DMark Vantage does the PhysX test.

Originally posted by: geokilla
Edit: I'm seeing yellow dots on the scanner, which look like artifacts. However, it's still saying that there are no errors...

Yellow dots = artifacts