Question from a Half-Life junkie but a tech newbie.
I recently returned a BFG 6800 GT OC I bought two months ago after it croaked. Before it died it got a 3DMark03 score of 10500 with the 66.81 drivers. I just got my replacement and it seems to work fine; however, I'm getting fluctuating test results from 3DMark03.
I've been changing the AGP aperture in the BIOS to see the difference between 64MB, 128MB, 256MB, and 512MB. I just installed the 67.02 drivers, and yes, I used Driver Cleaner. I have a 430-watt power supply.
I just ran it twice at 512MB: the first time I got 10343, and the second 10308. When I drop it down to 256MB it scores 10367. Is this much variance normal?
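For what it's worth, the spread across the three runs quoted above can be quantified with a quick sketch like this (scores taken straight from the post):

```python
# Quick check of run-to-run variance using the 3DMark03 scores quoted above.
scores = [10343, 10308, 10367]

mean = sum(scores) / len(scores)
spread_pct = (max(scores) - min(scores)) / mean * 100

print(f"mean score: {mean:.0f}")          # average of the three runs
print(f"run-to-run spread: {spread_pct:.2f}%")  # worst-case difference as a percentage
```

That works out to a spread of well under 1%, which is generally considered normal benchmark noise from background processes and driver variation rather than a sign of a faulty card.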
Another question: the 3DMark details page lists my VGA Memory Clock as 11.6 MHz and my VGA Core Clock as 5.7 MHz. Is this correct?
I called the tech guy at BFG, and he had me enable Coolbits in the nVidia control panel, which showed my core clock frequency as 370 MHz and my memory clock as 1.00 GHz. These are the default settings.
Shouldn't the 3DMark details read the same way? Before he had me enable Coolbits, the tech told me he has the same video card and his 3DMark details reported a 370 MHz clock frequency.
The card seems to work well in HL2, with minimal stuttering at 1280x1024 with 4xAA and 8xAF, running at 35-75 fps.
I'm curious and would appreciate any advice.
