Just volt-modded my 7900GT... performance difference not much?!?!
I'm currently benching my 7900GT. I just volt-modded the core to 1.4V and pencil-shaded the resistor for the memory voltage.
Did I do something wrong, or are these results really that minuscule?
3DMark03
7900GT 550/790 (Default) - 19828
7900GT 579/855 (Auto-detect) - 20791
7900GT 650/890 (1.4V mod) - 22276
Going from 550/790 (Default) to 650/890 is roughly an 18% increase in core clock and a 13% increase in memory clock. In 3DMark03, I got a 12% increase in score.
3DMark05
7900GT 550/790 (Default) - 9376
7900GT 579/855 (Auto-detect) - 9867
7900GT 650/890 (1.4V mod) - 10689
Going from 550/790 (Default) to 650/890 is roughly an 18% increase in core clock and a 13% increase in memory clock. In 3DMark05, I got a 14% increase in score.
3DMark06
7900GT 550/790 (Default) - 4674
7900GT 579/855 (Auto-detect) - 4870
7900GT 650/890 (1.4V mod) - 5213
7900GT 685/915 (1.4V mod) - 5257
Going from 550/790 (Default) to 685/915 is roughly a 25% increase in core clock and a 16% increase in memory clock. In 3DMark06, I got a 12% increase in score.
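If anyone wants to double-check my math, here's the quick throwaway Python calc I'm using, with only the numbers posted above plugged in (it prints the same clock-vs-score percentages I quoted):

```python
# Percent gain in clocks vs. percent gain in 3DMark score,
# default (550/790) -> highest vmod clocks tested in each run.
runs = {
    "3DMark03": ((550, 790, 19828), (650, 890, 22276)),
    "3DMark05": ((550, 790, 9376),  (650, 890, 10689)),
    "3DMark06": ((550, 790, 4674),  (685, 915, 5257)),
}

def pct(old, new):
    """Percentage increase from old to new."""
    return 100.0 * (new - old) / old

for name, ((c0, m0, s0), (c1, m1, s1)) in runs.items():
    print(f"{name}: core +{pct(c0, c1):.0f}%, mem +{pct(m0, m1):.0f}%, "
          f"score +{pct(s0, s1):.0f}%")
```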
After looking at my scores, can you guys tell me if this is bad, normal, or excellent? 'Cause I sure as hell can't tell.
Here is my test bed:
DFI NF4 SLI-DR
Opteron 144 @ 2.7GHz
G.Skill 2GB ZX @ DDR500 3-3-2-5
WD Raptor 74GB x 2 RAID-0
eVGA 7900GT CO SC w/ VF700Cu
Audigy X-Fi
OCZ GameXStream 600W
Futuremark was installed on the RAID-0 Raptors, which were defragged once last night before running the tests. All applications were shut down except MBM5 (for temperature monitoring) and anti-virus. Windows was heavily tweaked to run only essential processes. All tests were run at default settings.
Now I have some questions and concerns about the 7900GT card. Right after volt-modding the card, I used nVidia's integrated Coolbits to "detect optimal frequency", but that immediately made my computer artifact and blink. I'm guessing the memory chips on the card were overheating, so I manually increased the clocks and used the test function in Coolbits to verify. I got up to 685/915, but I think I can do a bit more. Haven't tried yet because I'm taking a break to write this post.
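(Side note, in case the Coolbits sliders ever disappear after a driver reinstall: on these ForceWare-era drivers Coolbits is just a registry DWORD. A minimal sketch of re-enabling it, assuming the classic NVTweak key; back up your registry first, since the exact value can vary by driver version:)

```python
# Re-enable the Coolbits clock controls via the registry (Windows only,
# run as admin). The key path and value 3 are the commonly cited ones
# for old ForceWare drivers -- verify for your driver version.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)
```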
Anyways, is there any way to verify that the mod was successful WITHOUT taking the card and mobo out and using a multimeter to check for 1.4V on the GPU and 2.25V on the memory? And is there any way to run this thing cooler? I have a VF700Cu with 8 RAMsinks, an additional 2 RAMsinks for the memory regulators on the back, AND a 120mm side fan blowing on part of the card. Temperatures are still 52C at idle and shoot up to the 70s at full load! I'm really surprised, because it was only 41C idle/55C load on my non-modded, non-OCed 7900GT w/ VF700Cu without the 10 RAMsinks.
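For anyone curious what the 1.4V/2.25V targets mean electrically, this is the generic feedback-divider math behind a pencil mod. It's a rough sketch only: Vref and the resistor values are made up for illustration and are NOT the actual parts on a 7900GT, so check your card's regulator datasheet before trusting any numbers:

```python
# Generic adjustable-regulator feedback math: Vout = Vref * (1 + R_fb / R_gnd).
# Pencil-shading the ground-side resistor lowers R_gnd, which raises Vout.
# All values below are hypothetical, not measured from a 7900GT.

def vout(vref, r_fb, r_gnd):
    """Output voltage of a standard feedback divider."""
    return vref * (1 + r_fb / r_gnd)

def r_gnd_for(vref, r_fb, v_target):
    """Ground-side resistance needed to hit a target output voltage."""
    return r_fb * vref / (v_target - vref)

print(vout(0.8, 1000, 2000))      # hypothetical stock setting: 1.2 V
print(r_gnd_for(0.8, 1000, 1.4))  # shade down to ~1333 ohm for 1.4 V
```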