8800GTS 640 vs. HD4830 (power draw, 3dmark06)

May 5, 2006
I just swapped out my 8800GTS 640MB card for a single 4830 and ran a couple of tests to see what kind of improvement it got me. What surprised me most was the power consumption of the Nvidia card versus the new ATI (both at factory stock settings), given that they offer somewhat similar performance:

Power Draw (kill-a-watt)
8800 GTS system (idle/load): 135w/~220w
4830 system (idle/load): 91w/~175w

3DMark06 (4830/8800GTS)
3DMark Score 11184/10348 = + 8.1%
SM 2.0 Score 4477/4327 = + 3.5%
SM 3.0 Score 5108/4352 = + 17.4%
CPU Score 3249/3279 = - 1%
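
For anyone wanting to re-check the deltas, here's a quick sketch in Python (the score pairs are copied straight from the table above; nothing else is measured):

```python
# Percent change of the HD4830 score relative to the 8800GTS score.
def pct_change(new, old):
    return (new - old) / old * 100

# (HD4830, 8800GTS) score pairs from the table above.
scores = {
    "3DMark": (11184, 10348),
    "SM 2.0": (4477, 4327),
    "SM 3.0": (5108, 4352),
    "CPU":    (3249, 3279),
}

for name, (hd4830, gts) in scores.items():
    print(f"{name}: {pct_change(hd4830, gts):+.1f}%")
# prints +8.1%, +3.5%, +17.4%, -0.9% respectively
```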

Thought I would post it up in case someone else finds this info useful. The 8800GTS hit 80C during 3DMark, while the 4830 never got hotter than 52C.

Of course, now the overclocking begins. :) The ATI overclock utility set my card at 690/1190 after running through its tests, but that was in no way stable. I backed it down to 690/1000 and I'm inching my way up from there. Already planning on sticking another 4830 in there...

The rest of my system:
Gigabyte EP45-UD3P
E8400 E0 @ 3.5GHz (stock cooling)
4GB Corsair Dominator
PCP&C 750W
Thermaltake Tsunami
 

SunnyD

Belgian Waffler
Jan 2, 2001
Funny how the CPU score would go down. One tends to wonder if the driver has anything to do with that (taking up more CPU time).
 

AmberClad

Diamond Member
Jul 23, 2005
Originally posted by: SunnyD
Funny how the CPU score would go down. One tends to wonder if the driver has anything to do with that (taking up more CPU time).
It went down by one percent. Margin of error, and all that. You're not going to get the exact same score between runs, even with identical hardware.
 
May 5, 2006
I seem to get wildly different CPU scores, especially on the CPU1 - Red Valley part of 3DMark06.

When I ran the 4830 at 690/1000 (CPU still at the same 3.5GHz), I got a 3019, or 8% lower than the 8800GTS 3279 CPU score. Maybe this is still within the margin of error, but anything more than a couple percentage points makes me wonder. I'll have to run some more tests tonight...
 

Tempered81

Diamond Member
Jan 29, 2007
Originally posted by: harlanpepper
(snip: full power-draw / 3DMark06 comparison quoted from the first post)

4830 Crossfire is incredibly fast. And they overclock like beasts (15% framerate increase).

"We managed to push the dual GPUs' core right up to 708MHz, from the default 575MHz, representing a 23 per cent overclock. The memory, however, refused to budge an inch without causing visual corruption. The elevated frequencies returned an average framerate of 90.6fps in our Enemy Territory: Quake Wars test, up from the default 78.6fps - a nice, healthy increase."
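
The quote's percentages check out; here's the arithmetic as a quick sketch (values taken straight from the Hexus quote above):

```python
def pct_gain(new, base):
    return (new - base) / base * 100

core = pct_gain(708, 575)    # core clock: 575 -> 708 MHz
fps = pct_gain(90.6, 78.6)   # ET:QW average: 78.6 -> 90.6 fps
print(f"core overclock: {core:.0f}%")   # 23%
print(f"framerate gain: {fps:.0f}%")    # 15%
```

A 23% core overclock buying only a 15% framerate gain suggests the cards aren't purely core-limited in that test, which fits with the memory refusing to budge.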

Stock HD4830s taking out a GTX 280 OC edition 1GB (even at 2560 with 4xAA on a 30"):
http://www.hexus.net/content/item.php?item=16300&page=6

Overclocked, they should be faster than a GTX 285 SSC all the way up.

 
May 5, 2006
I hope to find some time to do some more comparisons between the 8800 & 4830. I should have run some more benchmarks before I took the 8800 out, because now I don't want to put it back in :) Having too much fun with the new card.

I stumbled on that Crossfire 4830 Hexus review and couldn't believe how strong dual 4830s are. And those cards weren't even overclocked! It also suggests that maybe I should pick up COD4.

Crossfire/SLI never really made much sense to me until now.

Here are the before/after overclock 3DMark06 results on my HD4830 (stock 575/900 vs. 690/1000):

3DMark06 HD4830 (stock/overclocked)
3DMark Score 11184/11883 = + 6.3%
SM 2.0 Score 4477/4889 = + 9.2%
SM 3.0 Score 5108/5689 = + 11.3%
CPU Score 3249/3019 = - 7.1%

No idea what's going on with the CPU score drop, but some nice gains otherwise for minimal effort.

Edit: I initially posted 640/1000 scores above by accident... adjusted and re-ran @ 690/1000 for a slightly better result. The extra 50MHz helps a lot:

3DMark06 HD4830 (stock/overclocked)
3DMark Score 11184/12613 = + 12.8%
SM 2.0 Score 4477/5218 = + 16.6%
SM 3.0 Score 5108/5956 = + 16.6%
CPU Score 3249/3256 = + 0.2%
 
Dec 24, 2008
That's nice. I'm also running HD 4830s at 16x10, and they nicely max out any game I play, even the pretty demanding ones. Grab that other 4830 quickly; you won't regret it.
Just wondering, what brand are you using? And your E8400 does way better than my Phenom.
 
May 5, 2006
My card is a Sapphire. I suspect I will be getting a second, though I'm wondering if they might drop the price significantly in the near-ish future? That's probably asking a lot... it's already dirt cheap for what it can do @ $85.

I ended up posting the wrong scores above. I thought it was at 690/1000, but it was really 640/1000. After I re-ran it at 690/1000, I got nice improvements in SM scores... both 16.6% over stock. E8400 overclock is up next...
 
Apr 20, 2008
I'll be getting a 4830 in the mail tomorrow. I, too, bought the Sapphire 4830 from Newegg.

3DMark06 is pretty much a bummer when you compare systems, though. I get 10700 and change with my current system, despite your 8800GTS being a bit better than my 3850.
 

kY

Senior member
Feb 21, 2003
Great topic ... just saw the 4830 for $69AR today and was wondering if I could replace my old 640MB 8800GTS with something more modern, cooler, and less power-hungry.

Have you noticed any significant perf increases when 3D gaming?

I play mainly BF2/2142 on my 8800GTS gaming PC. I reserve some slightly higher iron for Fallout 3.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
I think this is more of a testament to how efficient the HD4830 is.

Also, you should post some real game tests, or at least the framerates you get in the 3DMark tests. For example, my 3DMark06 scores are 9888/4409/4799/2199. Obviously my CPU score is much lower than yours, and my framerates are: Return to Proxycon - 34.8, Firefly Forest - 38.68, Canyon Flight - 43.82, Deep Freeze - 52.17.
 

toyota

Lifer
Apr 15, 2001
Originally posted by: cusideabelincoln
I think this is more of a testament to how efficient the HD4830 is. (snip)

It's not really all that efficient. I have a similar system to the OP's with a stronger, overclocked GTX260, and I idle at 81 watts and have never seen it get over 185 watts during gaming. Of course, his slight CPU OC might account for a few extra watts, though.
 

toyota

Lifer
Apr 15, 2001
Originally posted by: cusideabelincoln
Similar is not the same, now is it? You have different power supplies, rated at different watts, with probably different efficiency curves.

See here: http://techreport.com/articles.x/15752/8

What does that have to do with the wattage readings we are getting, since both of ours would be considered "at the wall"? Or am I missing something?
 

toyota

Lifer
Apr 15, 2001
Originally posted by: allies
Efficiency of your power supply....

But why does that matter if we are comparing two readings taken at the wall while discussing a video card's power usage?
 

vj8usa

Senior member
Dec 19, 2005
Originally posted by: toyota
Originally posted by: allies
Efficiency of your power supply....

but why does that matter if we are comparing two ratings at the wall while discussing a video cards power usage?

A less efficient PSU will draw more power from the wall than a more efficient one when powering the exact same components.

Efficiency is the ratio of the power the PSU outputs to the power it draws from the wall (if it's drawing 400W from the wall and outputting 320W, it's 80% efficient).
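
To put numbers on that definition, a minimal sketch (the 400W/320W pair is the example from the post above; the 75% unit is just a made-up comparison point):

```python
# Efficiency = DC power delivered to components / AC power drawn at the wall.
def efficiency(dc_out_w, ac_in_w):
    return dc_out_w / ac_in_w

print(f"{efficiency(320, 400):.0%}")  # 80%

# Same 320W component load on two hypothetical units:
# the less efficient one pulls more from the wall.
for eff in (0.80, 0.75):
    print(f"{eff:.0%} efficient -> {320 / eff:.0f}W at the wall")
# 80% -> 400W, 75% -> 427W
```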
 

cusideabelincoln

Diamond Member
Aug 3, 2008
As mentioned, "at the wall" is exactly where power supply efficiency makes a difference. A power supply always draws more wattage at the wall than the system components actually need. Running the exact same components, power supplies of different efficiencies will draw different amounts at the wall while delivering the same power to the components.

His power supply is rated for 750W, while yours is rated for 650W. I don't have an efficiency curve handy, but power supplies are generally most efficient around 50% load and less efficient at lower loads. So if his system components only need 140W and he's drawing 175W at the wall, his power supply is 80% efficient at a roughly 18% load (140W out of 750W). Meanwhile, 140W out of 650W is about a 22% load, and your power supply is probably more efficient at that load than his 750W unit is at 18%. And IIRC, Antec EarthWatts PSUs are definitely more efficient at lighter loads than at heavier ones. Since it's been shown that the GTX260 needs more power than the HD4830, your "comparable" system would, in my hypothetical, need 190W at load. That's a 29% load on the Antec 650, and it should be more efficient at that load than the 750W PCP&C is at an 18% load.

And your components and overclocks probably aren't the same, which will make power consumption numbers differ as well.
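
For what it's worth, the load fractions in that hypothetical work out as stated; a quick sketch (all wattages are the hypothetical figures from the post above, not measurements):

```python
# Load fraction = power the components draw / PSU rated output.
def load_fraction(draw_w, rated_w):
    return draw_w / rated_w * 100

cases = [
    ("140W components on the 750W PCP&C", 140, 750),
    ("140W components on the 650W Antec", 140, 650),
    ("190W components on the 650W Antec", 190, 650),
]
for label, draw, rated in cases:
    print(f"{label}: {load_fraction(draw, rated):.1f}% load")
# roughly 18.7%, 21.5%, and 29.2% respectively
```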