Why is my 2GB XFX HD6950 so hot?

Page 2 - AnandTech Forums

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Are there any aftermarket (non liquid) coolers for the 6950 that are more quiet than the stock cooler AND take up the same amount of space or less than the stock cooler?
No, but there is the Accelero Xtreme Plus, which will fit on it.

EDIT: You would need about 0.5 to 1 inch more clearance to use that cooler. Alternatively, there is the Thermalright Shaman, but it may be too wide for your case.
 

Jovec

Senior member
Feb 24, 2008
579
2
81
I stumbled on this thread and decided to do some super accurate testing (/sarcasm) with my Kill-A-Watt. I run two monitors, 16x10 and 12x10. The second monitor I only use for watching a movie or TV, and it is usually turned off. Note that turning the 2nd monitor off does not reduce GPU power draw; the monitor has to be physically unplugged from the video card. I also tried running the 16x10 monitor at 12x10 (with no GPU scaling), but this still resulted in 450/1250 clocks.

1090t, 6950 2gb, 8GB, 890GX, everything at stock:

Two monitors:
---
At 3.2GHz, total system draw was 137 watts idle. GPU ran at 450/1250
At 800MHz (CnQ), total draw was 132 watts. GPU at 450/1250

1 monitor:
---
At 3.2GHz, total system draw was 104 watts idle. GPU ran at 250/150
At 800MHz (CnQ), total draw was 99 watts. GPU at 250/150

So, going from 1.3v Vcore to 1.225v Vcore (CnQ) is worth about 5 watts. As an aside, running an x4 instead of an x6 would be worth another 15-20 watts of savings with CnQ, since Vcore drops to 1.0v on Phenom II quads.

Going from 2 monitors connected (at different resolutions) to 1 is worth about 33 watts. Put another way, going from 450/1250 to 250/150 clocks is worth 33 watts. At 24/7/365 usage and $0.12/kWh, those 33 extra watts cost roughly $35 a year. If we assume S3 sleep and time spent above idle clocks (when the wattage would be the same regardless) account for 50% of the time, that translates into about $17 a year.
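The dollar figures above can be reproduced with a quick back-of-the-envelope calculation (a sketch; the 33 W delta, $0.12/kWh rate, and 50% idle-time assumption are taken from the post):

```python
# Annual electricity cost of the extra idle draw with two monitors attached
EXTRA_WATTS = 33            # extra system draw measured with 2 monitors
RATE_PER_KWH = 0.12         # assumed electricity rate, $/kWh
HOURS_PER_YEAR = 24 * 365   # 8760 hours

kwh_per_year = EXTRA_WATTS * HOURS_PER_YEAR / 1000
cost_24_7 = kwh_per_year * RATE_PER_KWH       # if the PC idled 24/7/365
cost_50pct = cost_24_7 * 0.5                  # assume idle clocks only 50% of the time

print(f"24/7/365:       ${cost_24_7:.2f}/yr")   # ~ $34.69
print(f"50% duty cycle: ${cost_50pct:.2f}/yr")  # ~ $17.34
```

This matches the rough $35 and $17 per year figures above.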

The reason I was interested in this is as follows.

At some point last year, something changed, either in ATI's UVD drivers or in Microsoft's Media Foundation codecs/Media Center. I used to be able to play a game and watch TV simultaneously. Something changed, and now when trying to do both at once, my (at the time) 5850 would run at 400MHz UVD clocks instead of 725MHz 3D clocks, making both the game and the TV stutter horribly. My 5770 and 6950 exhibit this same behavior, as I assume all AMD cards do.

Since I don't do eyefinity gaming, this thread got me wondering about adding a second video card solely to power the second monitor. This should allow both cards to run at the lowest idle clocks for a net savings of about 20w if we assume about 10w idle for low-end 5 series cards. This should also allow full 3D clocks on the main card and UVD clocks on the 2nd card. I suppose I could try using the IGP too, but I worry about OC potential with the IGP enabled. From purely a cost savings standpoint, it might take around 2.5 years to recoup the cost of a 5450 via reduced electricity costs. Arguably not worth it, but two cards will allow for proper 3d clocks with UVD clocks on the second monitor/card.
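The ~2.5-year payback estimate can be sanity-checked the same way (a sketch; the 20 W savings and 50% idle fraction are from the post, while the ~$25 card price is an assumed figure, not stated above):

```python
# Payback time for adding a second low-end card to drive the 2nd monitor
SAVED_WATTS = 20          # estimated net idle savings from the post
RATE_PER_KWH = 0.12       # assumed electricity rate, $/kWh
IDLE_FRACTION = 0.5       # share of time actually sitting at idle clocks
CARD_PRICE = 25.0         # assumed street price of an HD 5450 (hypothetical)

yearly_savings = SAVED_WATTS * 24 * 365 / 1000 * RATE_PER_KWH * IDLE_FRACTION
payback_years = CARD_PRICE / yearly_savings

print(f"~${yearly_savings:.2f}/yr saved, payback in ~{payback_years:.1f} years")
```

With those assumptions the payback comes out to roughly 2.4 years, consistent with the "around 2.5 years" estimate above.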

Edit

Main monitor on 6950, 2nd on 4290 IGP:
---
At 3.2GHz, total system draw was 105 watts idle. 6950 at 450/1250, IGP at ~483/666
At 800MHz (CnQ), total draw was 100 watts. 6950 at 450/1250, IGP at ~483/666

Will run my current OC and update with any stability issues.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Since I don't do eyefinity gaming, this thread got me wondering about adding a second video card solely to power the second monitor. This should allow both cards to run at the lowest idle clocks for a net savings of about 20w if we assume about 10w idle for low-end 5 series cards. This should also allow full 3D clocks on the main card and UVD clocks on the 2nd card. I suppose I could try using the IGP too, but I worry about OC potential with the IGP enabled. From purely a cost savings standpoint, it might take around 2.5 years to recoup the cost of a 5450 via reduced electricity costs. Arguably not worth it, but two cards will allow for proper 3d clocks with UVD clocks on the second monitor/card.

Can you use MSI Afterburner to set your clockspeeds? (Not all cards can be clocked by Afterburner.)

If so, why not just make an Afterburner profile that forces the clocks up, assign a hotkey to it, and activate that profile whenever you know you need the extra juice?