Fixed: Yeah, not happy with this HD6950 (excessive heat with multi-monitor problem)

Status
Not open for further replies.

SirGCal · Member · Joined May 11, 2005 · www.sirgcal.com
Here's a bit more on how this might work from an insider's point of view...

Say I'm sorting a wafer of CPUs, for example (probably the easiest case to understand), and say the wafer is intended to be sold as Phenom II 970s. Now say, just for example, 15% of that wafer tests as "Bin A". That might mean they passed without any problems at the full rated and intended speed. Another 30% might pass as "Bin B", meaning they passed after a small repair of some kind was made to the die itself. Perfectly functional, but not the truest form of 'best'. Another bunch might be "Bin C", which might mean one core was disabled and they now pass as three-core parts. Or there might be another bin for chips that didn't pass at full speed but did pass 200MHz slower; those would be sold as 960s (or whatever)...

All from the exact same product wafer. Now "Company A" pays a premium to receive only "Bin A" products, period. "Company B" will accept any product and pays less, but rarely, if ever, gets "Bin A" parts, since the premium companies are fed first. "Company C" might actually take "Bin C" parts and try to repair them further to get all the cores running again for a real budget build (although that particular tweak may not actually meet the reference requirements; it's just an example of what some budget companies do).

The resulting difference is that Companies A, B, and C can all build the exact same system, to the letter. But the chances of the best, smoothest, and most reliable operation fall to Company A. Company B tends to feed the mainstream; parts might not overclock as well, or may run a bit louder or hotter, etc., even though they are the exact same parts. Company C might be the sweetest bang for the buck out there, but reliability might be really horrible; getting a 'good product' from them can be real hit and miss. Even so, the bulk of their products may work without a hitch for most general users.
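The sorting logic described above can be sketched roughly in code. To be clear, the bin criteria, clock speeds, and thresholds below are invented purely to illustrate the idea, not real AMD sorting rules:

```python
# Rough sketch of the wafer-binning idea. All criteria and numbers
# here are hypothetical illustrations, not actual sorting rules.

def bin_die(passes_full_speed, needed_repair, working_cores, max_stable_mhz):
    """Classify one die from the wafer into a quality bin."""
    if working_cores < 4:
        return "C"              # a core disabled; sell as a tri-core part
    if passes_full_speed and not needed_repair:
        return "A"              # flawless at the full rated speed
    if passes_full_speed:
        return "B"              # passed after a minor on-die repair
    if max_stable_mhz >= 3400:  # e.g. 200 MHz under a rated 3600
        return "downclock"      # sell as the slower 960-class part
    return "reject"

# A tiny "wafer" of four test results:
wafer = [
    bin_die(True, False, 4, 3600),   # -> "A"
    bin_die(True, True, 4, 3600),    # -> "B"
    bin_die(False, False, 4, 3400),  # -> "downclock"
    bin_die(True, False, 3, 3600),   # -> "C"
]
print(wafer)
```

From there, "Company A" simply takes everything classified "A", and the lower tiers pick from what remains.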

I hope that helps it make sense, because I really don't know how to explain it much better without going into specific trade information, which I cannot do.
 

SirGCal · Member · Joined May 11, 2005 · www.sirgcal.com
Here's another quick point directly related to some of you: the specs for the reference setup might say, for example, that you have to use a certain type of fan spinning at a certain speed. However, the exact same type of fan from different manufacturers can vary greatly in audible noise, turbulence, and even CFM capability. Or the specs might instead call for a fan capable of moving a specific amount of CFM, so one company might buy a fan that can do that while running slower and quieter. But they can NOT say, for example, "You must purchase the 'turbo-plus-super-fan' from manufacturer 'Fastfans' to use on your reference products." The only exceptions are the parts they make themselves (board partners must, obviously, purchase the Cayman chips from them for their cards), but that's a no-brainer.
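The spec-by-capability idea above amounts to a simple selection rule: any fan meeting the required CFM qualifies, and the board partner is free to pick the quietest (or cheapest) one. The fan names and numbers below are invented for illustration:

```python
# Hypothetical fan catalog: (name, CFM, noise in dBA). All values made up.
FANS = [
    ("Fastfans Turbo-Plus", 60, 38.0),
    ("QuietCo Whisper", 55, 24.5),
    ("BudgetAir Basic", 48, 30.0),
]

REQUIRED_CFM = 50  # what a capability-based reference spec might demand

# Any fan meeting the CFM spec is allowed; this partner picks the quietest.
qualifying = [f for f in FANS if f[1] >= REQUIRED_CFM]
choice = min(qualifying, key=lambda f: f[2])
print(choice[0])  # -> QuietCo Whisper
```

Two cards can both be "reference spec" this way while sounding completely different, which is exactly the point being made above.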
 

c_g_f · Junior Member · Joined Feb 6, 2011
Right, very quick follow-up, then I have to go.

Cat 10.12a hotfix linked earlier in the thread.

Ticked enable overdrive.

Installed Afterburner 2.1.0 beta 5.

Leave the core clock alone; dial down the memory as low as you can go without corruption (mine's 675). This automatically drags the core clock down to 250 (that last part is where I was going wrong earlier).

Set up your auto 2D/3D profiles in Afterburner, and set up a fan profile that cools but doesn't annoy you too much.

You will have to mess around with the RivaTuner Statistics Server to stop some apps being falsely flagged as 3D when they aren't. (Click the 'i' in the upper right of Afterburner and check the active 3D process line.)

Good luck.
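The fan-profile step above boils down to drawing a temperature-to-fan-speed curve, with speeds interpolated between the points you set. A minimal sketch of that idea (the breakpoints below are made-up examples, not Afterburner's actual defaults or recommended values):

```python
# Sketch of a custom fan curve like the one Afterburner lets you draw.
# Breakpoints are invented examples: (temperature in °C, fan speed in %).
CURVE = [
    (40, 30),
    (60, 45),
    (75, 70),
    (90, 100),
]

def fan_speed(temp_c):
    """Linearly interpolate fan speed between the curve's points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # floor: quiet minimum speed
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]             # ceiling: full speed past the last point

print(fan_speed(50))  # halfway between 40°C and 60°C -> 37.5 (%)
```

A shallower slope at low temperatures is what keeps the card quiet at idle while still ramping hard before it hits its thermal limit.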

Hi all, I've just registered to tell you that by dialing down the memory to the minimum (625) AND setting max core clocks to 500, you can run multiple monitors just fine (memory sits at 625 and core at 250; running 3 monitors in extended mode as I write). Thought I'd share this with you, cheers!
 

WelshBloke · Lifer · Joined Jan 12, 2005
Hi all, I've just registered to tell you that by dialing down the memory to the minimum (625) AND setting max core clocks to 500, you can run multiple monitors just fine (memory sits at 625 and core at 250; running 3 monitors in extended mode as I write). Thought I'd share this with you, cheers!


You don't need to touch the core clock slider (in Afterburner); your clocks will automatically scale down with your memory. (I don't know if this is a bug in AB, but it's handy.)

Also, watch out for display corruption if you dial down too far. My second display gets garbled below about 670 on the memory.

I can't get AB to let me undervolt at idle, though; the card will only idle at 1V. I can only change the volts when the 3D setting kicks in.


Oh, and don't try to run 3D settings at 0.9V :$
 

c_g_f · Junior Member · Joined Feb 6, 2011
You don't need to touch the core clock slider (in Afterburner); your clocks will automatically scale down with your memory. (I don't know if this is a bug in AB, but it's handy.)

Also, watch out for display corruption if you dial down too far. My second display gets garbled below about 670 on the memory.

I can't get AB to let me undervolt at idle, though; the card will only idle at 1V. I can only change the volts when the 3D setting kicks in.


Oh, and don't try to run 3D settings at 0.9V :$

I haven't touched the volts at all. First I tried setting memory clocks to the minimum (625) without touching core clocks and got screen corruption as you said, so I put memory clocks at 675. But then I tried putting memory clocks at minimum while also dialing down core clocks. With memory at minimum, the lowest core clock AB let me choose was 500, but once you apply the setting, core clocks actually sit at 250 (vs. 450 default), and that's just cool. It's still hotter than running just one display, but not as hot as default.
 

mscrivo · Member · Joined Mar 22, 2007
Just wanted to echo the sentiments of the OP. I just picked up a 6950 and was completely surprised by how loud it was just idling on the desktop. My temp was 61 degrees! When I disabled the second monitor, it dropped down to 44 degrees eventually. Even using MSI afterburner, the best I can get is 250/625 clocks and 55 degrees. That's a far cry from 44 and the fan is still pretty audible at idle. I can't believe the card needs to do that much extra work just to drive a second monitor at the same resolution.
 

ooShinkooo · Junior Member · Joined Jul 1, 2011
Guys, these cards are at the upper end of ATI's range of graphics cards. Power efficiency, heat, and noise are going to come secondary to performance. If you're interested in cooler-running cards, look into the 68xx series.

Also, these cards can go up to a maximum temperature of 90°C. If high temps are that much of a problem, my recommendations are:
1. Invest in a bigger case with better airflow (a HAF 932, for example), or
2. Build a water loop.

I have two 6950s powering 3 monitors at idle, or 1 monitor running Crysis 2 in CrossFire mode, and I don't seem to have the temperature problem you guys are referring to.
 

Makaveli · Diamond Member · Joined Feb 8, 2002
You need to slap an aftermarket cooler on your card.

When I was running dual monitors on the 6950 at 500MHz idle clocks, I was hovering around 45°C idle temps.

I have now gone to a single monitor; clocks have dropped to 250MHz idle, and the temp is now 38°C.

Aftermarket cooler in my sig!
 

Madcatatlas · Golden Member · Joined Feb 22, 2010
The thread title should be changed or edited to include "Fixed", since the OP states that the problem is now indeed fixed.

That would probably also help with necros such as this one.
 