AMD Revising Radeon HD 6900 Series PCB

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
It was bound to happen eventually, but with the new PCB design it looks like you should be able to OC better than current 6950s, though you won't be able to flash it to a 6970. Win/lose, I guess.
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,494
4
81
This may accelerate my CrossFire purchasing plans. My current 6950 unlocked wonderfully and I'd like to do that again in the future, so buying another now rather than later looks like a good plan.

Edit: Did it... Another Gigabyte 6950 on the way.
 

Alaska Wolf

Member
Dec 18, 2010
67
0
0
RATS! I really want to get a second one for CrossFire but won't have the dinero for a couple of weeks. Guess I picked the wrong time to quit selling cocaine.

And I am also really thinking of FINALLY throwing a W/C system into my Cooler Master ATCS 840. But now getting a waterblock for the XFX 6950 I currently have could well be harder. ARRGH!

On a side note, other than eBay, does anyone know a good place to get the 10 cm (the longer one) CrossFire bridge? Maybe I'll find a way to get the second card, but I don't want to run them stacked next to each other, especially if I'm cooling them with air.
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,494
4
81
Swapsies for one that works with more than one monitor.

I thought you were just having issues with it not clocking down with multiple monitors. These cards will definitely clock down less with more monitors attached. Mine currently clocks at 250/150 with a single monitor, but in a few days I will be back at school where my tri-monitor setup lives. Reading your thread, I'd say wait for Cat 10.13, since the 6900 drivers currently seem to be a bit awkward (see my thread about PowerTune options not showing up for me).
 

WelshBloke

Lifer
Jan 12, 2005
30,447
8,110
136
I thought you were just having issues with it not clocking down with multiple monitors. These cards will definitely clock down less with more monitors attached. Mine currently clocks at 250/150 with a single monitor, but in a few days I will be back at school where my tri-monitor setup lives. Reading your thread, I'd say wait for Cat 10.13, since the 6900 drivers currently seem to be a bit awkward (see my thread about PowerTune options not showing up for me).


I don't think drivers will make any difference.

On the other hand, you may well be OK if your monitors are all the same resolution.

Just be aware that lots of people are going to bullshit you about stuff they haven't actually tried.
 

Alaska Wolf

Member
Dec 18, 2010
67
0
0
Swapsies for one that works with more than one monitor.


I haven't read through all of your thread about the hot 6950, but could you tell me if the problem you are experiencing is exacerbated because you're running two different monitors? I really want to get three monitors to run using Eyefinity for games, in addition to using my 52" Sammy as my primary.
 

WelshBloke

Lifer
Jan 12, 2005
30,447
8,110
136
I haven't read through all of your thread about the hot 6950, but could you tell me if the problem you are experiencing is exacerbated because you're running two different monitors? I really want to get three monitors to run using Eyefinity for games, in addition to using my 52" Sammy as my primary.

I think the whole problem is me running different monitors. But it's not a quiet card anyway; as soon as the fan ramps up, you're going to notice it.

If you want anything tested, get back to me soon, as I'm returning it.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
I'm not familiar with the current VRMs or the new ones, but from my understanding they used more expensive (and likely better-performing) units on the current cards, and the TI VRMs are simply cheaper. This may actually hurt clocks if the replacement is purely for cost savings.

Though since many folks think the new TI units are better, maybe they are both cheaper and offer more performance.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
I haven't read through all of your thread about the hot 6950, but could you tell me if the problem you are experiencing is exacerbated because you're running two different monitors? I really want to get three monitors to run using Eyefinity for games, in addition to using my 52" Sammy as my primary.

My 6970 runs a touch hotter with my second display at idle but has amazing load temps compared to my old 4890 (49C at idle and ~78C at load). Something is funny with the wide variety of temps we seem to see.

The clocks on the GPU are a touch higher at idle with the second display, but the big issue comes from the memory. GDDR5 simply does not clock down with a second display (at least not with a different resolution) without flickering, so they keep it higher.

So as for Welsh, you should have much better luck with a card that uses GDDR3 if you want all the downclocking you can get. Though I'm almost certain that your card is flawed, as I'm idling at less than 50C and never climb above 80C with two displays (also, the full load temps shouldn't change all that much with a second display unless you are using a lot more memory by gaming with a different game on each :D ).
 

WelshBloke

Lifer
Jan 12, 2005
30,447
8,110
136
My 6970 runs a touch hotter with my second display at idle but has amazing load temps compared to my old 4890 (49C at idle and ~78C at load). Something is funny with the wide variety of temps we seem to see.

The clocks on the GPU are a touch higher at idle with the second display, but the big issue comes from the memory. GDDR5 simply does not clock down with a second display (at least not with a different resolution) without flickering, so they keep it higher.

So as for Welsh, you should have much better luck with a card that uses GDDR3 if you want all the downclocking you can get. Though I'm almost certain that your card is flawed, as I'm idling at less than 50C and never climb above 80C with two displays (also, the full load temps shouldn't change all that much with a second display unless you are using a lot more memory by gaming with a different game on each :D ).

That's just not true if you are running monitors at different resolutions.

full 3D = 800/1325
2D desktop with 2 monitors = 450/1250
2D desktop with 1 monitor = 250/150
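
For scale, here's a quick ratio check on those numbers (just a minimal Python sketch using the clocks listed above):

```python
# Clocks above are (core MHz, memory MHz)
one_mon = (250, 150)    # 2D desktop, 1 monitor
two_mon = (450, 1250)   # 2D desktop, 2 monitors

core_up = two_mon[0] / one_mon[0]   # 1.8x
mem_up  = two_mon[1] / one_mon[1]   # ~8.3x

print(f"core: {core_up:.1f}x higher, memory: {mem_up:.1f}x higher")
# -> the memory clock is the one that barely drops with a second monitor
```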
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
That's just not true if you are running monitors at different resolutions.

full 3D = 800/1325
2D desktop with 2 monitors = 450/1250
2D desktop with 1 monitor = 250/150

What isn't true?

My 6970's clocks are halved with a second display, which is still pretty good. Granted, "a touch" may not be the right word given that they are quartered when I only use one display... so they are 100% higher with a second display at idle. But it is still infinitely better than how cards from a couple of generations ago downclocked.

That is beside the point, though; it is the memory that makes the big difference. My second display simply disables memory downclocking entirely, which was my point.
 

WelshBloke

Lifer
Jan 12, 2005
30,447
8,110
136
What isn't true?

My 6970's clocks are halved with a second display, which is still pretty good. Granted, "a touch" may not be the right word given that they are quartered when I only use one display... so they are 100% higher with a second display at idle. But it is still infinitely better than how cards from a couple of generations ago downclocked.

That is beside the point, though; it is the memory that makes the big difference. My second display simply disables memory downclocking entirely, which was my point.

What you said isn't true.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
What you said isn't true.

I said the GPU clocks with one and two displays are only a touch different, as even with two they are still halved at idle, and that the memory is the big issue, as it simply doesn't downclock at all. How is that not true?
 

WelshBloke

Lifer
Jan 12, 2005
30,447
8,110
136
I said the GPU clocks with one and two displays are only a touch different, as even with two they are still halved at idle, and that the memory is the big issue, as it simply doesn't downclock at all. How is that not true?

The clocks on the GPU are a touch higher at idle with the second display
"A touch higher" doesn't equal nearly double.
 

WelshBloke

Lifer
Jan 12, 2005
30,447
8,110
136
Semantics; 1/4 or 1/2 is still a hell of a lot better than the 9/10 I saw on my old card. But fair enough.

Dude, that's not semantics. If the temps were double and I said they were nearly the same, you'd (rightly) pull me up on it.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
The higher temps are due to the memory not downclocking and running at stock 3D settings when idle with 2 screens. You can easily test it yourself:

1. Plug in one monitor only (clocks will go down to 157/300), run the fan at 100% for ~30 seconds to cool the card down, then leave it on auto and wait for the temps to level.
2. Use a tool to set idle 2D clocks of 400/300 and check your temp - it will go up 1C at most, if at all.
3. Set the GPU back to 157 and raise the memory clock to your stock 3D level - BAM, the temp will jump up by 5-10C.

The increased GPU frequency when 2 screens are attached has hardly any influence on your temps.
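
If you'd rather watch the temps level off between steps than eyeball them, a minimal logging loop like this does the job (read_gpu_temp_c() is a hypothetical placeholder, since there's no standard API for reading the temp - wire it up to whatever your monitoring tool exposes):

```python
import time

def read_gpu_temp_c():
    """Hypothetical stand-in: pull the core temp from whatever your
    monitoring tool exposes (e.g. a GPU-Z log file). There is no
    standard API for this, so you'd have to wire it up yourself."""
    raise NotImplementedError

# Print the temp once a second so you can watch it level off after
# each clock change before you compare numbers.
while True:
    print(time.strftime("%H:%M:%S"), read_gpu_temp_c(), "C")
    time.sleep(1)
```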

And it's not the first time this has been reported. The "issue" has been present since the first GDDR5 cards launched (plus WDDM, I assume). A combination of the memory controller, the low GDDR5 frequency and 2 screens causes extreme flicker and jerkiness on the desktop. There has been no way around it since the HD 4870 (believe me, I have tried and looked a lot). My HD 4870 at idle with 2 screens ran at 80C, and that was normal for the card... the temps would go down to ~45C with only 1 screen. I found lower clocks for my HD 4870 (something like 300/500, I can't remember now) that didn't cause problems, and that's how I ran the card until I sold it.

I don't think this can be fixed with drivers. If it were possible, AMD would have done so already. Or nVidia. Both camps exhibit the same behavior. This is a trade-off that was accepted in exchange for doubling memory bandwidth (GDDR3 -> GDDR5).
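
Rough sketch of where that doubling comes from: GDDR5 moves four bits per pin per memory clock versus GDDR3's two, so at the same clock and bus width you get twice the bandwidth (the 256-bit bus is my assumption here):

```python
def bandwidth_gb_s(mem_clock_mhz, bits_per_clock_per_pin, bus_width_bits=256):
    # bytes/s = clock * transfers per clock per pin * bus width in bytes
    return mem_clock_mhz * 1e6 * bits_per_clock_per_pin * (bus_width_bits / 8) / 1e9

# Same 1250 MHz memory clock, 256-bit bus assumed:
print(bandwidth_gb_s(1250, 2))  # GDDR3-style double data rate:  80.0 GB/s
print(bandwidth_gb_s(1250, 4))  # GDDR5, 4 bits per pin per clock: 160.0 GB/s
```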

Maybe when GDDR6 becomes available, or when AMD or nVidia builds a memory controller from scratch, this behavior can be removed. Right now? No go.

I'm wondering, though - is there someone with Windows XP who could check whether the same behavior shows up in XP? Maybe it's the WDDM driver architecture that's causing it (or that requires this increase to run properly with 2 screens?). Or maybe a Linux distribution? No idea if the cards downclock under Linux at all, though.

As for the thread, so as not to be completely off topic... That was quick by AMD :p And man, the unlock reminded me so much of the Radeon 9500 days! Ahh, the L-shaped memory models were so much in demand! :D

EDIT: I remembered I bought an Accelero Twin Turbo for the HD 4870. The 80C idle was with the stock cooler.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
Dude, that's not semantics. If the temps were double and I said they were nearly the same, you'd (rightly) pull me up on it.

The difference is 200 MHz on a card that runs at nearly 1 GHz in my case. Doesn't seem like much to me. My old card only clocked down to 800 from 900; the extra drop to 500 I see with this one is just neat, is all.

FYI, enabling Overdrive brings up a third clock profile in the BIOS as well. With no OC and one display, the card downclocks about twice as far as with two displays; with Overdrive enabled, it falls in the middle.