PSU doesn't have 8 pin for new GPU

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Hey guys, I am upgrading my Radeon 4890 to a 6970. I have an Antec TruePower 550 that supplies two 6-pin 12V lines.

Will I be ok if I use http://www.newegg.com/Product/Produc...82E16812198016

to connect to the 6+8-pin 6970? And also, will my 550W power supply be ok to run the 6970? The rest of my system consists of a 2500K, 2 IDE HDDs, a DVD-RW, and 4 120mm fans.
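
For a rough sanity check on the 550W side, here is a back-of-the-envelope power budget. The GPU and CPU figures are the commonly published maximums (about 250 W board power for the HD 6970, 95 W TDP for the 2500K); everything else is a guess, so treat this as a sketch rather than a measurement:

```python
# Rough power-budget sketch for this build. All figures are approximate
# published board-power/TDP numbers or outright guesses, not measured draws.
estimated_draw_w = {
    "Radeon HD 6970 (max board power)": 250,
    "Core i5-2500K (TDP)": 95,
    "Motherboard / RAM / chipset (guess)": 50,
    "2x IDE HDD (guess)": 20,
    "DVD-RW (guess)": 15,
    "4x 120mm fans (guess)": 10,
}

psu_rating_w = 550
total_w = sum(estimated_draw_w.values())

print(f"Estimated peak draw: ~{total_w} W on a {psu_rating_w} W unit")
print(f"Headroom: ~{psu_rating_w - total_w} W "
      f"({100 * (psu_rating_w - total_w) / psu_rating_w:.0f}%)")
```

Even with pessimistic numbers that lands around 440 W, so the headline wattage is probably not the tight spot; the 6-pin versus 8-pin connector question is.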
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
swilli89 said:
Hey guys, I am upgrading my Radeon 4890 to a 6970. I have an Antec TruePower 550 that supplies two 6-pin 12V lines.

Will I be ok if I use http://www.newegg.com/Product/Produc...82E16812198016

to connect to the 6+8-pin 6970? And also, will my 550W power supply be ok to run the 6970? The rest of my system consists of a 2500K, 2 IDE HDDs, a DVD-RW, and 4 120mm fans.

http://forums.anandtech.com/showpost.php?p=32502261&postcount=5

Regarding the adapter, you probably just need to make sure that there's enough current sourced on that one 6-pin line to meet the requirements of the card. To be honest, I can't get comfortable with the thought of breaking six lines out to eight, especially a pair of them (it almost seems like the supply manufacturer is telling me not to do this), but judging from the feedback it appears to be working for at least some people.
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
If you have a single-rail PSU you will be more than fine splitting the 6-pin PCIe or a 4-pin Molex into an 8-pin PCIe using an adapter. When you get into the multi-rail PSUs you will have to make sure the rail you are splitting has the power you need for the new card, or you will run into trouble.
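
A minimal sketch of that check, assuming a hypothetical 25 A rail (the real rating is whatever is printed on the PSU label) and the standard PCIe connector limits:

```python
# Minimal check of whether one +12 V rail can feed the card's PCIe connectors.
# The 25 A rail rating is just an example; read the real figure off the PSU label.
rail_amps = 25
rail_watts = rail_amps * 12          # 300 W available on this rail

six_pin_w = 75                       # PCIe 6-pin connector limit
eight_pin_w = 150                    # PCIe 8-pin connector limit
card_connector_draw_w = six_pin_w + eight_pin_w   # worst case for a 6+8 pin card

print(f"Rail capacity: {rail_watts} W, connector draw up to {card_connector_draw_w} W")
print("Rail looks adequate" if rail_watts >= card_connector_draw_w else "Rail is too small")
```

On a multi-rail unit each PCIe lead may sit on a different rail, so it is the per-rail amperage on the label that matters, not the total wattage.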
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Rifter said:
If you have a single-rail PSU you will be more than fine splitting the 6-pin PCIe or a 4-pin Molex into an 8-pin PCIe using an adapter. When you get into the multi-rail PSUs you will have to make sure the rail you are splitting has the power you need for the new card, or you will run into trouble.

Does that really work though in terms of the load (current, amperage) that is placed on the 6-pin lines when the GPU is drawing power with an expectation that the power is being delivered through an 8-pin supply?

Personally I'm leery of these physically compatible adaptors when used in electrical applications.

If 6-pins really could do the job expected of an 8-pin delivery then why did they bother creating an 8-pin delivery spec in the first place?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
No one really knows why. At least I never could get a great explanation out of Nvidia or AMD. The 6-pins will carry the current needed. The "extra" two pins are ground, which means there is less resistance (and less heat). IF the PSU is adequate, you can easily get by.

I used to use an alligator clip to short the two extra pins to a Molex ground (for my 2900XT), and I still have that same older 850W OCZ PSU working fine as a secondary PSU when I am using QuadFire (or overclocking the FX 8150 :p )
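
To put a rough number on the "less resistance, less heat" point, here is an illustrative I²R comparison of the ground return path. The 150 W load, the 10 mΩ per-contact resistance, and the three-versus-five ground pin counts are all assumptions made up for the example:

```python
# Illustrative I^2*R comparison of the ground return path. The 150 W load,
# the 10 milliohm per-contact resistance, and the 3-vs-5 ground pin counts
# are all assumptions for the sake of the example, not measurements.
load_w = 150
current_a = load_w / 12            # ~12.5 A of return current
r_contact_ohm = 0.010              # assumed resistance per pin + crimp

def return_path_loss(n_ground_pins):
    # Parallel contacts share the current, so effective resistance is r/n.
    r_eff = r_contact_ohm / n_ground_pins
    return current_a ** 2 * r_eff  # watts dissipated as heat in the return

for pins in (3, 5):                # roughly a 6-pin vs an 8-pin ground return
    print(f"{pins} ground pins: ~{return_path_loss(pins):.2f} W of heat")
```

Either way the loss is well under a watt; the extra grounds mainly buy margin in the connector.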
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Idontcare said:
Does that really work though in terms of the load (current, amperage) that is placed on the 6-pin lines when the GPU is drawing power with an expectation that the power is being delivered through an 8-pin supply?

Personally I'm leery of these physically compatible adaptors when used in electrical applications.

If 6-pins really could do the job expected of an 8-pin delivery then why did they bother creating an 8-pin delivery spec in the first place?

The reason is that there are two specs: 6-pin is rated for 75 watts and 8-pin for 150 watts. The extra two grounds were put there to allow power supply makers to use one plug for both types of cards rather than having to wire two separate connectors. Those plugs have the extra two pins detachable from the main six.

Video cards can tell the difference between the plugs by checking for ground on pins 7 and 8. If it isn't there, they know it is a 6-pin rated cable.

Think of it as a way for the video card to determine whether the power supply maker meant for the cable to be used for 75 watts or 150 watts. The video card only draws power from the same six pins whether the plug is 6- or 8-pin. It is a way for video card makers to ensure that a user isn't trying to draw 150 watts off a supply or wiring too small for the card.
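
As a rough illustration of why the extra pins don't need to carry any power themselves, here is the per-conductor current at the two limits, assuming three +12 V pins on both connector styles (that pin count is an assumption here, though it is how most supplies wire them):

```python
# Per-conductor current at the two spec limits, assuming three +12 V pins
# on both connector styles (an assumption; most supplies wire them this way).
PINS_12V = 3

for label, watts in (("6-pin (75 W)", 75), ("8-pin (150 W)", 150)):
    total_amps = watts / 12
    print(f"{label}: {total_amps:.2f} A total, "
          f"~{total_amps / PINS_12V:.2f} A per +12 V pin")
```

Even at the 150 W limit that is only about 4 A per contact, which is comfortably within what these connector contacts are typically rated for.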
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
apoppin said:
I used to use an alligator clip to short the two extra pins to a Molex ground (for my 2900XT), and I still have that same older 850W OCZ PSU working fine as a secondary PSU when I am using QuadFire (or overclocking the FX 8150 :p )


You can jumper the last two pins, and some 8-pin connectors even do that with just a black wire running between the two pins as a jumper. It is really just an "is the power supply large enough" jumper.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Modelworks said:
The reason is that there are two specs: 6-pin is rated for 75 watts and 8-pin for 150 watts. The extra two grounds were put there to allow power supply makers to use one plug for both types of cards rather than having to wire two separate connectors. Those plugs have the extra two pins detachable from the main six.

Video cards can tell the difference between the plugs by checking for ground on pins 7 and 8. If it isn't there, they know it is a 6-pin rated cable.

Think of it as a way for the video card to determine whether the power supply maker meant for the cable to be used for 75 watts or 150 watts. The video card only draws power from the same six pins whether the plug is 6- or 8-pin. It is a way for video card makers to ensure that a user isn't trying to draw 150 watts off a supply or wiring too small for the card.

Learned something new today -- I did not know that those were both grounded. Thanks very much!