PCI EXPRESS 2 - 8 pin adaptor?


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Product Details:

*** NEW Custom made Male 6-pin PCI Express to Female 8-pin PCI Express 2 adapter cable from Performance-PCs.com ***

This adapter can be used to convert any 6-pin power supply PCI Express cable to an 8-pin PCI Express 2 cable. It is handy if the PSU you recently purchased lacks the 8-pin PCI Express 2 connection needed by new cards such as the Radeon HD 2900 XT.

nice
 

ashishmishra

Senior member
Nov 23, 2005
906
0
76
Hmm... the 6-pin PCIe cable provides only 75W, so how does this adapter force that cable to give 75 more watts? :confused: I would think they'd need a 6-pin plus a couple of Molexes to do the conversion properly. Or does this just fool the card into thinking that a true 8-pin has been connected, so Overdrive can be unlocked? Any experts here who can comment?

Originally posted by: Neutronbeam01
here ya go -- and I may have given them the idea to make these...right here at www.performance-pcs.com --

Adapter to convert 6-pin PCI express to 8-pin PCI express 2.0



They will also custom make any kind of adapter you want.

 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Originally posted by: apoppin
you would need TWO 6-pin PCIe connectors, right ... to make ONE 8-pin connector ... and you need 2 - 8pin connectors ...
...right?
:confused:

Er, no. You need one 6-pin and one 8-pin for OC'ing, so that's 75W + 150W = 225W to the card.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
but 2x 6-pin and 2x 8-pin for X-fire

so 4 adapters, right?

it doesn't make any sense . ... . a 6-pin to 8-pin adapter is only supplying the SAME 75W


EDIT ... i won't bump this ... but i THINK i figured it out
... maybe not ... i should just buy one, right? :p

PCIe slot = 75W
1x 6-pin PCIe = 75W
1x 8-pin PCIe = 150W [which can't come from just a 6-pin to 8-pin adapter or it would still be 75W]

... so 300W to OC it? 225W stock, from the 75W slot plus 2x 6-pin PCIe (150W)
:Q

600w for 2 oc'd HD2900xts
:shocked:

and connectors and adapters hanging everywhere

tell me it isn't so
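The power-budget arithmetic in the post above can be sketched out. The per-connector ceilings (75W slot, 75W 6-pin, 150W 8-pin) are the PCIe spec limits, and the stock/Overdrive/CrossFire totals are just sums; note that a plain 6-to-8-pin adapter rewires the same 6-pin feed, so it can't raise that feed's 75W ceiling by itself:

```python
# PCIe power-budget math for an HD 2900 XT, using the spec ceilings
# quoted in the thread: slot = 75 W, 6-pin = 75 W, 8-pin (PCIe 2.0) = 150 W.
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

# Stock: slot + two 6-pin plugs (a 6-pin fits the card's 8-pin socket)
stock = SLOT_W + 2 * SIX_PIN_W                 # 225 W
# Overdrive-unlocked: slot + 6-pin + true 8-pin
overdrive = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W
# CrossFire with both cards in the Overdrive config
crossfire = 2 * overdrive                      # 600 W

print(stock, overdrive, crossfire)  # 225 300 600
```

These are connector ceilings, not measured draw; actual consumption sits below them, which is why a 450W "booster" can still be certified for the job.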
 

fern420

Member
Dec 3, 2005
170
0
0
Originally posted by: apoppin
but 2 6 pin and 2 8 pin for X-fire

so 4 adapters, right?

it doesn't make any sense . ... . a 6pin to 8pin is only supplying the SAME 75W


EDIT ... i won't bump this ... but i THINK i figured it out
... maybe not ... i should just buy one, right? :p

PCIe slot =75w
1x 6pin PCIE = 75W
1x 8pin PCIE = 150W [which can't just be a 6pin to 8 pin adapter or it would still be 75w]

... so 300w to OC it? 225w for just the 75w slot, the 2 x 6pin PCIE 150w
:Q

600w for 2 oc'd HD2900xts
:shocked:

and connectors and adapters hanging everywhere

tell me it isn't so

the math seems correct, but it can't be a full 600 watts for two 2900s in Crossfire. that juicebox power supply i ordered was listed as 2900 XT Crossfire certified by AMD and ATI, and it's only 450 watts nominal; i didn't see a max rating, so it could very well hit 600 watts at max. regardless, i don't think that 600-watt number is very far off at all; i'd bet somewhere in the 500-600 range for two overclocked 2900s.

honestly, if you are going to run two 2900s in Crossfire and you have a nice power supply that hasn't reached the end of its usefulness, the best bet is to buy that juicebox i did for 100 bucks and have no worries about a zillion adapters and a rat's nest.

great find, by the way, Neutronbeam01. i knew someone would make one, but it still isn't going to put the correct watts to the card; perhaps it will unlock Overdrive for people with nice beefy power supplies.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: apoppin
but 2 6 pin and 2 8 pin for X-fire

so 4 adapters, right?

it doesn't make any sense . ... . a 6pin to 8pin is only supplying the SAME 75W


EDIT ... i won't bump this ... but i THINK i figured it out
... maybe not ... i should just buy one, right? :p

PCIe slot =75w
1x 6pin PCIE = 75W
1x 8pin PCIE = 150W [which can't just be a 6pin to 8 pin adapter or it would still be 75w]

... so 300w to OC it? 225w for just the 75w slot, the 2 x 6pin PCIE 150w
:Q

600w for 2 oc'd HD2900xts
:shocked:

and connectors and adapters hanging everywhere

tell me it isn't so

No, dude, you are like totally looking at this the wrong way.

Get a custom side cover w/ intake scoops and paint your case fire engine red. Horsepower TV FTW!

LOL

But sadly, you're correct about the figures and adapters. :( However, I think 300w per overclocked 2900XT is a safety precaution (max load), so the game doesn't crash periodically when power requirements spike.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: fern420
Originally posted by: apoppin
but 2 6 pin and 2 8 pin for X-fire

so 4 adapters, right?

it doesn't make any sense . ... . a 6pin to 8pin is only supplying the SAME 75W


EDIT ... i won't bump this ... but i THINK i figured it out
... maybe not ... i should just buy one, right? :p

PCIe slot =75w
1x 6pin PCIE = 75W
1x 8pin PCIE = 150W [which can't just be a 6pin to 8 pin adapter or it would still be 75w]

... so 300w to OC it? 225w for just the 75w slot, the 2 x 6pin PCIE 150w
:Q

600w for 2 oc'd HD2900xts
:shocked:

and connectors and adapters hanging everywhere

tell me it isn't so

the math seem correct but it cant be a full 600 watts for two 2900's in xfire. that juicebox power supply i ordered was listed as 2900xt crossfire certified by amd and ati and its only 450 watts nominal but i didnt see a max rating, it could very well hit 600 watts on max. regardless i dont think that 600 watt number is very far off at all, id bet somewhere in the 500-600 range for two 2900's overclocked.

honestly, if you are going to do two 2900's in crossfire and you have a nice power supply that hasn't reached the end of its usefulness the best bet is to buy that juicebox i did for 100 bucks and have no worries about a zillion adapters and a rats nest.

great find by the way Neutronbeam01, i knew someone would make one but it still isnt going to put the correct watts to the card but perhaps will unlock the overdrive for people with nice beefy power supplies.

they need to make an adapter combining two 75W connectors into one 8-pin 150W connector

as to your "booster" ... i believe "all" it needs to do is supply the "extra" needed for x-fire/OC'ing ... the two "extra" 8-pin connectors ... one for each card to supply the extra oomph for each card ... at least 300w ... your original PS supplies everything else for "normal" operation

honestly ... after reading all this ... i am probably backing away from HD2900xt ...:brokenheart:
i dunno ... it's gonna ALL depend on pricing ... for me
:confused:

of course, i'd have NO problem running 8800Ultra SLI on my current PS
---just on my pocketbook :p
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
OK, after a lot of coaxing, I got the 8.38 RC7 drivers to install. Despite what the setup program reported, it didn't uninstall the 8.36 drivers, so I was left with a new CCC and old drivers; also, the system was very unstable and would crash explorer/opera quite often. I finally fixed this by (a) uninstalling all ATI stuff via Add/Remove Programs and (b) manually uninstalling the 8.36 drivers from Device Manager w/ the "delete this driver software" box checked. Once I rebooted, I could then run the 8.38 setup program. So much for the CIM's driver upgrade button!

I got one blue screen from starting Media Center after I had browsed various CCC tabs, but MC seems to start fine now. Also, 8.38 RC7 has the same awful video motion/interlacing problem as all previous HD2900XT drivers. Fullscreen MPEG2 playback is so awful (i.e. a smeary, pixelated mess) that it's barely watchable. When people are walking, you can barely make out their faces. Manually forcing AVIVO Quality to "Bob" still works fine. Night and day difference here.

The drivers that boosted performance so much were *older* than the drivers that the review sites used. 8.36 had the best performance for my rig and games. NFS:C was fluid, better than I've ever seen it on Vista. Oblivion was good, playable but not overwhelmingly better. I did learn through testing that the CCC's AA forcing option was bugged and would simply stop antialiasing Oblivion after a certain point (a load screen?). Resetting the CCC AA to the desired value and immediately testing revealed high AA performance w/ AA working.

8.37 was awful--Oblivion had a noticeable performance decrease, and NFS:C was unplayable in both framerate and perceived fluidity. AA did work in my games, FWIW.

8.38 is slightly better. Performance seems to be improved over 8.37, but it's certainly not as high as 8.36. I'll check on the AA failing bug, but I assume it has been fixed. Oblivion looks like it got a good performance boost. NFS:C has much higher framerates in 8.38 than 8.37 and is almost on par with 8.36, but 8.37 and 8.38 suffer from the same "stuttering" (i.e. random half-second pauses) that make the game a *chore* to play. For example, the in-game canyon intro cutscenes looked as smooth as FMV in 8.36, but some parts look like a slideshow in 8.37/8.38.

I was just about to finish that game, but now I don't feel like the hassle of getting the 8.36 drivers working again. :frown:

NFS:C might be having issues with Vista. It certainly looked awful and had similar stuttering on my 7900 GTO w/ all official WHQL and Beta nVidia Vista drivers even when settings were lowered sufficiently to bring the framerate above 40 fps. But then why did the HD2900XT w/ 8.36 drivers work so incredibly well? We're talking night and day difference here.

UPDATE: CCC "lost" lower 1.6:1.0 widescreen resolutions such as 1280x800, and I lose a bunch of resolutions at refresh rates of 59 and 75 Hz. Why are nVidia and AMD having so much trouble with DVI output?
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Overclocking with AMD Clock Tool is working with these 8.38 drivers. :D

I had the card up to 850/1000 MHz (a 14% GPU and 20% memory overclock) completely stable and artifact-free. I'm seeing 75/65C (GPU/PCB) max temps in normal gameplay and benchmarking w/ average temps about 3C lower. Good airflow FTW! The GPU hard locks Vista when set to 900 MHz, but I haven't tried pushing the RAM yet.
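For reference, the overclock percentages quoted above can be back-calculated. This assumes HD 2900 XT stock clocks of roughly 743 MHz core / 828 MHz memory (an assumption; the post doesn't state them):

```python
# Hypothetical check of the quoted 14%/20% overclock figures,
# assuming HD 2900 XT stock clocks of 743 MHz core / 828 MHz memory.
stock_core, stock_mem = 743, 828
oc_core, oc_mem = 850, 1000

core_gain = (oc_core / stock_core - 1) * 100  # ~14.4%
mem_gain = (oc_mem / stock_mem - 1) * 100     # ~20.8%

print(f"core +{core_gain:.1f}%, mem +{mem_gain:.1f}%")
```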

Performance increases only in certain situations with my overclock. 3DMark 05 increases from 11300 to 11500. :( I see some gains in FEAR, but only close to a linear increase when AA is turned up. I'm obviously CPU limited. Performance in FEAR at 1680x1050 med/high w/ stock clocks is 56/36/41% (0x/2x/4xAA) higher than my 7900 GTO in the same rig.

Here's something I didn't know about: in CCC, setting the AA type to Wide Tent (not the level) while keeping the level set to Application Preference lets the games use Wide Tent when you configure AA in the games' settings.

There's a huge performance hit going from Box to WT. I accidentally left WT on (with CCC AA set to Application Preference!) and had to rebench a few times to keep my charts consistent. I wonder if some reviewers encountered the same problem? Or is this well-known in the AMD camp? We saw some really odd benchmark results on launch day.


UPDATE: I finally submitted a request to HIS (although their website is still quite slow). Sometime next week we'll see what they have to say regarding these "missing" 8-pin adapters.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nullpointerus
Overclocking with AMD Clock Tool is working with these 8.38 drivers. :D

I had the card up to 850/1000 MHz (a 14% GPU and 20% memory overclock) completely stable and artifact-free. I'm seeing 75/65C (GPU/PCB) max temps in normal gameplay and benchmarking w/ average temps about 3C lower. Good airflow FTW! The GPU hard locks Vista when set to 900 MHz, but I haven't tried pushing the RAM yet.

Performance increases only in certain situations with my overclock. 3DMark 05 increases from 11300 to 11500. :( I see some gains in FEAR, but only close to a linear increase when AA is turned up. I'm obviously CPU limited. Performance in FEAR at 1680x1050 med/high w/ stock clocks is 56/36/41% (0x/2x/4xAA) higher than my 7900 GTO in the same rig.

Here's something I didn't know about: in CCC, setting the AA type to Wide Tent (not the level) while keeping the level set to Application Preference lets the games use Wide Tent when you configure AA in the games' settings.

There's a huge performance hit going from Box to WT. I accidentally left WT on (with CCC AA set to Application Preference!) and had to rebench a few times to keep my charts consistent. I wonder if some reviewers encountered the same problem? Or is this well-known in the AMD camp? We saw some really odd benchmark results on launch day.


UPDATE: I finally submitted a request to HIS (although their website is still quite slow). Sometime next week we'll see what they have to say regarding these "missing" 8-pin adapters.

looks like a CPU upgrade is in your future
i don't think i could handle 11k in 3DMark05 ... what is 06 like?

interesting ... thank you for the reports

edit: BTW, OCZ just bought PC Power and Cooling

http://www.theinquirer.net/default.aspx?article=39879
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
My rig is CPU limited, and I was "time limited," so I ran '05. I'll probably get to '06 later tonight or in the morning. My '05 score was pretty close to yours, ~9300, before this upgrade, if that's useful. Bang for buck--not worth it. I *may* sell the card, since it was going for $450 on eBay last time I checked.

:Q

Oddly, with RAM prices the way they are (were?), I could probably sell my current mobo+CPU+RAM and upgrade to an inexpensive C2D setup for the cost of shipping. I'll have to check out some components and see whether that's feasible. Any advice? I'm looking for a board that's got Crossfire and passive chipset cooling (if possible).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nullpointerus
My rig is CPU limited, and I was "time limited," so I ran '05. I'll probably get to '06 later tonight or in the morning. My '05 score was pretty close to yours ~9300 before this upgrade, if that's useful. Bang for buck--not worth it. I *may* sell the card since it was going for $450 on eBay last time I checked.

:Q

Oddly, with RAM prices the way they are (were?), I could probably sell my current mobo+CPU+RAM and upgrade to an inexpensive C2D setup for the cost of shipping. I'll have to check out some components and see whether that's feasible. Any advice? I'm looking for a board that's got Crossfire and passive chipset cooling (if possible).

that is *exactly* what i did ... i sold my P4EE for more than i paid for my E4300 ... which is a champion OCer ... i should get over 3GHz in a "real MB" ... i am finding my PC3500 a bit difficult to sell in FS/T, however [they are "cheap" beyond belief]

i believe there are ASRock MBs that will take your current CPU/RAM and also your HD2900XT ... but you will then be limited by the PCIe 4x slot ... a little

so you need to make the jump to a "real" MB ... if you insist on xfire, i believe the new AMD MB is due pretty soon - RD680 ... might be worth waiting for
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
3DMark '06 just crashed. There's some bug with R600 and a workaround--I'll get to it in the morning. *rubs eyes*

Yeah, I considered those DDR/DDR2 combo boards, but my problem is that I've got 4 GB in 4 DIMMs--and DDR is loads more expensive right now, so I can actually make money upgrading my RAM. Besides, I've got great airflow and a PSU with some headroom, so it makes more sense to go with a board designed for overclocking the C2D.

I'll keep an eye out for that RD580; however, I don't want to wait too long because of RAM prices.

Crossfire or SLI--depends in which mood I am ATM. Maybe I'll get really lucky and some site will soon do a very comprehensive review of the new 7.5 Catalyst showing the HD2900XT being the better choice. At some point I want to go dual cards--I was considering X1950Pro because it was unusually cheap and effective (for dual card) at the time--just to see what it's like first-hand.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
that's what i would do ... [that's what i DID do] ... sell my DDR and get DDR2
--but in your case get a real OC'ing multi-GPU MB

good luck and as usual keep us updated ... i think lots of us are looking at "real world" experience with HD2900xt before doing anything ourselves

 

fern420

Member
Dec 3, 2005
170
0
0
Originally posted by: nullpointerus


Here's something I didn't know about: in CCC, setting the AA type to Wide Tent (not the level) while keeping the level set to Application Preference lets the games use Wide Tent when you configure AA in the games' settings.

There's a huge performance hit going from Box to WT. I accidently left WT on (with CCC AA set to Application Preference!) and had to rebench a few times to keep my charts consistent. I wonder if some reviewers encountered the same problem? Or is this well-known in the AMD camp? We saw some really odd benchmark results on launch day.

question: when i try to do this, the box that lets me choose the filter is grayed out. i can choose Tent and then go to Application Preference, but the filter box is again grayed out. are you saying that if i choose Tent *before* i choose Application Preference for the level, the Tent option will "stick" even though it's grayed out once you pick Application Preference for the level?
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
I fixed 3DMark '06 with the following suggestion except that in Vista x64, the path has SysWOW64 instead of System32:
http://www.ocforums.com/showthread.php?p=5070830

The test was run at default settings w/ 8.38 RC7 Vista x64 at stock and overclocked, rig in sig.

HD2900XT Stock: 6768
HD2900XT 850/1000: 7999

My framerates in some *portions* of the graphics tests increased by more than 50% (i.e. 41 to 78) even though the GPU overclock was only 14/20% core/mem. Also, the total score increased by 18% even though the score includes purely CPU tests, but my CPU and system memory were at stock in both runs. Does this make sense?
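The figures above do check out arithmetically. The overall score gain sits between the core and memory overclock percentages, while the segment gain far exceeds them, which is what you'd expect if that segment was bottlenecked on something (like memory bandwidth) that the overclock relieved disproportionately:

```python
# Sanity-checking the 3DMark '06 numbers quoted above.
stock_score, oc_score = 6768, 7999
score_gain = (oc_score / stock_score - 1) * 100  # overall score gain, ~18%
fps_gain = (78 / 41 - 1) * 100                   # one test segment, ~90%

print(f"score +{score_gain:.1f}%, segment fps +{fps_gain:.1f}%")
```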

UPDATE: HD2900XT 850/1000 & CPU @ 2.5: 8555 (odd, eh?)

Looking over my numbers in FEAR, the average performance increase was never more than the clock speed increase although I did not make any effort to compare *portions* of the benchmark run.

Anyway, I'd enjoy reading a driver comparison of 8.37 and 8.38 final when available.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: fern420
Originally posted by: nullpointerus


Here's something I didn't know about: in CCC, setting the AA type to Wide Tent (not the level) while keeping the level set to Application Preference lets the games use Wide Tent when you configure AA in the games' settings.

There's a huge performance hit going from Box to WT. I accidently left WT on (with CCC AA set to Application Preference!) and had to rebench a few times to keep my charts consistent. I wonder if some reviewers encountered the same problem? Or is this well-known in the AMD camp? We saw some really odd benchmark results on launch day.

question, when i try to do this my box that allows me too chose the filter is grayed out, now i can choose tent then then go to application preference but the box for filter is again grayed out. are you saying if i chose tent before i chose application pref for the level the tent option will "stick" even though its grayed out once you pick application for the level?
It's not grayed out in my CCC. I can change the filter regardless of whether the AA level is set to Application Preference or not.

Maybe this is a feature of the upcoming Catalysts? Or maybe it's just a bug in 8.38 RC7?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ROEHUNTER
Back to the OP, I noticed that the MSI 2900XT on the egg has the 8 pin adaptor included.

link ... please ... what combination of power-leads plug into it to give 150w?

edit: where?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127289

http://www.newegg.com/Product/ShowImage...ress+x16+VIVO+HDCP+Video+Card+-+Retail

2nd EDIT: i see "Power Cable" [finally]

2 molex connectors into an 8-pin, right?
--the picture is not that clear to see the other end on IE


3rd EDIT: Damn ... they are still running $410-$450 at NewEgg :p
 

ROEHUNTER

Member
Oct 26, 2004
110
0
0
Hmm, hard to see what the 2x 4-pin Molex adapter connects to, but the other one definitely looks like a 6-pin to 8-pin adapter.
But even with that, it will still just draw the same wattage as having just a 6-pin.
The only thing is, it will unlock ATI Overdrive.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Originally posted by: apoppin
Originally posted by: fern420
Originally posted by: apoppin
but 2 6 pin and 2 8 pin for X-fire

so 4 adapters, right?

it doesn't make any sense . ... . a 6pin to 8pin is only supplying the SAME 75W


EDIT ... i won't bump this ... but i THINK i figured it out
... maybe not ... i should just buy one, right? :p

PCIe slot =75w
1x 6pin PCIE = 75W
1x 8pin PCIE = 150W [which can't just be a 6pin to 8 pin adapter or it would still be 75w]

... so 300w to OC it? 225w for just the 75w slot, the 2 x 6pin PCIE 150w
:Q

600w for 2 oc'd HD2900xts
:shocked:

and connectors and adapters hanging everywhere

tell me it isn't so

the math seem correct but it cant be a full 600 watts for two 2900's in xfire. that juicebox power supply i ordered was listed as 2900xt crossfire certified by amd and ati and its only 450 watts nominal but i didnt see a max rating, it could very well hit 600 watts on max. regardless i dont think that 600 watt number is very far off at all, id bet somewhere in the 500-600 range for two 2900's overclocked.

honestly, if you are going to do two 2900's in crossfire and you have a nice power supply that hasn't reached the end of its usefulness the best bet is to buy that juicebox i did for 100 bucks and have no worries about a zillion adapters and a rats nest.

great find by the way Neutronbeam01, i knew someone would make one but it still isnt going to put the correct watts to the card but perhaps will unlock the overdrive for people with nice beefy power supplies.

they need to make an adapter using two 75w connectors into an 8-pin 150w connector

as to your "booster" ... i believe "all" it needs to do is supply the "extra" needed for x-fire/OC'ing ... the two "extra" 8-pin connectors ... one for each card to supply the extra oomph for each card ... at least 300w ... your original PS supplies everything else for "normal" operation

honestly ... after reading all this ... i am probably backing away from HD2900xt ...:brokenheart:
i dunno ... it's gonna ALL depend on pricing ... for me
:confused:

of course, i'd have NO problem running 8800Ultra SLI on my current PS
---just on my pocketbook :p

I'm with you on that. I thought these cards would be great for midrange at a reasonable price, even if a little noisy and power hungry. I even have a new 650W PSU with 4x 6-pin and 1x 8-pin connectors; however, I haven't had a lot of luck with ATI cards, and after reading this, I gotta say, I'm afraid to go near these cards!... LOL