PCI EXPRESS 2 - 8 pin adaptor?

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Originally posted by: ROEHUNTER
Hmm, hard to see what the 2x 4-pin molex goes to, but the other definitely looks like a 6-pin to 8-pin adapter.
But even with that it will still just draw the same wattage as having just a 6-pin.
Only thing is it will unlock the ATI Overdrive.

That's what I see. I certainly haven't seen a 2x molex to 8-pin (150w) adapter; if a 6-pin (75w) takes 2 molex, then surely 2 molex wouldn't be enough for the 8-pin's 150w?!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SolMiester
Originally posted by: ROEHUNTER
Hmm, hard to see what the 2x 4-pin molex goes to, but the other definitely looks like a 6-pin to 8-pin adapter.
But even with that it will still just draw the same wattage as having just a 6-pin.
Only thing is it will unlock the ATI Overdrive.

That's what I see. I certainly haven't seen a 2x molex to 8-pin (150w) adapter; if a 6-pin (75w) takes 2 molex, then surely 2 molex wouldn't be enough for the 8-pin's 150w?!

what is that connector ... sure looks like 2 molex to .... 6-pin ... 8-pin ... something
-we WILL find out for sure ...

and *agreed* SolMiester ... even for a former ATi fan [me], it is a little "scary" that it requires 300w just to OC it ... maybe they should have included a power brick instead
 

ROEHUNTER

Member
Oct 26, 2004
110
0
0
Originally posted by: SolMiester
Originally posted by: ROEHUNTER
Hmm, hard to see what the 2x 4-pin molex goes to, but the other definitely looks like a 6-pin to 8-pin adapter.
But even with that it will still just draw the same wattage as having just a 6-pin.
Only thing is it will unlock the ATI Overdrive.

That's what I see. I certainly haven't seen a 2x molex to 8-pin (150w) adapter; if a 6-pin (75w) takes 2 molex, then surely 2 molex wouldn't be enough for the 8-pin's 150w?!

Yeah, what is the deal with that? There are both (2x 4-pin molex to 6-pin PCI-E) and (1x 4-pin molex to PCI-E) adapters, and both kinds are readily available. So does the 2-to-1 give more watts than the 1-to-1 or not? I've seen both kinds bundled with vid cards.

And the one that came with my X1900GT was a (1x 4-pin molex pass-through connector to PCI-E).

 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
I believe the 2to1 can give more watts IF the two molex connectors are on different rails. Drawing more than the rated power from just one rail is relatively inefficient. IIRC this is why we don't just *add* multiple rails together when determining the real 12V amperage of a PSU.

I *think* you can draw as much power as you want from a single molex connector since they're all tied into the same power output, but the actual circuitry inside the PSU may overload and kill itself. Different PSUs implement the same specs in different ways and could conceivably allow 150w to be drawn from a 1-to-1 6-pin PCIe to 8-pin PCIe adapter without adverse effects.

But I'm NOT a PSU expert, this is just a layman's attempt at piecing stuff together into a theory. ;)

If you want to find out about these adapters and whether they're safe, the CPU/overclocking forum has some people who are *very* knowledgeable about the inner workings of PSUs. These people should be able to give us a solid answer.
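To make the layman's theory above concrete, here's a quick sketch of the "different rails" idea. The 17A rail rating is a made-up example, not taken from any real PSU:

```python
# Layman's sketch of why two molex plugs on two different rails can
# safely feed more than one rail could alone. Rail rating is invented.
RAIL_WATTS = 12.0 * 17.0  # one hypothetical 17A 12V rail = 204W

def headroom(existing_load, extra_draw, rail_watts=RAIL_WATTS):
    """Remaining capacity (watts) on one rail after adding a new draw."""
    return rail_watts - existing_load - extra_draw

# Feeding all 150W of an 8-pin plug from one already-loaded rail:
print(headroom(120, 150))  # -66.0 -> over the rail's rating
# Splitting the same 150W across two molex plugs on two rails, 75W each:
print(headroom(120, 75))   # 9.0 left on the first rail
print(headroom(60, 75))    # 69.0 left on the second rail
```

This is also why you can't just add rail ratings together to get the PSU's real combined 12V capacity: each rail only helps if the loads are actually spread across them.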
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i am sure glad you said that and not me, nullpointerus !
-i have been speculating far too much

but it sounds reasonable ... it *appears* the 8-pin is there to "ensure" it is coming from another rail ...

and i am not so sure it is actually 150w instead of 75w needed ... i mean there is 75w from the PCIe slot + 150 more watts coming from the other two connectors = 225w ... pretty close to its spec ... the 8-pin i guess "ensures" the OC

i am just not going to spend $450 to find out

of course the number of rails doesn't "really" matter ... you could just have one humongous 1500w PSU with ... what, 40a [whatever is 'max'] on a single 12v rail that might be just as stable as one with separate rails ... as i understand it ... most PSUs don't really have separate rails
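For reference, the budget arithmetic works out like this under the PCIe spec figures (slot 75w, 6-pin 75w, 8-pin 150w):

```python
# PCI Express power budget figures from the spec:
# slot 75W, 6-pin plug 75W, 8-pin plug 150W.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

# 8-pin delivering its full rating:
full_budget = SLOT + SIX_PIN + EIGHT_PIN
print(full_budget)  # 300

# the speculated scenario: the 8-pin only actually supplying 75W,
# like a second 6-pin, leaving the full rating as OC headroom
modest_budget = SLOT + SIX_PIN + 75
print(modest_budget)  # 225
```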

and now i see OCZ bought a really high-end PS company
:Q
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
OK, after some time spent *cough* researching *cough* driver performance in Oblivion, I think I've discovered a serious problem with the 8.38 RC7 drivers. In CCC I have forced the AA level so that I can run Oblivion with HDR on. The "AA forgetfulness" bug (from 8.361) seems to be back, but I think it only resets AA to Application Preference *after* quitting the game once. On all subsequent reloads of the game, the drivers completely forget that AA is supposed to be forced on. To get AA working again, I must load up CCC, change the AA level to something else, and then change it back to what I really want.

(This is all assuming that my Catalyst installation wasn't hosed from the installer madness w/ removing 8.361.)

UPDATE:

I cannot seem to reproduce this "AA forgetfulness" bug. Perhaps it only happens after a reboot?

Also, I've encountered a new, unrelated bug: polygon seams are very visible in certain areas. It's not everywhere, just minor objects here and there. The most noticeable are the seams in the large objects such as stairs and towers. Here's an Oblivion screenshot showing lots of small yellow seams in a single large rock.

This "polygon seams" bug occurs regardless of whether AA is enabled or disabled. It *sometimes* shows up extremely visibly in the CCC AA preview window as white seams for no apparent reason, and, when the bug is triggered in CCC's 3D preview, lesser amounts of AA exhibit far more of these seams than greater amounts do. Enabling temporal AA may or may not eliminate the seams completely--at least for the moment. I believe the bug can be triggered in CCC by moving from the AA page to the AF page and back again, then changing some settings.

This latest problem is either a driver bug with whatever is doing AA or a hardware problem. :(
 

ROEHUNTER

Member
Oct 26, 2004
110
0
0
I just loaded up the 8.38 RC7's in Vista and there's no change in 3DMark06, no change in BF2142, and the Lost Planet demo is still all messed up.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
I heard from HIS:

Response from HIS Support Department

Dear Sir,

There is no 2x4 pin adapter cable available. Due to the high power
consumption of the card, you have to use a power supply originally equipped
with the 2x4 pin plug.

A list of recommended power supplies can be found at the following link. You
can choose one with a 2x4 pin plug.

http://ati.amd.com/products/certified/powersupplies.html

Thanks.

Best regards,


HIS Support
 

fern420

Member
Dec 3, 2005
170
0
0
just thought id add on here. i got that juice box power supply today, the one approved by ati for a 2900 crossfire setup with the proper 8 pin connectors, and lo and behold ..... SAME FING PROBLEM!!!!!

im so pissed now im about to just return both of these 2900 cards. my number 2 card will not go into 3d clock speeds, it sits at 2d speeds all the time. not only does the amdgpuclocktool indicate its not going into 3d speeds, that card is also almost 20 degrees cooler than the other after gaming, also a good sign that it truly isnt going into 3d clock speeds.

i know it isnt a power issue or the cards. i even tried using the existing 6 pin pci-e plugs from my main power supply on the cards and only using the two 8 pin connectors from the juice box, and its the same thing. im giving ati customer support one more try at solving the problem, but it looks like i may be returning or selling these. such a shame too, because the single card performance was so good, but if anyone is in the market for a 2900 let me know, i will most likely be offing these for just what i paid.

im just at my wits end here and it seems like customer support doesnt even know. they assured me it was a power problem and once i got an ATI supported power supply i would not have this problem. i did find this in the driver release notes as a known problem with 2900's in crossfire in the most current 2900 drivers, but i did try at just 1024x768 and its still doing it.

* Setting the display resolution to 1600x1200 or above on a system containing an ATI Radeon™ HD 2900 XT no longer results in the card using 3D clock settings

does anyone know someone else with a 2900 crossfire setup, and can they check and see if both of their cards are showing full 3d clock speeds even while in a game? mine lets me make the change in amdgpuclocktool, but once i go into a game it slams the number two card down to 2d clock speeds. the heat is a dead giveaway; if someone with a crossfire setup could just look at their card temps and see if they are the same, very close, or significantly different.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
That's terrible! I'm sorry I can't help you with your crossfire setup directly.

Can you tell me whether your 3D driver settings in CCC keep resetting? I think this problem occurs when the computer goes into sleep mode. I did a clean install of Vista x64 and Catalyst 8.37.4.3, same problem. The AA and AF settings just revert to Application Preference even though CCC says they're not. These drivers are really bugging me. I'd switch to ATT, but I want to see if this stuff is resolved in the next official driver.

Hey, maybe you could switch to ATT and disable the ATI service that controls 2D/3D clocks?
 

fern420

Member
Dec 3, 2005
170
0
0
Originally posted by: nullpointerus
That's terrible! I'm sorry I can't help you with your crossfire setup directly.

Can you tell me whether your 3D driver settings in CCC keep resetting? I think this problem occurs when the computer goes into sleep mode. I did a clean install of Vista x64 and Catalyst 8.37.4.3, same problem. The AA and AF settings just revert to Application Preference even though CCC says they're not. These drivers are really bugging me. I'd switch to ATT, but I want to see if this stuff is resolved in the next official driver.

Hey, maybe you could switch to ATT and disable the ATI service that controls 2D/3D clocks?

no, my ccc settings stick just fine, its the 3d clock speeds that keep resetting to 2d speeds for me after i change them to the proper speed with amdgpuclocktool, but only on card number two. im not exactly sure how i can disable the ATI service, any suggestions?
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
1. Reboot into safe mode.
2. Start, Run, "services.msc" without the quotes.
3. Find ATI External Event Utility and set its startup type to Manual.
4. Reboot into normal mode.
5. Your cards will not automatically go into 3D clocks, so use an overclocking utility to get them there.
6. Does the second card o/c fine like this?

It's just an idea--I wouldn't waste a *lot* of time on it if you can't get it working. Some other ideas:

Have you tried any other drivers?
Have you tried switching the cards' slots?
Have you tried the second card by itself, in the primary PCIe slot?

IOW, isolate the problem: is it a driver bug, or faulty hardware?


BTW, you got one of those 5.25 PSUs, right? Is it loud or quiet, hot or warm?


I'm also having a problem with an unknown NTativrv01 device showing up in Device Manager. Media Center appears to work fine aside from an initial BSOD. I think this device only affects the capture parts of the card, although the missing driver could conceivably be an explanation as to why most deinterlacing/upscaling modes listed in CCC Avivo Quality aren't working properly yet.
 

fern420

Member
Dec 3, 2005
170
0
0
ok, did some testing today and spent about an hour on the phone with ati customer care. even their level 2 techs are about as knowledgeable as a best buy employee, hehe. they were absolutely no help, he didnt even offer any suggestions, just kept saying "i dont know". so after that phone call i decided to try a few things. i disabled crossfire, then went into amdgpuclocktool and set my 3d clock speeds to what they should be, not 500 and 500 for card two. i then enabled crossfire and loaded up a game. to my surprise, when i alt-tabbed out, both of my cards were showing full 3d clock speeds in amdgpuclocktool, although the one is still a lot hotter, and i dont even have the side on my case. it has massive airflow, yet one card is still reaching 90 degrees under load, see pic:

Clock Tool Screenie

now thats all dandy, but when i exit the game, bam, the 3d clock speeds get reset to 2d in amdgpuclocktool. but i can now change them to the correct setting and open a game and they stick. the only downfall is i have to run this little procedure every time i boot my machine, but i am showing full 3d clock speeds for both cards in game. im almost thinking this is an issue with amdgpuclocktool, and perhaps i was getting 3d clock speeds all along but the program was reporting them wrong, because even with both cards showing 3d clock speeds my 3dmark06 score did not change from what i had before, when i assumed only one card was going into 3d clock speeds.

the new cat is supposed to be out today so i hope it can help with this issue but let me just say that contacting ati customer support is literally a waste of time.



edit



yea, i think its just amdgpuclocktool that doesnt show the proper 3d clocks, because this time after a reboot i touched nothing, opened amdgpuclocktool and looked. sure enough, number two shows 2d clock speeds across the board. i expected as much, but when i loaded a game both cards jumped to 3d clock speeds, even though card two was set to only go to 500 and 500 in 3d. i think this whole time ive been having issues with amdgpuclocktool, but it still doesnt explain why that number one card is so much hotter than number two.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Just throwing out some more ideas:

Could the uneven heat be due to something else, like airflow in your case?
When you put your hand in back of the case, are you feeling the right areas?
Is the top card's fan sucking in the heated air from the bottom card's VRMs?
If you have just one card in, what are your temps?
 

fern420

Member
Dec 3, 2005
170
0
0
Originally posted by: nullpointerus
Just throwing out some more ideas:

Could the uneven heat be due to something else, like airflow in your case?
When you put your hand in back of the case, are you feeling the right areas?
Is the top card's fan sucking in the heated air from the bottom card's VRMs?
If you have just one card in, what are your temps?

after more messing around, i do believe it to be an airflow issue. these cards have the intake directly on the top, and with my Bad Axe board the cards are stacked right on top of each other, not even a 1/4 inch between them. i just think my card number one is starved for air. i stuck a couple of fans facing it and it dropped some, but theres just no room to get airflow to that number one card's intake. any word on aftermarket coolers yet?
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: fern420
Originally posted by: nullpointerus
Just throwing out some more ideas:

Could the uneven heat be due to something else, like airflow in your case?
When you put your hand in back of the case, are you feeling the right areas?
Is the top card's fan sucking in the heated air from the bottom card's VRMs?
If you have just one card in, what are your temps?

after more messing around, i do believe it to be an airflow issue. these cards have the intake directly on the top, and with my Bad Axe board the cards are stacked right on top of each other, not even a 1/4 inch between them. i just think my card number one is starved for air. i stuck a couple of fans facing it and it dropped some, but theres just no room to get airflow to that number one card's intake. any word on aftermarket coolers yet?

To test this, couldn't you swap the cards around? (card in slot 1 goes to slot 2, card in slot 2 goes to slot 1)..

If the temp follows the card, then it's a card problem; if the card that *was* cool is now hot, then it's an airflow problem..?
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
These 3dmark05 and 3dmark06 scores you're posting are kind of depressing =\ I expected more
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: bfdd
These 3dmark05 and 3dmark06 scores you're posting are kind of depressing =\ I expected more

When you look at the scores, you must consider the other components in the computers. For example, I have pointed out that I am CPU limited. The HD2900XT generally does very well in 3DMark from what I have heard.
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
Nice, looks like the CPU made quite the difference. Similar overclock to what I have.
 

fern420

Member
Dec 3, 2005
170
0
0
Originally posted by: bfdd
Nice, looks like the CPU made quite the difference. Similar overclock to what I have.

i was pretty impressed with the single card scores. i kinda wanted more from the crossfire, but im hoping as soon as both cards can be overclocked and the drivers improve, it will break 16k in 3dmark06. i could not for the life of me get a single GTS to break 12k in 3dmark06, it just barely broke 11k by a few points with the same overclocking, but everyone seems to claim the GTS stomps the 2900. i dont think any of them actually tried it, they are just joining the troll fest.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: fern420
Originally posted by: bfdd
Nice, looks like the CPU made quite the difference. Similar overclock to what I have.

i was pretty impressed with the single card scores. i kinda wanted more from the crossfire, but im hoping as soon as both cards can be overclocked and the drivers improve, it will break 16k in 3dmark06. i could not for the life of me get a single GTS to break 12k in 3dmark06, it just barely broke 11k by a few points with the same overclocking, but everyone seems to claim the GTS stomps the 2900. i dont think any of them actually tried it, they are just joining the troll fest because it doesn't.

of course, everyone made up their minds and judged the HD2900xt to be crap when the DT preview was published ... and HardOCP really screwed up the perception with their worst review ever ... i am not going back to that BS site anymore ... i guess nvidia "bought" them.

ANYWAY, we DO see AMD drivers getting better and better very quickly while progress is at a dead stop for performance improvement with nvidia drivers ...

HERE is a really strange new review ... it shows the HD2900xt blowing away the GTS in everything but games where its own drivers are obviously immature ... and catching up to the GTX:

http://www.legionhardware.com/document.php?id=650
*ALL @ 1920x1200*

Company of Heroes
8800GTX : 76.6
X2900XT : 63.4
8800GTS : 57.4

FarCry
8800GTX : 140.5
X2900XT : 133.4
8800GTS : 123.1

FEAR
8800GTX : 72
X2900XT : 55
8800GTS : 53

Lost Planet
8800GTX : 40
X2900XT : 37
8800GTS : 27

PREY
8800GTX : 139.5
X2900XT : 141.6
8800GTS : 99.8

Supreme Commander
8800GTX : 44.9
X2900XT : 30.0
8800GTS : 34.8

STALKER
8800GTX : 99.1
X2900XT : 94.6
8800GTS : 84.6

X3
8800GTX : 88.4
X2900XT : 109.2
8800GTS : 67.8
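And to put the quoted numbers in one place, here's a quick sketch (using only the FPS figures listed above, nothing beyond them) of the XT's percentage margin over the GTS in each game:

```python
# FPS pairs (X2900XT, 8800GTS) copied from the Legion Hardware
# figures quoted above, all at 1920x1200.
results = {
    "Company of Heroes": (63.4, 57.4),
    "FarCry":            (133.4, 123.1),
    "FEAR":              (55.0, 53.0),
    "Lost Planet":       (37.0, 27.0),
    "PREY":              (141.6, 99.8),
    "Supreme Commander": (30.0, 34.8),
    "STALKER":           (94.6, 84.6),
    "X3":                (109.2, 67.8),
}

def xt_margin(xt_fps, gts_fps):
    """Percentage lead (or deficit) of the XT over the GTS."""
    return (xt_fps / gts_fps - 1.0) * 100.0

for game, (xt, gts) in results.items():
    print(f"{game}: {xt_margin(xt, gts):+.1f}% vs GTS")
```

By this tally the XT leads the GTS everywhere except Supreme Commander, which matches the "immature drivers" read above.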

so whaddaya think ... is the HD2900XT a "dark horse" as i originally posted ... or should i just get the GTS?
:confused: