Sunday, March 27, 2016

Hacking GPU PCIe power connections


Until recently, I never thought much about PCIe power connectors.  Three 12V power and three ground wires was all I thought there was to them.  I thought it was odd that the 8-pin connectors just added two more ground pins and not another power pin, but I never bothered to look into it.  That all changed when I got a new GPU card with a single 8-pin connector.

My old card had two 6-pin connectors, which I had plugged a 1-to-2 18AWG splitter cable into.  That was connected to a 16AWG PCIe power cable, which is good for about 200W at a drop of under 0.1V.  My new card with the single 8-pin connector wouldn't power up with just a 6-pin plug installed.  Using my multimeter to test for continuity between the pins, I realized that it's not just a row of 12V pins and a row of ground pins.  There was continuity between the three 12V pins, and between three of what I thought were five ground pins.  After searching for the PCIe power connector pinout, I found out why.
Diagram edited from http://www.overclock.net/a/gpu-and-cpu-power-connections

Apparently some 6-pin PCIe cables have only two 12V wires, two grounds, and a grounded sense wire (blue in the diagram above).  With just two 12V wires, a crap 18" 20AWG PCIe power cable would have a drop of over 0.1V at 75W.  Since the 8-pin connector has three 12V pins, it can provide 50% more power.  My 6-pin 16AWG PCIe cable would have a voltage drop of only 40mV at 75W, so I just needed to figure out a way to trick the GPU card into thinking I had an 8-pin connector plugged in.  The way to do that is to ground the 2nd sense pin (green in the diagram above).
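Those drop figures are easy to sanity-check. Here's a rough sketch, not a measurement: the per-metre resistances are standard copper AWG values, while the cable lengths (18" for the cheap cable, 24" for mine) and even current sharing between parallel wires are my assumptions:

```python
# Rough round-trip voltage drop estimate for a PCIe power cable.
# Copper resistance per metre (standard AWG values).
OHM_PER_M = {16: 0.0132, 18: 0.0210, 20: 0.0333}

def vdrop(load_w, awg, length_m, n_wires, rail_v=12.0):
    """Drop across the 12V supply wires plus the matching return (ground) wires."""
    amps = load_w / rail_v                           # total 12V rail current
    r_one_way = OHM_PER_M[awg] * length_m / n_wires  # parallel wires share the load
    return amps * 2 * r_one_way                      # out and back

# Cheap 18" (0.46 m) 20AWG cable with only two 12V wires, at 75W:
print(round(vdrop(75, 20, 0.46, 2), 3))   # 0.096 -- right around 0.1V
# A 24" (0.6 m) 16AWG cable with three 12V wires, at 75W:
print(round(vdrop(75, 16, 0.60, 3), 3))   # 0.033 -- the same ballpark as 40mV
```

Scaling the 16AWG result to 200W gives about 88mV (0.033 × 200/75), consistent with the "about 200W at under 0.1V" figure above.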

I didn't want the modification to be permanent, so soldering a wire to the sense pin was out.  The PCIe power connectors use the same kind of pins as ATX power connectors, and I had an old ATX power connector I had cut from a dead PSU.  To get one of the female contacts out, I cut the ATX connector apart with a hack saw.  Not pretty, but I'm no maker, I'm a hacker. :-)  I stripped the end of the wire (red in the first photo) and wrapped the bare part around the screw that holds the card bracket in the case.  I powered up the computer, and the video card worked perfectly.

Looking for a cleaner solution, I decided to make a jumper wire to go between the sense pin and the adjacent ground.  I also did some searching on better ways to remove the female contacts from the connectors.  For this, overclock.net has a good technique using staples.  When the staples aren't enough to get the contacts out, I found a finish nail counter-sink punch helps.

Here's the end result, using a marrette (wire nut) to make the jumper:

UPDATE:
See my related post Powering GPU Mining Rigs.

17 comments:

  1. Awesome! I'll give it a try. Thank you :)

    ReplyDelete
  2. Thanks for the info on the pinouts, also the link to the staple tip! Resolved a couple of issues for me. Actually, though, the 8-pin connector provides only 50% more power; that is 150% as much as the 6-pin. People get into trouble a lot when using percentages for comparison.

    ReplyDelete
    Replies
    1. Good eye. I've been meaning to correct that typo for ages, and finally did.

      Delete
    2. You're wrong. The PCIe 6-pin connector is rated up to 75W, whilst the 8-pin connector is rated up to 150W, so it provides two times the power (200%, 100% more).

      Delete
    3. You're actually wrong about the connector. Read my response below from Nov 18. The PCIe standard de-rates the power into the 6-pin connector, even though each pin is physically identical to the pins used on the 8-pin connector. Since the 8-pin connector has 3 pins for 12V and the 6-pin connector has 2, the 6-pin connector can handle 2/3rds (67%) of the power that the 8-pin connector can.

      Delete
  3. Funny, because certain configurations I have been coming across at work prompted me to look this up. We have end-users that will often purchase GTX 1080s to stick into our machines, typically Precision 5810 and 3600 models. The last guy I saw set one of these up plugged in the 6-pin connector, and I told him it probably wasn't going to POST because the card requires an 8-pin connector and these Precisions only come with a 6-pin connector by default. We decided to try it, and it indeed displayed and worked.
    Now, I don't know that it's performing optimally, and these are developers and engineers that do leverage the power of these cards, which is why they have them... but is there any way to tell if these are being used to full capacity? I would imagine that if the 6-pin connector wasn't going to work you would get no display or poor performance, and I've heard no complaints.

    I read an article stating that you can indeed deliver 150 watts over a 6-pin connector, but 6-pin connectors are only rated at 75 watts. So does that mean there are circumstances where a graphics card can draw 150 watts of power over six pins? Or was that just a hypothetical notion?

    ReplyDelete
    Replies
    1. The 75 watts for a 6-pin is just a standard; they are supposed to be designed with thick enough wires and pins to handle 75W without overheating. Lots of them can easily handle 100W or more. Some cheap crap 6-pin power cables might even melt at 50W, especially if the wires aren't sized correctly:
      http://nerdralph.blogspot.ca/2016/06/when-does-18-26-when-buying-cheap-cables.html
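For a back-of-the-envelope check, the heating in each wire scales as I²R. A rough sketch (the per-metre resistances are standard copper AWG values; the 18" run, two 12V wires, and even current sharing are assumptions on my part):

```python
# Back-of-the-envelope I^2*R heating per 12V wire in a PCIe power cable.
# Copper resistance per metre (standard AWG values).
OHM_PER_M = {16: 0.0132, 18: 0.0210, 20: 0.0333, 26: 0.133}

def watts_per_wire(load_w, awg, length_m=0.46, n_wires=2, rail_v=12.0):
    amps = load_w / rail_v / n_wires          # current in each 12V wire
    return amps**2 * OHM_PER_M[awg] * length_m

for awg in (16, 18, 20, 26):
    print(f"{awg} AWG: {watts_per_wire(75, awg):.2f} W of heat per wire")
```

A real 16AWG wire barely warms up at 75W, while a 26AWG wire sold as 18AWG dissipates roughly ten times as much heat over the same run, which is the problem described in the linked post.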

      Delete
    2. Hi, so I did some research on the matter and found your findings to be perfectly in line with what I've found elsewhere and already knew. As a general rule PSU 12V rails can do 18A, which equates to far more than the mere 150 watts that 8-pins are rated for. I know you know that already though; I wouldn't recommend running either at 20A, but if you, for instance, did a BIOS hack on a GPU to further overclock it, you could easily add 5-20 watts per 12V rail and be fine :)

      Delete
    3. Like Ralph said, it's nothing more than a standard and partially a MASSIVE safeguard. You could more than likely pull 150 watts through a 6-pin on your average PSU's 12V and be fine; 5-20 watts over 75 would be more than safe if you were interested in a slight BIOS hack. NEVER pull over 75 watts through the PCI-E slot though; as far as I can tell that WILL burn out the slot.

      Delete
  4. Hi Ralph,

    Nice post. I'm struggling with the reverse problem: on the eGPU forum, owners of the Aorus Gaming Box, which originally comes equipped with a GeForce GTX 1080, are swapping this enormous GPU for a less powerful card, i.e. a 1060, 1050, or even AMD cards. The problem is that the eGPU box senses the 8-pin connector, and it will refuse to boot with a card that uses a 6-pin or no additional power connector at all.
    It seems that we need to add some load on this 8-pin PCIe plus some sense jumpering. From your pinout diagram I'm thinking of the following (please give me your thoughts):
    - Jumper pins 7-8
    - Add a 12V fan as a load across pins 1-6.
    More info about the problem:
    https://egpu.io/forums/psu-cables/power-up-rx-580-mini-itx-with-aorus-gaming-box/paged/2/

    Best regards and thank you

    ReplyDelete
  5. Hi guys,

    very informative indeed. I have a question though, as I want to be 100% sure I won't fry anything by bridging those 2 pins of the graphics card. I have an Asus Turbo White 970 and I want to put it in an HP Z600 workstation, which comes with a proprietary 650W PSU and has only one 6-pin output (and a Molex). I read a lot about people using the 6-to-8 adapter and I have one myself, but I would rather go with this setup, as the mentioned adapter has a suspiciously thin gauge... would it work?
    Thank you in advance

    ReplyDelete
    Replies
    1. Are you suggesting there is something special about the Asus 970's power connector?
      If not, then it would be no different than what I did with the Gigabyte R9 380 card you can see in the photo.

      Delete
  6. Just a tiny little edit: the first pinout diagram is a 6+2 pin PCIe, not an 8-pin PCIe. The difference is in the round/square socket configurations; an 8-pin PCIe would have socket #8 as square.

    ReplyDelete
  7. Just purchased an RX 590, and to my surprise I lost the goddamn 8-pin cable. I reached this page, modded an 8-pin extension cable that was lying around, proceeded to make a jumper cable out of it, and boom, now I can game with just two 6-pin connectors. Thanks... :)

    ReplyDelete
    Replies
    1. Then it can't be running at full capacity. The extra 2 wires should be fed back into the other ground wires, but I am not sure if the pinning matters. There are 6-to-8-pin adapters, and if you look at pictures of these you always see the 2 extra wires feeding back into the other ground wires. I bought a dual-fan 6GB EVGA 1060 and I don't have the 8-pin, so I haven't installed it, and I don't think this is a proper solution. I am ordering a new PSU. It really sucks; I spent so much money getting this card, was so excited, and now I can't even use it yet.

      Delete
  8. OK so I bought a new video card and it's an 8-pin, and I have a 6-pin, so I started hunting for a solution. I figured I'd simply buy a 500 watt power supply, but what I noticed is that the 500 watt supply has a 6-pin with an extra 2 pins separated, and those two wires simply lead back into the 6-pin connection. I have a 350 watt PSU and I only run 2 SATA hard drives off it, which is why I never bought a new one. I don't see any reason why I can't copy what this power supply does, or what the 6-to-8-pin adapters do. The problem I have is I can't tell if the 2 extra wires have a specific slot. https://www.ebay.ca/itm/Rosewill-Valens-500W-Gaming-Power-Supply-80-Gold-Certified-FREE-SHIPPING/173894581280?hash=item287cecbc20:g:dl0AAOSwpUVcz4BX

    ReplyDelete
  9. I had to debug my Radeon 5500 XT in the same way. In the end I connected the 6-pin PCIe to the GPU and connected the leftover pins together, but I only figured it out after looking at an 8-pin PCIe connector that had the pin in the row with the yellow 12V cables connected to ground. My card, with 4GB of memory, draws about 300W together with the computer, measured with a multimeter, while running deep-learning stress test software on the GPU.

    ReplyDelete