Is there a difference in maximum amperage depending on how you use an old ATX power supply? I’ve searched this forum but couldn’t find anything. If I’ve overlooked it, I sincerely apologize.
I’ve got an old 20-pin, 350 watt ATX power supply. The maximum output ratings are 15 A @ +12 V, 35 A @ +5 V, and 20 A @ +3.3 V.
For instance, on the +5 V rail, which method (power board vs. DIY) will get me closer to the max 35 A? My power supply has eight (8) red (+5 V) wires, and four (4) of them go to the 20-pin ATX plug/connector. I’m not an electrician, but the way I understand it, with eight +5 V wires, each wire would carry roughly 4.3-4.4 amps.
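Here’s a rough sanity check on that per-wire figure, assuming the rail’s rated current just splits evenly across its wires (I’m ignoring wire gauge, which is what actually sets a wire’s safe limit):

```python
# Quick sanity check on the per-wire share, assuming the +5 V rail's rated
# current splits evenly across its wires (real limits depend on wire gauge).
RAIL_5V_MAX_AMPS = 35.0  # +5 V rail rating from the PSU label
TOTAL_5V_WIRES = 8       # red wires coming out of my supply

amps_per_wire = RAIL_5V_MAX_AMPS / TOTAL_5V_WIRES
print(f"Per-wire share: {amps_per_wire:.2f} A")  # prints ~4.38 A
```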
Now, with the ATX power board, I’m assuming it only uses the 4 red/+5 V wires in the ATX connector, so the +5 V output would handle roughly 17-18 amps. With the DIY method and a terminal block, all 8 wires would be used, even if it’s 2 bundles of 4 wires each, twisted together, crimped into a U-shaped terminal, and screwed onto the terminal block. By using all 8 wires, wouldn’t that be closer to utilizing the 35 A maximum?
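This is the comparison I’m picturing, under the same even-split assumption (the 4-wire vs. 8-wire counts are just from my supply’s wiring):

```python
# Comparing the two hookup methods under the same even-split assumption:
# the power board only taps the 4 reds in the 20-pin connector, while the
# DIY terminal block bundles all 8 reds together.
amps_per_wire = 35.0 / 8               # ~4.38 A per red wire

power_board_amps = 4 * amps_per_wire   # 4 wires in the ATX connector
diy_terminal_amps = 8 * amps_per_wire  # all 8 wires on the terminal block

print(f"Power board (+5 V): ~{power_board_amps:.1f} A")          # ~17.5 A
print(f"DIY terminal block (+5 V): ~{diy_terminal_amps:.1f} A")  # ~35.0 A
```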
35 A is a LOT of current and I’m nowhere near needing that much. I’m just trying to understand the difference between the two methods of using an ATX power supply and which one, if either, can handle more amps.
Again, I apologize if this has been discussed before.