Most amateur radio transceivers and ham radio go-boxes use improperly sized power cables. Choosing the correct wire size, or gauge, for your DC power cables will improve performance and reduce risk in your ham radio setup.
Picking a wire gauge can be a challenge. 12 volt DC power can lose a lot of voltage over distance, so choosing the right wire gauge makes a big difference. And the greater the current we draw through a wire, the greater the voltage drop across it. The general rule for DC power is to keep the voltage drop under 3% for the length of your cable. If the drop is greater than 3%, choose a larger wire. This will properly power your equipment and keep the current draw within the specifications of the wire. Overloading a wire is inefficient at best and dangerous at worst.
So how do we know what size wire to use? Simply stated, voltage drop is the current multiplied by the total resistance of the circuit conductors. You know, Ohm's law (VD = I x R). To make this calculation, you'll need the total resistance of the circuit, which depends on the wire size or gauge, the total length of the conductor (for a two-conductor power cable, that's the round-trip length), and the resistivity of the wire material.
The easiest way to determine voltage drop, without measuring resistance with a meter or looking up values in tables, is to use one of the various online calculators that do the math for you. One of my favorites can be found at RapidTables.com. I'll put links to these calculators in the video description.
To use the calculator, you select your wire type and size, cable length, current type, voltage, and maximum amperage. The calculator then tells you the voltage drop, the percentage of drop, and the wire resistance.
Say I want a 6 foot power cable for my transceiver. If I select 12 gauge wire and enter the length, DC current, 13 volts (the nominal voltage of my battery), and 20 amps (my maximum current draw), I get a voltage drop of .38 volts, or 2.93 percent. That's just under the 3% target, so 12 gauge wire is the proper size.
If I change the wire size to 14 gauge, the voltage drop climbs to 4.7%, which is outside the recommendation. I'd either have to shorten the cable or reserve it for lower-current loads.
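If you'd rather do the calculator's math yourself, it can be sketched in a few lines of Python. The resistance table below uses nominal resistances for copper wire (ohms per 1000 feet at room temperature), and the function name is mine, not from any particular calculator:

```python
# Nominal copper wire resistance, ohms per 1000 feet at 20 C (approximate).
AWG_OHMS_PER_1000FT = {10: 0.999, 12: 1.588, 14: 2.525}

def voltage_drop(awg, cable_ft, amps, volts):
    """Return (drop in volts, drop in percent) for a DC power cable.

    A power cable has a positive and a negative conductor, so the current
    travels twice the cable length (the round trip).
    """
    resistance = AWG_OHMS_PER_1000FT[awg] * (2 * cable_ft) / 1000.0
    drop = amps * resistance
    return drop, 100.0 * drop / volts

# The transceiver example: 6 ft cable, 13 V battery, 20 A maximum draw.
print(voltage_drop(12, 6, 20, 13))  # ~0.38 V, ~2.9% -> under the 3% target
print(voltage_drop(14, 6, 20, 13))  # ~0.61 V, ~4.7% -> too much drop
```

Run with the numbers from the example above, this reproduces the 2.93% figure for 12 gauge and the 4.7% figure for 14 gauge.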
For a couple of real-life examples: in my camping trailer I'm using 12 gauge wire from my solar panel to the charge controller. The cable run is about 14 feet long, and the 200 watt panel puts out close to 12 amps. At the panel's 17 volt operating voltage, that's slightly over the 3% voltage drop (I honestly should be using 10 gauge wire), but close enough to be within safety tolerances. I also have a 100 watt portable solar panel, and with that I use about 20 feet of 14 gauge wire to run from the panel to the charge controller. Since that panel outputs between 5 and 6 amps at 17 volts, that length of 14 gauge wire is within the 3% tolerance.
For powering equipment, the amperages are much higher, so I use a larger cable. My power cables from the battery to the transceiver are all 12 gauge. This will handle 23 amps at 13 volts, which is the maximum current draw of my FT-891 at 100 watts transmit. While I usually transmit at lower power, I did the calculation for the maximum power I would generate.
In my vehicle, I ran 10 gauge cable from the battery to the 50 watt mobile transceiver. I used 12 feet of cable, but at that length it's rated for 40 amps, which gives me the capacity to run two transceivers without adding a second cable. For battery-to-radio connections, it's fine to be oversized. Just make sure the circuit is properly fused for the current load of your cable.
Now, it should be noted that in commercial photovoltaic solar systems, installers use a slightly different metric and strive for no more than a 2% voltage drop in their DC circuits. This has been a hard-and-fast rule for a very long time, and it traces back to the days when solar cells were expensive and copper was cheap, so spending a bit more money on copper wiring was a relatively inexpensive way to increase system efficiency. Now that solar cells are much cheaper, studies have shown that modern solar systems can tolerate a greater voltage drop without a correspondingly greater loss in efficiency.
So what does that mean for the average ham wanting to use solar to charge their go-box battery? The 2% voltage drop goal is a good one to meet in order to optimize performance, but in reality the 3% limit is easier and cheaper to achieve and will still keep resistive heating losses low.
So to recap: voltage drop in a wire comes from the resistance of the conductor, which is determined by wire size, length, and material. DC power cables should be sized to carry the expected current draw of the load. Longer cables require a thicker conductor. And to minimize resistive losses, you want a voltage drop of no more than 3 percent.
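That recap can be turned around into a gauge picker: instead of checking one wire size at a time, start with the thinnest common gauge and step up until the drop meets the target. This is my own sketch, again using nominal copper resistances, and the function name is illustrative:

```python
# Nominal copper wire resistance, ohms per 1000 feet at 20 C (approximate).
AWG_OHMS_PER_1000FT = {8: 0.628, 10: 0.999, 12: 1.588, 14: 2.525, 16: 4.016}

def smallest_gauge(cable_ft, amps, volts, max_drop_pct=3.0):
    """Return the thinnest listed AWG size that meets the drop target."""
    for awg in sorted(AWG_OHMS_PER_1000FT, reverse=True):  # thin to thick
        ohms = AWG_OHMS_PER_1000FT[awg] * (2 * cable_ft) / 1000.0
        if 100.0 * amps * ohms / volts <= max_drop_pct:
            return awg
    raise ValueError("no listed gauge is thick enough; shorten the run")

print(smallest_gauge(6, 20, 13))   # -> 12, the transceiver cable example
print(smallest_gauge(14, 12, 17))  # -> 10, the trailer solar run
```

Note that for the 14 foot trailer run it recommends 10 gauge, matching my admission above that I should have used 10 gauge there instead of 12.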
Like what you see? You can leave me a tip:
Become a patron! Unlock exclusive content at: https://www.patreon.com/kb9vbrantennas
Support Ham Radio Q&A by shopping at Amazon: http://amzn.to/2kO6LH7