I am attempting to ascertain the difference in current draw (in amps) for a twin-motored engine. I know that if the motors are wired in series, you get better slow-speed performance. What I'd like to know is whether there is a difference in current draw between parallel- and series-wired motors. Short of tearing the engines down and measuring, I'm hoping someone may know whether the different wirings result in different current draws, and whether one wiring draws less than the other.

Thanks.

Whether parallel or series, for a given engine speed and load-torque each motor sees the same motor voltage and current. Let's say each motor of a twin-motor engine needs 10V and 1 Amp to run 10 MPH pulling 10 cars.

The motor drive circuit powering the series configuration supplies 20V at 1 Amp. The voltage splits and each motor gets 10V and 1 Amp.

The motor drive circuit powering the parallel configuration supplies 10V at 2 Amps. The current splits and each motor gets 10V and 1 Amp.

The motors in either configuration receive the same electrical power (10 watts each) and convert it with the same efficiency to mechanical horsepower. The motor drive circuit supplies the same total electrical power (20 watts) in either configuration, but at a different voltage and current, which has implications for the electrical components and the track power transformer.
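A minimal sketch of that arithmetic (Python; the 10 V / 1 A operating point is just the example figure above):

```python
# Per-motor operating point from the example above: 10 V, 1 A at 10 MPH with 10 cars.
V_MOTOR = 10.0   # volts across each motor
I_MOTOR = 1.0    # amps through each motor
N_MOTORS = 2

# Series: voltages add, one current flows through both motors.
v_series = V_MOTOR * N_MOTORS    # 20 V
i_series = I_MOTOR               # 1 A

# Parallel: one voltage across both motors, currents add.
v_parallel = V_MOTOR             # 10 V
i_parallel = I_MOTOR * N_MOTORS  # 2 A

print(f"Series supply:   {v_series:.0f} V @ {i_series:.0f} A = {v_series * i_series:.0f} W")
print(f"Parallel supply: {v_parallel:.0f} V @ {i_parallel:.0f} A = {v_parallel * i_parallel:.0f} W")
# Both print 20 W total - same power, just a different voltage/current mix.
```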
OK, I'll drop the other shoe.
I'm attempting to power 4 vertical can motors with one NCE408 decoder, which has a running rating of 4 amps and a momentary rating of up to 8 amps. I do not want to exceed the decoder's rating under normal operation with a long train. If either series or parallel wiring of the motors reduces the current draw, that is added insurance.
I see it a little differently. For DC circuits (motors have a slightly different twist), V = I x R. For series circuits, resistance is additive. For parallel circuits, it is an inverse relationship.

Say the transformer is set to a given voltage, let's say 16 volts, and each motor is 4 ohms.

In a series circuit, the 16 volts is dropped across both motors. Since their resistances are the same, each motor drops 8 volts and gets 2 amps of current.

V = I x R, so 16 V = I x (4 ohms + 4 ohms), or 16/8 = 2 amps. Since they are in series, the same 2 amps passes through both motors.

In a parallel connection, the effective circuit resistance is 2 ohms. Each motor sees/drops the full 16 volts, so each motor draws 16 volts divided by its 4-ohm resistance, or 4 amps. Since each motor gets 4 amps, the total circuit current is 8 amps.

This is why a locomotive with series-wired motors will move slower at 16 volts than one with parallel-wired motors: each series motor gets half the voltage, and therefore half the current, for a given supply voltage.

So, back to your original question: assuming your transformer has a voltage limit, series motor wiring will put a lower current load on the decoder. But the engine will also move slower and have a lower top end.

Think of two 6-volt light bulbs, first in series and then in parallel. If you apply 12 volts to the series bulbs, each sees 6 volts and glows normally. Put them in parallel and each bulb sees 12 volts and burns very bright until it burns out. G
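Here is the same Ohm's-law arithmetic as a quick sketch (Python, treating each motor as a plain 4-ohm resistor; this ignores back-EMF but shows the current relationships):

```python
V_SUPPLY = 16.0  # volts from the transformer
R_MOTOR = 4.0    # ohms, the assumed per-motor resistance from the example

# Series: resistances add, one current through both motors.
i_series_total = V_SUPPLY / (2 * R_MOTOR)      # 2 A through each motor and the supply
v_per_motor_series = i_series_total * R_MOTOR  # 8 V across each motor

# Parallel: each motor sees the full supply voltage, currents add.
i_per_motor_parallel = V_SUPPLY / R_MOTOR      # 4 A per motor
i_parallel_total = 2 * i_per_motor_parallel    # 8 A from the supply

print(f"Series:   {v_per_motor_series:.0f} V and {i_series_total:.0f} A per motor, "
      f"{i_series_total:.0f} A total")
print(f"Parallel: {V_SUPPLY:.0f} V and {i_per_motor_parallel:.0f} A per motor, "
      f"{i_parallel_total:.0f} A total")
```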
quote:
Originally posted by AG:
But isn't the decoder supposed to fix this problem?


The OP got his question answered so he's apparently good to go.

But I found your question most interesting. Comparing 4 motors in series vs. parallel driven by a fixed-voltage decoder, there's a 16x difference in starting current and a 4x difference in available starting torque, which could be a problem getting a long train started. Our engines don't have gear-shifting transmissions, but what about a decoder with an electronic transmission that switches between parallel and series drive of multi-motor engines? It would add maybe $5-10 in cost per decoder. Maybe include wheel-slip detection.

Of course the answer may be to buy a higher-amp decoder (More power, Scotty! as Capt. Kirk would say), but in the spirit of a forum discussion...
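To put numbers on the 16x current / 4x torque comparison, a minimal sketch (Python, idealized resistive motor model at a fixed output voltage; the voltage and resistance values are placeholders, and the ratios don't depend on them):

```python
# Starting (stall) comparison for 4 motors at a fixed output voltage V,
# each motor modeled as a resistance R. Torque per motor ~ current in that motor.
def totals(n_motors, wiring, V, R):
    if wiring == "series":
        i_total = V / (n_motors * R)       # one current through the whole string
        i_per_motor = i_total
    else:  # parallel
        i_per_motor = V / R
        i_total = n_motors * i_per_motor
    torque_total = n_motors * i_per_motor  # in units of (torque constant x amps)
    return i_total, torque_total

V, R = 14.0, 2.0  # illustrative numbers only
i_s, t_s = totals(4, "series", V, R)
i_p, t_p = totals(4, "parallel", V, R)
print(f"current ratio parallel/series: {i_p / i_s:.0f}x")  # 16x
print(f"torque ratio  parallel/series: {t_p / t_s:.0f}x")  # 4x
```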
Last edited by stan2004
quote:
there's a 16x difference in available starting torque


?? The torque of a DC can motor is directly proportional to the current in the armature. Since the series-connected motors will have half the current in each motor (compared to parallel connection), the total torque at a given track voltage will be cut in half. The maximum available torque will also be cut in half at the maximum track voltage.

Series-connected motors can give smoother starts for locomotives, such as Williams, that jackrabbit-start at the 6-8 volt minimum that most transformers start off at. The downsides are lower maximum pulling power and a major loss of pulling force if one motor of the series string slips.
Dale and Stan, you raise an interesting point. Am I willing to sacrifice pulling torque for a lower current draw? I realize that at speed, torque is less of an issue, but negotiating the 2+ percent grades our club has (one in a helix, which equates to close to 3 percent!) will require all the torque the motors can muster.

It appears testing and measuring the units under operating load, wired each way, is in order.
quote:
Originally posted by Dale Manquen:
quote:
there's a 16x difference in available starting torque


??


Thanks Dale, I modified my original post. What I meant to say was that with a fixed-track-voltage DCC decoder, all 4 motors in parallel vs. all 4 motors in series would take 16x the starting current. The starting torque would only be 4x different. OK, I think I got it right this time. Oops

Rob's 2x2 series-parallel configuration might be the best compromise.

Chris, you should be able to look up (or just measure) the resistance of your motors - the datasheet will give the stall current at some specified voltage. Then, with the 16-18V DCC track voltage less a volt or so for the decoder bridge, you can determine which configurations are even in the ballpark wrt your decoder's current ratings.
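As a sketch of that ballpark check (Python; the stall spec, track voltage, and bridge drop below are placeholder assumptions, so substitute your motor's datasheet figures and your actual track voltage):

```python
# Hypothetical datasheet figure: stall current at a specified test voltage.
I_STALL_SPEC = 5.0               # amps (placeholder - use your motor's datasheet value)
V_SPEC = 12.0                    # volts at which that stall current is specified
R_MOTOR = V_SPEC / I_STALL_SPEC  # effective armature + brush resistance (2.4 ohms here)

V_TRACK = 17.0                   # DCC track voltage (placeholder)
V_DROP = 1.4                     # rough allowance for the decoder bridge
V_AVAIL = V_TRACK - V_DROP

configs = {
    "4 in series":         V_AVAIL / (4 * R_MOTOR),
    "2x2 series-parallel": 2 * (V_AVAIL / (2 * R_MOTOR)),
    "4 in parallel":       4 * (V_AVAIL / R_MOTOR),
}
for name, i_stall in configs.items():
    print(f"{name:22s} worst-case stall draw ~ {i_stall:.1f} A "
          f"(vs the 4 A continuous / 8 A momentary decoder rating)")
```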
Another way to split the motor voltage, if the track input is AC, is to use two 6-amp diodes instead of a bridge rectifier: half-wave each motor with opposite halves of the sine wave. To do this you may have to replace or jump out the bridge in the E unit with the two diodes. Not sure how practical this is with some E units. Unlike series wiring, the voltage to each motor stays the same regardless of the load on the other motor, while each motor receives about half the power (the half-wave RMS voltage is roughly 71% of the full-wave RMS).

Dale H
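For reference, a quick numerical check of what half-waving does to the effective voltage (Python, assuming an ideal sine, an ideal diode with no forward drop, and a resistive load): the half-wave RMS comes out to about 71% of the full-wave RMS, which is half the power.

```python
import math

# Numerical check: RMS of a half-wave-rectified sine vs the full sine wave.
N = 200_000
full = [math.sin(2 * math.pi * k / N) for k in range(N)]
half = [v if v > 0 else 0.0 for v in full]

def rms(samples):
    return math.sqrt(sum(v * v for v in samples) / len(samples))

rms_full, rms_half = rms(full), rms(half)
print(f"full-wave RMS = {rms_full:.4f} x Vpeak")            # ~0.7071
print(f"half-wave RMS = {rms_half:.4f} x Vpeak")            # ~0.5000
print(f"voltage ratio = {rms_half / rms_full:.3f}")         # ~0.707
print(f"power ratio   = {(rms_half / rms_full) ** 2:.3f}")  # ~0.5 into a resistor
```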
quote:
Originally posted by Dale H:
Another way to split the motor voltage, if the track input is AC, is to use two 6-amp diodes instead of a bridge rectifier: half-wave each motor with opposite halves of the sine wave. To do this you may have to replace or jump out the bridge in the E unit with the two diodes. Not sure how practical this is with some E units. Unlike series wiring, the voltage to each motor stays the same regardless of the load on the other motor, while each motor receives about half the power (the half-wave RMS voltage is roughly 71% of the full-wave RMS).

Dale H


That's a pretty clever idea. I think the old "Eldon" lane-changing slot cars used to work that way -- AC power source to both lanes with diodes in the cars and controllers.
Isn't it true that the ultimate test for decoder capacity is total stall current?

If all motors stall for some reason and draw their max current, is it right to assume that it makes no difference whether they were wired in series or parallel?

That being said, I've had good luck using smaller decoders in O scale, but realize that their use must be limited to shorter trains, no steep grades, etc. In other words, I believe careful operation is the most important factor in decoder survival when it is operating near its limits.

However, all it takes is that one derailment that causes an engine or a car to wedge itself in a position that causes the motors to stall, but doesn't cause a decoder-saving short that kills power and "saves the day".

If the operator doesn't sense this situation immediately, they are likely to "give it the gas" and the decoder is toast. By setting the motor-control CVs, you can minimize this likelihood. Check the speed-curve programming CV values for that particular decoder to determine how to limit its max speed, which would help protect both the decoder and the motors.

These CV values can also be tweaked to minimize those jack rabbit starts.
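For reference, a minimal sketch of that kind of CV tuning (Python, using the standard NMRA motor-control CVs; the values shown are placeholders, and the exact CVs and ranges supported depend on the particular decoder, so check its manual):

```python
# Standard NMRA motor-control CVs (values 0-255); placeholder values only -
# the right numbers depend on the decoder and the locomotive.
cv_settings = {
    2: 10,   # CV2 Vstart: minimum starting voltage, tune for smooth starts
    3: 15,   # CV3 acceleration rate: higher = slower ramp-up, tames jackrabbit starts
    4: 10,   # CV4 deceleration rate: higher = slower ramp-down
    5: 180,  # CV5 Vhigh: caps top speed (limits how hard you can "give it the gas")
    6: 90,   # CV6 Vmid: shapes the middle of the simple speed curve
}

for cv, value in sorted(cv_settings.items()):
    print(f"CV{cv} = {value}")
```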

By the way, talking of decoders, this all assumes that Chris is using DCC, which usually uses a constant AC track voltage, not DC. As in any command system, motor speed is not being controlled by changing track voltage. I don't know much about the theory behind controlling motors by waveform changes, but I believe some previous responses and suggestions in this thread did not take this into account. Smile

Jim
Last edited by Jim Policastro
quote:
If all motors stall for some reason and draw their max current, it makes no difference whether they were wired in series or parallel.


I don't understand your statement. Are you implying that there is a separate active current limiter on each motor, not just the applied voltage and motor resistance at work?

The stall current of a DC can motor is equal to the applied voltage divided by the armature and brush resistance. Assuming the applied voltage remains constant, if you have two motors connected in series, the stall current for that string will be half the value(s) for a single motor. If you have two strings of two series-connected motors each, the stall current will be
2 strings x 1/2 the stall current for a single motor = the stall current of a single motor.

If the DCC controller has a separate setting for max stall current, then this will change.
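Dale's string arithmetic generalizes; a tiny sketch (Python, resistive stall model at a fixed applied voltage, with a made-up single-motor stall current just for illustration):

```python
def stall_current(i_stall_single, n_series, n_strings):
    """Total stall draw for n_strings parallel strings of n_series motors each,
    at a fixed applied voltage (resistive model)."""
    return n_strings * (i_stall_single / n_series)

I_SINGLE = 6.0  # amps, placeholder single-motor stall current
print(stall_current(I_SINGLE, n_series=1, n_strings=4))  # 4 in parallel -> 24 A
print(stall_current(I_SINGLE, n_series=2, n_strings=2))  # 2x2 -> 6 A (same as one motor)
print(stall_current(I_SINGLE, n_series=4, n_strings=1))  # 4 in series -> 1.5 A
```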
Last edited by Dale Manquen
quote:
Originally posted by Dale Manquen:
The stall current of a DC can motor is equal to the applied voltage divided by the armature and brush resistance. Assuming the applied voltage remains constant, if you have two motors connected in series, the stall current (and the stall torque) for that string will be half the value(s) for a single motor.


Hi Dale,
So in the series configuration, the stall current is half that of a single motor. But there are two motors, each driven by half the current and providing half the torque. What's an easy way to explain why those two motors (supposedly working on the same team) don't sum their half-torques to deliver full torque, so to speak? Thanks for your insight.
This has certainly been a lively discussion, and I do appreciate the insight from those better versed than I.
As I mentioned at the beginning, I wanted to ascertain the current load on a 4-amp-rated DCC decoder controlling 4 motors. Track voltage is AC, which becomes DC after the decoder. If I understand how the DCC decoder operates, it rectifies the AC to DC and varies the voltage to the motors.

What I've gleaned from this is that series wiring does lower current draw (in amps), but at the expense of the torque (read horsepower) of each motor. Whether having 4 motors is enough to compensate for the loss of torque remains to be seen.
One thing you can do as a crude test is place a slow-blow fuse of, say, 2 amps in the circuit. You need to be careful of starting current. Run the engine in parallel and see how it does when it hits your grade. Then move up to a 3-amp and a 3.5-amp fuse. If they blow, you are going to have an issue. If not, continue to use the fuse as added protection for your board.

I am not sure how expensive your decoder is. The HO ones I have seen are $24.95 to $34, but I am sure there may be more expensive ones out there. G
I guess my main question is how the fact that the motors are being controlled in a command environment affects the calculations. How does Ohm's Law apply, if it does, in the command environment with its changing waveforms?

...and, yes, the DCC CVs can be adjusted to limit performance of the motors. But, I'm not sure if this can be interpreted as "limiting the stall current" in an extreme overload condition as I described.

A further question I've always wondered about is how you would measure circuit variables with the various command systems. For example, is a measurement of AC current supplied to the track with an AC meter what we care about as far as decoder "health" is concerned?

Or is it really still the DC current, measured with a DC meter between the decoder and the motor, that is the critical measurement? ...or should both really come out to be the same current measurement if only the one engine is operating at the time?

Life was so easy electrically, if not operationally, before command control systems and their "interesting" waveforms. This was brought up, but not really conclusively explained, in a recent thread about using ammeters and voltmeters in a command environment. Smile

Jim
Jim,
In a DCC environment, the hazardous load condition should occur between the decoder and the motor(s). Since the input voltage from track to decoder is constant AC, a current-overdraw condition should show up on the DC output side. The exception may be a derailment short, which pulls max amps from the DCC booster and overloads the input side.

...or someone can correct me if I'm mistaken.
Originally Posted by gunrunnerjohn:
I suspect the reason that series motors would give better low speed control would be that you'd have to change the input voltage twice as much to get the same change in torque from the motors.

However, running with TMCC with cruise control, I never gave this much thought...