Isn't it true that the ultimate test for decoder capacity is total
stall current?
If all motors stall for some reason and draw their max current, is it right to assume that it makes no difference whether they were wired in series or parallel?
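For what it's worth, a rough back-of-envelope check suggests the wiring does matter at stall. Treating each stalled motor as a plain resistor (armature resistance only, with made-up numbers for voltage and resistance):

```python
# Rough stall-current comparison for two identical motors on one decoder.
# Assumes each stalled motor acts like a simple resistor; the 12 V supply
# and 2-ohm armature resistance are hypothetical illustration values.
V = 12.0   # decoder output voltage (volts, assumed)
R = 2.0    # armature resistance of one stalled motor (ohms, assumed)

parallel_current = 2 * (V / R)   # each motor sees full voltage; currents add
series_current = V / (2 * R)     # one current path through both motors

print(f"parallel stall current: {parallel_current:.1f} A")  # 12.0 A
print(f"series stall current:   {series_current:.1f} A")    # 3.0 A
```

Under this simple model, two parallel motors stall at four times the current of the same two motors in series, so the decoder rating that matters depends on how they are wired.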
That being said, I've had good luck using smaller decoders in O scale, but realize that their use must be limited to shorter trains, no steep grades, etc. In other words, I believe careful operation is the most important factor in survival when a decoder is operating near its limits.
However, all it takes is one derailment that wedges an engine or a car in a position that stalls the motors without causing a decoder-saving short that kills power and "saves the day".
If the operator doesn't sense this situation immediately, they are likely to "give it the gas" and the decoder is toast. By setting the motor control CVs, you can minimize this likelihood. Check the speed curve programming CV values for that particular decoder to determine how to limit its max speed, which helps protect both the decoder and the motors.
These CV values can also be tweaked to minimize those jack rabbit starts.
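For decoders that follow the NMRA-standard CV numbering, the usual knobs are CV2 (start voltage), CV3/CV4 (acceleration/deceleration momentum), and CV5/CV6 (top and mid speed). A sketch of what such a setup might look like; the values below are made-up examples, not recommendations, so check your decoder's manual for its actual range and defaults:

```python
# Illustrative motor-control CV settings using NMRA-standard CV numbers.
# Values are hypothetical examples only -- consult the decoder manual.
cv_settings = {
    2: 8,     # CV2  Vstart: minimum voltage applied at speed step 1
    3: 10,    # CV3  acceleration rate: higher = slower ramp-up,
              #      which tames "jack rabbit" starts
    4: 10,    # CV4  deceleration rate (braking momentum)
    5: 160,   # CV5  Vhigh: caps top speed (255 = full output)
    6: 80,    # CV6  Vmid: shapes the middle of the speed curve
}

for cv, value in sorted(cv_settings.items()):
    print(f"CV{cv} = {value}")
```

Lowering CV5 is the single most direct way to keep an operator from overdriving a decoder that is near its current limit.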
By the way, talking of decoders, this all assumes that Chris is using DCC, which keeps a constant-amplitude AC-like square wave on the track, not variable DC. As in any command system, motor speed is not controlled by changing the track voltage: the speed command is encoded digitally in the waveform, and the decoder rectifies track power and drives the motor itself. I don't know all the theory behind it, but I believe some previous responses and suggestions in this thread did not take this into account.
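To make the distinction concrete, here is a toy illustration of the decoder side: the track voltage stays constant, and the decoder maps the commanded speed step to a pulse-width-modulation duty cycle applied to the motor. Real decoders layer speed curves, back-EMF compensation, and so on over this; the function below is purely illustrative.

```python
# Toy model: a DCC decoder converts a 128-speed-step command into a PWM
# duty cycle on the (rectified) motor output. Track voltage never changes;
# only the duty cycle the decoder applies to the motor does.
def speed_step_to_duty(step: int, max_step: int = 126) -> float:
    """Return a PWM duty cycle (0.0-1.0) for a 128-speed-step command."""
    step = max(0, min(step, max_step))   # clamp to the valid range
    return step / max_step

print(speed_step_to_duty(0))     # 0.0  (stopped)
print(speed_step_to_duty(63))    # 0.5
print(speed_step_to_duty(126))   # 1.0  (full speed)
```

This is why advice aimed at variable-voltage DC throttles doesn't carry over directly to a DCC layout.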
Jim