I know volts * amps = the total wattage a transformer can put out. My question is: does the available wattage go down as you decrease the transformer's output voltage? For example, say you have a transformer rated at 180 watts (18 volts, 10 amps). If you set it to 12 volts output, are you then only getting 120 watts, since 12 * 10 = 120? Is this correct?

I always assumed a 180 watt transformer gave you that full amount of wattage regardless of how many volts you set it to put out.
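As a quick check of the arithmetic, here is a minimal Python sketch, under the assumption (borne out in the reply below) that the rated 10 amps is what stays fixed when you turn the voltage down:

```python
# Available wattage at each voltage setting, assuming the transformer's
# 10 amp current limit (from its 180 W / 18 V rating) stays fixed when
# the output voltage is turned down.
RATED_AMPS = 10

for volts in (18, 12, 9):
    watts = volts * RATED_AMPS  # P = V * I
    print(f"{volts} V * {RATED_AMPS} A = {watts} W maximum")
```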

Original Post

Replies sorted oldest to newest

Generally, maximum wattage output goes down roughly in proportion to voltage. If a transformer puts out, say, a maximum of 180 watts at 18 volts, it will usually put out no more than about 90 watts at 9 volts. There are two reasons: 1) what stays fixed is the current limit, not the wattage, so the available power scales roughly linearly with the output voltage, whether the transformer is a traditional design or an electronic one; and 2) the maximum rating is generally set just under the breaker rating of the power supply. For example, my Z4000 activates its "breaker" (the red light goes out and it shuts down the voltage) at just a smidgen over ten amps. So it can put out about 18 volts at 10 amps, but still no more than 10 amps at 9 volts; even if it could deliver more, the breaker would open at about 10.1 amps. A CW-80 works the same way, only with about a 5 amp breaker limit and a maximum of around 16 volts (16 x 5 = 80).
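To make that concrete, here is a minimal Python sketch of the fixed-current-limit model described above. The function names are illustrative, and the roughly 10 amp and 5 amp trip points are the approximate figures from this reply, not published specs:

```python
def max_watts(volts: float, breaker_amps: float) -> float:
    """Power available at a given voltage setting before the breaker opens.

    Assumes the trip point (in amps) is the same at any throttle setting,
    so available wattage scales linearly with the voltage you dial in.
    """
    return volts * breaker_amps

def breaker_trips(load_amps: float, breaker_amps: float) -> bool:
    """True once the load current exceeds the breaker's trip point."""
    return load_amps > breaker_amps

# Z4000-style supply: trips just over 10 amps, regardless of voltage.
print(max_watts(18, 10))        # 180 W available at 18 V
print(max_watts(9, 10))         # 90 W available at 9 V
print(breaker_trips(10.1, 10))  # True -- the breaker opens at 9 V or 18 V alike

# CW-80-style supply: ~5 amp limit, ~16 volt maximum (16 x 5 = 80 W).
print(max_watts(16, 5))         # 80 W
```

The key point is that the trip current, not the wattage, is the constant: halve the voltage and you halve the power you can draw before the breaker opens.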
