I have gone to using LEDs for all the lighting on my layout, using them for all my buildings, signage, etc. I had read that a power supply from a computer works well. I will be installing over 350 LEDs and wanted the advice of anyone who has done this: how well does it work, how many LEDs can you install on one power supply, and are there any other bugs or trip-ups to using them?
To be clear, have you already selected and installed candidate LEDs for whatever building styles you have? And you are now asking about "scaling up" to 350 LEDs?
In round numbers, a single LED used for building lighting consumes about 50 mW (0.05 W). So 350 x 50 mW = 17.5 W, which is well within the capability of a computer supply, which is typically rated to deliver hundreds of watts.
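The power-budget math above can be sketched as a few lines of Python. The per-LED figure and the supply rating are rough assumptions, not datasheet values:

```python
# Rough power-budget check for LED building lighting.
# All numbers are ballpark assumptions, not from any datasheet.
POWER_PER_LED_W = 0.05   # ~50 mW per LED, typical for dim building lighting
NUM_LEDS = 350

total_w = NUM_LEDS * POWER_PER_LED_W
print(f"Total LED load: {total_w} W")  # 17.5 W

# Compare against a modest computer supply, assumed 300 W for illustration.
SUPPLY_W = 300
print(f"That is {100 * total_w / SUPPLY_W:.1f}% of a {SUPPLY_W} W supply")
```

Even doubling the per-LED estimate leaves the supply loafing, which is the point of the paragraph above.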
For 350 LEDs, I'd think it quite the chore to assemble and solder 350 resistors to 350 "loose" LEDs.
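For anyone who does go the loose-LED route, each one needs a series resistor sized by Ohm's law. A minimal sketch, where the forward voltage and current are assumed typical values you'd need to confirm against your actual LEDs:

```python
# Series-resistor sizing for one "loose" LED off the 12 V rail.
# Forward voltage and current below are assumptions; check your LED's specs.
V_SUPPLY = 12.0   # volts, the PSU's 12 V output
V_FORWARD = 3.0   # typical white-LED forward voltage (assumed)
I_LED = 0.010     # 10 mA, a dim-but-safe building-light current (assumed)

r_ohms = (V_SUPPLY - V_FORWARD) / I_LED
p_resistor_w = (V_SUPPLY - V_FORWARD) * I_LED
print(f"Series resistor: {r_ohms:.0f} ohms, dissipating {p_resistor_w:.3f} W")
# 900 ohms -> round up to the next standard value, e.g. 1k, for a dimmer, safer LED
```

Multiply that little calculation by 350 and the appeal of pre-resistored strips becomes obvious.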
Or maybe you've already discovered LED strips, which can be powered by either 5V or 12V DC, two voltages available from any computer power supply. The resistors are already mounted on the strips. Typical 12V and 5V strips are shown above. For 12V strips, you cut the strip in multiples of 3 LEDs, apply 12V DC, and you're done. For 5V strips, you cut the strip in multiples of 1 LED, apply 5V DC, and you're done. No need to fuss with selecting resistors. Dimming or adjusting brightness is done by adjusting the voltage, for which there are options.
IMO, first choose the LED type/style/packaging that works for your buildings. THEN select the power supply.
Whether on a strip or loose, expect to pay, say, a nickel per LED. A computer power supply rated in the hundreds of watts runs maybe 5-10 cents per watt.
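Putting those rough prices into numbers (the per-LED cost and supply rating are just the ballpark figures above, and the cost per watt is taken at the midpoint of the 5-10 cent guess):

```python
# Back-of-envelope cost estimate, using the rough prices mentioned above.
NUM_LEDS = 350
COST_PER_LED = 0.05     # ~a nickel per LED, strip or loose (assumed)
SUPPLY_WATTS = 300      # example supply size (assumed)
COST_PER_WATT = 0.075   # midpoint of the 5-10 cents/W guess

led_cost = NUM_LEDS * COST_PER_LED
supply_cost = SUPPLY_WATTS * COST_PER_WATT
print(f"LEDs: ${led_cost:.2f}, supply: ${supply_cost:.2f}")  # LEDs: $17.50, supply: $22.50
```

So the whole lighting project lands somewhere around forty dollars, give or take, before wire and connectors.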