I have four LEDs rated at 1.35 V and 100 mA. They need to be powered off USB, so 5 volts. Normally I've used a current-limiting resistor to
protect things, but I can also see that 4 x 1.35 V is more than the supply voltage.
So first thing: is a current-limiting resistor required if I run them in series? If it's safer to have a resistor, what combination of series and
parallel would be best, and how do I calculate the value of the resistor? I have a decent range of resistors, but they're all 1/4 or 1/8 watt, so
I'll need to keep the overall power down. It'll save my USB port too!
try this
http://led.linear1.org/led.wiz
quote:
Originally posted by ReMan
try this
http://led.linear1.org/led.wiz
Worth noting that the more you can put in series the better, as the total power wasted in the resistor will be lower. Production variation can limit this, though: once you get to 5+ in series you can see serious variation across them. In this case I'd be going 2x2.
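A quick sanity check on the 2x2 suggestion (a sketch only, assuming the 1.35 V / 100 mA figures from the original post and a forward drop that doesn't move much with current):

```python
# Resistor sizing for LED strings on a 5 V USB rail.
# Figures from the thread: each LED drops about 1.35 V at 100 mA.

V_SUPPLY = 5.0    # USB rail, volts
V_F = 1.35        # nominal forward drop per LED, volts
I_STRING = 0.100  # target current per string, amps

def string_resistor(n_series):
    """Resistor value and dissipation for one string of n_series LEDs."""
    v_drop = V_SUPPLY - n_series * V_F  # voltage left across the resistor
    if v_drop <= 0:
        return None                     # no headroom left to set the current
    r = v_drop / I_STRING               # Ohm's law: R = V / I
    p = v_drop * I_STRING               # power burned in the resistor: P = V * I
    return r, p

for n in (1, 2, 3, 4):
    result = string_resistor(n)
    if result is None:
        print(f"{n} in series: not enough headroom at 5 V")
    else:
        r, p = result
        print(f"{n} in series: R = {r:.1f} ohm, P = {p:.3f} W per resistor")
```

That works out to roughly 23 ohm and 0.23 W per resistor for two strings of two, which is right at the edge of a 1/4 W part; a 1/2 W resistor per string (or two 47 ohm 1/4 W parts in parallel) leaves more margin. Total draw is about 200 mA, comfortably inside a standard USB port's 500 mA.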
Agreed
In fact, depending on these LEDs, you may get away with 4 in series and no resistor
quote:
Originally posted by ReMan
Agreed
In fact, depending on these LEDs, you may get away with 4 in series and no resistor
Just be wary that with no limiting resistor, if one LED goes into thermal runaway you will pop the supply. With nothing but the LEDs' own forward-voltage curves setting the current, one LED failing short leaves the other three across the full 5 V and the current climbs fast.
quote:
Originally posted by coyoteboy
Just be wary that with no limiting resistor, if one LED goes into thermal runaway you will pop the supply.

Could put a 250 mA fuse in series.
quote:
Originally posted by ReMan
Agreed
In fact, depending on these LEDs, you may get away with 4 in series and no resistor
quote:
Originally posted by gremlin1234
quote:
Originally posted by coyoteboy
Just be wary that with no limiting resistor, if one LED goes into thermal runaway you will pop the supply.

Could put a 250 mA fuse in series.