From what I've read, I used to think that LEDs have a constant voltage drop across them at all times. People or texts will say things like "This diode has a drop of X volts."
But I realized from a little experimenting yesterday that this is not really so.
For example, my multimeter on the diode setting measures a 1.58 V drop for a generic red LED I have. But when I put it in a circuit and power it up, at only 10 mA, I measure 1.98 V across it. This is a huge discrepancy. Is there some equation I can use to predict the voltage drop of an LED when driven at different currents?
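For what it's worth, here is a sketch of the Shockley diode equation, which models how forward voltage grows roughly logarithmically with current. The saturation current `i_s` and ideality factor `n` below are placeholder guesses, not values for any real LED; they would need to be fitted to measured data points (and real LEDs also have series resistance that adds to the drop at high current):

```python
import math

def led_voltage(i_amps, i_s=1e-18, n=2.0, t_kelvin=300.0):
    """Estimate LED forward voltage from the Shockley diode equation.

    i_s (saturation current) and n (ideality factor) are placeholder
    values -- real LEDs vary widely and also have series resistance,
    so fit them to your own measurements rather than trusting defaults.
    """
    k = 1.380649e-23      # Boltzmann constant, J/K
    q = 1.602176634e-19   # elementary charge, C
    v_t = k * t_kelvin / q  # thermal voltage, ~25.9 mV at 300 K
    return n * v_t * math.log(i_amps / i_s + 1)

# With these made-up parameters, voltage creeps up slowly with current:
print(led_voltage(0.001))  # 1 mA   -> ~1.79 V
print(led_voltage(0.010))  # 10 mA  -> ~1.90 V
print(led_voltage(0.160))  # 160 mA -> ~2.05 V
```

Note the logarithmic shape: a 160x increase in current only raises the modeled drop by about 0.15 V, which matches the small difference between the multimeter's low-current test and the 10 mA circuit measurement.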
For the multiplexed LED display project I am working on, I need to figure out what kind of resistors I will need for pulsing these LEDs. I am aiming for 160 mA pulses. If I get the supply voltage at about the right level, could I do without resistors entirely? It seems like at high enough current, there will be enough drop across the LED to just even things out on its own, if that makes any sense.
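For concreteness, the usual resistor sizing is just Ohm's law applied to the voltage left over after the LED's drop. The 5 V supply and ~2.1 V high-current drop below are assumed example numbers, not from any datasheet:

```python
def series_resistor(v_supply, v_led, i_target):
    """Size a current-limiting resistor: R = (Vsupply - Vled) / I."""
    return (v_supply - v_led) / i_target

# Assumed example: 5 V supply, ~2.1 V LED drop at high current,
# 160 mA pulse target -- measure your own LED rather than using these.
r = series_resistor(5.0, 2.1, 0.160)
print(r)  # 18.125 ohms, so a standard 18-ohm part would be close
```

Because the LED's I-V curve is so steep, a small error in supply voltage (or part-to-part variation, or heating) swings the current wildly when there is no resistor, which is why I'm unsure skipping them is safe.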