zachtheterrible
Active Member
:lol: This is driving me nuts; I've never been able to figure this out.
Say I've got a 10 V source, and an LED with a forward voltage of 2 V (so only 2 V should be dropped across it). How do I use Ohm's law to calculate the value of the resistor I need so that there's 2 V at the LED? Take a look at my picture.
I can figure out voltages in a series of resistances, but this . . . :evil:
Thanks
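The usual approach: the resistor has to drop the remaining 10 V − 2 V = 8 V, and Ohm's law (R = V/I) gives the resistance once you pick a target LED current. A minimal sketch of the arithmetic, where the 20 mA current is an assumption (typical for indicator LEDs, but check the datasheet):

```python
def led_resistor(v_supply, v_forward, i_led):
    """Series resistor value: it must drop (v_supply - v_forward)
    volts while passing i_led amps, so R = V / I by Ohm's law."""
    return (v_supply - v_forward) / i_led

# 10 V supply, 2 V LED forward voltage, assumed 20 mA target current
r = led_resistor(10.0, 2.0, 0.020)
print(r)  # 400.0 ohms
```

In practice you'd round up to the next standard resistor value (e.g. 470 Ω), which just runs the LED slightly dimmer than the 20 mA target.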