Why wouldn't I use a 270 ohm resistor in this case to keep the forward current at 20mA?
You don't know the actual forward voltage of your LEDs unless you measure and label all of them. The forward voltage is somewhere from 2.0V to 2.4V but most will be 2.0V.
If you calculate a current-limiting resistor for LEDs that have a forward voltage of 2.4V, then the current will be higher when they are actually 2.0V. Do the simple calculation, as in the sketch below, to see whether they will burn out.
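Here is that calculation in Python (a minimal sketch; the 12.6V supply, 3 LEDs per string, and the 270 ohm resistor sized for 2.4V LEDs are taken from this thread, not measured values):

    # Current through one string of 3 LEDs and a 270 ohm resistor,
    # fed from a 12.6V supply.
    V_SUPPLY = 12.6   # volts
    N_LEDS = 3        # LEDs in series per string
    R = 270.0         # ohms, sized for Vf = 2.4V at 20mA

    for vf in (2.4, 2.0):
        current_ma = (V_SUPPLY - N_LEDS * vf) / R * 1000
        print(f"Vf = {vf}V  ->  {current_ma:.1f} mA")

    # Vf = 2.4V  ->  20.0 mA  (the design value)
    # Vf = 2.0V  ->  24.4 mA  (more than planned; check it against the LED's maximum rating)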
How is it handled then when 36 LEDs at the 2.4V max forward voltage rating add up to 86.4V? 12.6V - 86.4V = -73.8V?
Of course not.
If you have more than 5 LEDs in series then they will not light from a 12.6V supply (six 2.4V LEDs already need 14.4V).
I showed that 3 LEDs in series keep the current reasonable whether the LED voltage is 2.4V or 2.0V.
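To make that concrete, here is a sketch that sizes the resistor for the 2.0V worst case and then checks the 2.4V case (the 330 ohm figure is my calculation from the thread's numbers, not a value given in the thread):

    # Size the resistor for the 2.0V worst case (lowest Vf gives the highest current),
    # then see what happens if the LEDs turn out to be 2.4V instead.
    V_SUPPLY = 12.6
    N_LEDS = 3
    I_TARGET = 0.020   # 20mA

    r = (V_SUPPLY - N_LEDS * 2.0) / I_TARGET   # 330 ohms
    print(f"Resistor sized for 2.0V LEDs: {r:.0f} ohms")

    for vf in (2.0, 2.4):
        i_ma = (V_SUPPLY - N_LEDS * vf) / r * 1000
        print(f"Vf = {vf}V  ->  {i_ma:.1f} mA")

    # Vf = 2.0V  ->  20.0 mA
    # Vf = 2.4V  ->  16.4 mA  (slightly dimmer, but nothing burns out)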
I can set up a maximum of 5 parallel strings with 7 in series (I'll just use 35 LEDs in this example); then I would have 0.6V of residual voltage to drop, requiring a 30 ohm resistor on each parallel string?
The same problem as one hundred replies ago:
1) Calculating with 2.4V LEDs.
7 in series total 16.8V. 20mA in 30 ohms is 0.6V, so the power supply is 16.8V + 0.6V = 17.4V.
But if all 7 LEDs are actually 2.0V then their current will be 113mA and they will instantly burn out.
2) Calculating with 2.0V LEDs.
7 in series total 14V. A 30 ohm current-limiting resistor drops 0.6V at 20mA, so the power supply is 14.6V.
But if all 7 LEDs are actually 2.4V then they need 16.8V and will not light. Both cases are checked in the sketch below.
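A minimal Python check using the same constant-forward-voltage arithmetic as above (nothing here is measured):

    # Check the 7-in-series, 30 ohm design against both possible forward voltages.
    N_LEDS = 7
    R = 30.0   # ohms

    def string_current_ma(v_supply, vf):
        # Current through the string; 0 if the supply cannot overcome the total LED drop.
        headroom = v_supply - N_LEDS * vf
        return max(headroom, 0.0) / R * 1000

    # Case 1: supply sized for 2.4V LEDs (17.4V), but the LEDs are really 2.0V.
    print(f"{string_current_ma(17.4, 2.0):.0f} mA")   # ~113 mA: the LEDs burn out

    # Case 2: supply sized for 2.0V LEDs (14.6V), but the LEDs are really 2.4V.
    print(f"{string_current_ma(14.6, 2.4):.0f} mA")   # 0 mA: the string does not light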
This arrangement would put 7 LEDs in series. If one LED went out, would the remaining 6 go out?
When LEDs fail, they usually go open; they do not short.
So when one LED in a series string burns out, all the LEDs in that string stop lighting, because the open LED breaks the only current path.
If you buy quality LEDs from a name-brand manufacturer then they will be reliable.
But if you buy cheap Chinese LEDs on eBay then some will probably not work to begin with and the rest will probably fail soon.