LEDs need the current to be controlled, not the voltage. For instance, take an LED rated at 50 mA and 2 V: supply it with 30 mA (60 % of rating) and it will be somewhat dimmer; supply it with 70 mA (140 % of rating) and it will be somewhat brighter, and will probably burn out in a minute or two.
Voltage control, on the other hand, is far too sensitive: supply it with 1.2 V (60 % of rating) and it will not light at all; supply it with 2.8 V (140 % of rating) and it will explode.
So you need to supply a certain current, and if that is not perfectly accurate, it doesn't matter too much.
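To see why the voltage figure is so touchy, here is a minimal sketch of an LED's exponential current-voltage curve using the Shockley diode equation; the saturation current and ideality factor below are illustrative assumptions chosen to roughly match a 2 V / 50 mA part, not data for any real LED:

```python
import math

# Shockley diode equation: I = I_s * (exp(V / (n * V_t)) - 1)
I_S = 2e-19   # saturation current in amps (assumed, illustrative)
N = 2.0       # ideality factor (assumed, illustrative)
V_T = 0.025   # thermal voltage in volts at room temperature

def led_current(v):
    """Current through the LED at forward voltage v, in amps."""
    return I_S * (math.exp(v / (N * V_T)) - 1)

for v in (1.9, 2.0, 2.1):
    print(f"{v:.1f} V -> {led_current(v) * 1000:.1f} mA")
```

With these assumed values, a mere 0.1 V change swings the current from about 6 mA to about 350 mA, which is why you regulate the current and let the voltage fall where it may.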
The simplest method is to use resistors in series. There are various calculators on the web that tell you what resistors to use, but do make sure that your design doesn't vary too much between 12 V and 14 V, as that is the sort of change you get when the alternator starts working. For instance, if you have 5 LEDs in series, each rated at 2 V, then the voltage across the resistor will vary between 12 − 5 × 2 = 2 V and 14 − 5 × 2 = 4 V, so the current, and therefore the brightness, will double when you start the engine. The solution is to have fewer LEDs in series.
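Here is a minimal sketch of that arithmetic, assuming the 2 V / 50 mA LEDs from the example and sizing each resistor for 50 mA at 12 V; comparing a 5-LED string with a 3-LED string shows why fewer LEDs in series helps:

```python
# Series-resistor sizing for an LED string on a car supply.
V_LED = 2.0       # forward voltage per LED, volts (assumed part)
I_TARGET = 0.050  # target current, amps

def string_current(v_supply, n_leds, resistor):
    """Current through a string of n_leds LEDs with a series resistor."""
    return (v_supply - n_leds * V_LED) / resistor

for n_leds in (5, 3):
    # Size the resistor to hit the target current at 12 V.
    r = (12.0 - n_leds * V_LED) / I_TARGET
    i_12 = string_current(12.0, n_leds, r)
    i_14 = string_current(14.0, n_leds, r)
    print(f"{n_leds} LEDs, R = {r:.0f} ohm: "
          f"{i_12 * 1000:.0f} mA at 12 V, {i_14 * 1000:.0f} mA at 14 V "
          f"(+{(i_14 / i_12 - 1) * 100:.0f} %)")
```

The 5-LED string doubles its current (+100 %) between 12 V and 14 V, while the 3-LED string only rises by a third, at the cost of burning more power in the resistor.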
You can also use a current control circuit. That has the advantage that the current stays more constant when the supply voltage varies, and you can put more LEDs in series, which means fewer parallel strings and so less current consumption.
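One common way to build such a circuit (assumed here for illustration; no particular part is specified above) is an LM317 regulator wired as a constant-current source: the regulator holds 1.25 V across a single set resistor, so that resistor alone fixes the current regardless of supply voltage, within the regulator's dropout limits:

```python
# LM317 as a constant-current source: I = V_REF / R_set.
V_REF = 1.25      # LM317 reference voltage, volts
I_TARGET = 0.050  # desired LED current, amps

r_set = V_REF / I_TARGET
power = V_REF * I_TARGET
print(f"Set resistor: {r_set:.0f} ohm, dissipating {power * 1000:.1f} mW")
```

For 50 mA that works out to a 25 ohm resistor dissipating about 63 mW, and the current then barely moves between 12 V and 14 V.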
There are also various LED driver circuits that use inductors to boost the voltage. That would let you run all 14 LEDs in series, it would be efficient, and the current would be accurately controlled. It is, however, the most complicated and expensive solution.
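As a rough feasibility sketch, assuming the same 2 V parts: 14 LEDs in series need about 28 V, and for an ideal continuous-mode boost converter the duty cycle follows D = 1 − Vin/Vout. A real driver chip regulates the LED current directly, so this is only a back-of-envelope check:

```python
# Ideal boost-converter duty cycle to run 14 LEDs in series.
V_LED = 2.0
N_LEDS = 14
v_out = N_LEDS * V_LED  # about 28 V for the whole string

for v_in in (12.0, 14.0):
    duty = 1 - v_in / v_out  # ideal continuous-mode boost relation
    print(f"Vin = {v_in:.0f} V -> duty cycle {duty * 100:.0f} %")
```

The required duty cycle only shifts from about 57 % to 50 % across the 12-14 V range, which the driver's current feedback loop absorbs automatically.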
BTW, I found that my car turns off the alternator for a few seconds if the accelerator is floored rapidly, presumably to improve acceleration by removing load. That drops the voltage by over a volt, and that is while driving. I certainly wouldn't want the lights varying noticeably as a result of that.