You are right that you will get better efficiency and longer LED life by running the LEDs at less power, but the eye struggles to notice even a 50% change in light output. When I've been involved in increasing light outputs, we looked for a four-times increase before it was considered worthwhile when there was no side-by-side comparison.
On the graphs shown, 3 A gives a forward voltage drop of around 3.3 V, while 1.5 A gives maybe 3.05 V. That is an efficiency increase of about 8.2% at 1.5 A compared with 3 A.
The light output at 3 A is 325% of the output at 700 mA, while at 1.5 A it is 185% of the 700 mA output. Two LEDs at 1.5 A therefore give 2 × 185% = 370% of the 700 mA output, versus 325% for one LED at 3 A, which is an efficiency increase of about 13.8%.
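The two figures above can be checked with a few lines of arithmetic. This is only a sketch using the approximate numbers read off the graphs (3.3 V and 3.05 V forward voltage, 325% and 185% relative flux), not exact datasheet values:

```python
# Approximate numbers read from the graphs discussed above
vf_3a, vf_1p5a = 3.3, 3.05            # forward voltage at 3 A and at 1.5 A
flux_3a, flux_1p5a = 3.25, 1.85       # light output relative to 700 mA

# Electrical efficiency gain: same current delivers light at a lower voltage
v_gain = vf_3a / vf_1p5a - 1          # about 0.082, i.e. 8.2%

# Optical efficiency gain: two LEDs at 1.5 A vs one LED at 3 A
# (same total current, so compare 2 x flux(1.5 A) with flux(3 A))
flux_gain = 2 * flux_1p5a / flux_3a - 1   # about 0.138, i.e. 13.8%

print(f"voltage gain {v_gain:.1%}, flux gain {flux_gain:.1%}")
```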
The temperature rise will depend on your heatsink. For a fair comparison at the same light output, the heatsink is the same whether you have one LED or two. The thermal resistance of the LEDs is 2.5 °C/W, and about 10 W is dissipated, so changing to two LEDs reduces the power per LED by 5 W and drops the junction temperature by around 12.5 °C. That will increase the light output a little, maybe 5%.
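For the thermal step, the junction-temperature drop follows directly from the thermal resistance. A quick sketch, assuming the 2.5 °C/W and 10 W figures above:

```python
# Assumed figures from the discussion: per-LED thermal resistance and
# total dissipation in the single-LED case
r_th = 2.5                 # degC per watt, junction to heatsink, per LED
p_single = 10.0            # W dissipated in one LED at 3 A
p_per_led = p_single / 2   # W per LED when split across two LEDs

# Temperature drop across each LED's own thermal resistance
temp_drop = (p_single - p_per_led) * r_th   # 5 W x 2.5 degC/W = 12.5 degC
print(f"junction runs about {temp_drop} degC cooler")
```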
The total efficiency gain from reduced voltage, reduced temperature and reduced current comes to roughly 27%, but that is a rough estimate, and it isn't enough to be directly visible unless you had the two situations side by side.
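Strictly, the three gains multiply rather than add, though with numbers this small the simple sum is close enough for a rough estimate. A sketch, assuming the 8.2%, 13.8% and 5% figures discussed above:

```python
# The three individual gains discussed above (assumed rough figures)
v_gain = 0.082       # lower forward voltage at 1.5 A
flux_gain = 0.138    # better lumens per amp at 1.5 A
thermal_gain = 0.05  # cooler junction, roughly 5% more light

# Compounded total vs simple sum
total = (1 + v_gain) * (1 + flux_gain) * (1 + thermal_gain) - 1
approx = v_gain + flux_gain + thermal_gain
print(f"compounded {total:.1%}, simple sum {approx:.1%}")
# Both land in the high-twenties-percent range: a real saving,
# but well below a visible brightness difference
```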
The reduced power and temperature will increase LED life by a lot.
You don't necessarily have to put the LEDs in series. Series is best, but these LEDs change voltage quite a lot as the current changes, so as long as you keep the wiring resistance similar for the two LEDs, they will generally share current well enough in parallel.
The downside is, of course, the doubling of the LED cost; they are about £1.50 each in quantity. The optics also have to handle light from two sources rather than one, which may add complication or reduce the peak lux.