Unlikely. If you use a DC-DC converter to output a constant voltage, then as the input voltage falls, the input current will increase, causing the input voltage to fall even faster. The DC-DC converter will also heat up in the process, wasting more energy.
I think I don't agree with WTP Pepper, because he speaks in general about DC-DC converters, while I'm thinking about the specific one I've posted.
Pepper's statement, though general, DOES apply, because it is true of ALL power conversions,
including the part you are looking at.
Converters do not increase POWER; they only convert between different voltage and current ratios, and the output power can never be greater than the input power. Input power will always be the output power plus the losses of the conversion.
So, if the output voltage and current load of your converter stay the same, then the input power required will stay the same. If the input voltage from your supercapacitor is falling, then the input current will need to increase to maintain the same input power.
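To put rough numbers on that relationship, here is a quick back-of-the-envelope sketch in Python. It assumes an ideal, lossless converter for the moment, and the 0.5 W output is just an arbitrary example figure, not a value from your datasheet:

```python
# Constant output power from an ideal (lossless) converter:
# P_in = P_out, so the input current is I_in = P_out / V_in.
p_out = 0.5                          # watts, arbitrary example load

for v_in in (2.5, 2.0, 1.5, 1.0):    # falling supercap voltage
    i_in = p_out / v_in              # input current needed at this cap voltage
    print(f"Vin = {v_in:.1f} V -> Iin = {i_in:.2f} A")
```

As the cap voltage halves, the input current roughly doubles, which is why the cap runs down faster and faster toward the end.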
Next you need to look at the conversion efficiency and how it changes with different operating conditions. For that, let's refer to the chart on the bottom of page 2 of the datasheet you posted. I don't know the output voltage and current you need, but I'm going to pick 100 mA at 3.3 V for the sake of this discussion.
While the voltage at the cap is 2.5 V, the efficiency is about 92%, which is pretty good. But as the cap voltage falls to 1.2 V, the efficiency is down to about 78%. The next line on the chart (Vin = 0.8) doesn't go out to 100 mA, so your system would probably shut down when the cap voltage is somewhere not far below 1.2 V. And, as the efficiency drops, the converter's operating temperature will increase, since that is where the lost power is dissipated as heat.
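Putting those numbers together, here is a rough sketch of the input current and converter loss for the 3.3 V / 100 mA example load. The two efficiency figures are just the points read off the datasheet chart above; real values will vary with load and temperature:

```python
# Input current and converter loss for a 3.3 V / 100 mA load,
# using the two efficiency points read off the datasheet chart.
p_out = 3.3 * 0.100                          # 0.33 W delivered to the load

for v_in, eff in ((2.5, 0.92), (1.2, 0.78)):
    p_in = p_out / eff                       # input power = output power / efficiency
    i_in = p_in / v_in                       # current drawn from the supercap
    p_loss = p_in - p_out                    # dissipated in the converter as heat
    print(f"Vin={v_in} V, eff={eff:.0%}: Iin={i_in*1000:.0f} mA, "
          f"loss={p_loss*1000:.0f} mW")
```

So between a full cap at 2.5 V and a nearly drained one at 1.2 V, the draw from the cap goes from roughly 140 mA to roughly 350 mA, and the heat dissipated in the converter roughly triples.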
Nigel, yes, we do need Shipstones to power our world. Too bad Robert was a great writer instead of an engineer. May he rest in peace.