Hello again,
Actually, you must have the switcher output go high enough to supply the voltage the linear regulator needs to meet the maximum output spec of the entire power supply. So if your power supply has to put out 20 volts, then your switcher has to be able to put out (roughly) 23 volts, though if you prefer you can have the switcher go all the way up to 30 volts. This has to work at low input line too, which might be 15 percent below the nominal line input voltage, so the switcher always needs the capability of delivering more output than you normally need.
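Just to make the sizing concrete, here's a rough back-of-the-envelope check in Python. The 3 volt dropout figure and the 15 percent line sag are the numbers from above; treat them as assumptions for your own parts.

```python
# Rough headroom check for a switcher feeding a linear post-regulator.
# Numbers are illustrative, matching the example in the text.

V_OUT_MAX = 20.0   # maximum output of the whole supply (V)
DROPOUT   = 3.0    # headroom the linear regulator needs (V), assumed
LINE_SAG  = 0.15   # low line: 15 percent below nominal

# What the switcher must deliver even at low line:
v_switcher_needed = V_OUT_MAX + DROPOUT

# If the switcher output sags with the line, size it so that even at
# low line it still reaches v_switcher_needed:
v_switcher_nominal = v_switcher_needed / (1.0 - LINE_SAG)

print(f"switcher must deliver {v_switcher_needed:.1f} V at low line")
print(f"size for about {v_switcher_nominal:.1f} V at nominal line")
```

Note that the 27-ish volt nominal figure this spits out is already close to the 30 volt ceiling, which is exactly why the high-line case becomes the problem.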
The catch, of course, is that at high line you have a lot of extra voltage to deal with. The whole ball game then becomes one of control: the circuit has to be able to control the output no matter what. This means a slow start circuit is required (which you already know), and its time constant has to be long enough to allow the rest of the circuit to catch up. It also means some sort of secondary control might be a good idea, to catch the output if it starts to go dangerously high... like above 25 volts or so. This could come in the form of another error amplifier, or some other trick that forces a faster response and cuts back the pulse width quickly.
I've seen higher power converters take out very expensive computer system power supplies because they didn't have this kind of safety feature, although whether a given design needs it really depends on the design. One way to tell is to try it in simulation: increase the input line voltage quickly, or remove the load quickly, and see what the switcher does. If the switcher EVER forces more than 30 volts across the linear regulator, there's going to be a problem, and you'll have to correct it by design sooner or later.
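That simulation check is normally done in SPICE, but the idea can be shown with a crude first-order model: shove the switcher output up suddenly (line step or load dump), let a slow control loop pull it back toward the setpoint, and record the worst-case voltage across the linear regulator's pass element. All the values here (setpoint, loop time constant, step sizes) are made-up assumptions for illustration.

```python
# Quick-and-dirty transient check: step the switcher output and watch
# the voltage dropped across the linear regulator. First-order model
# of the switcher's control loop; all values are illustrative.

V_OUT = 20.0   # regulated final output (V)
V_SET = 23.0   # switcher output setpoint (V)
TAU   = 5e-3   # switcher loop time constant (s), assumed slow
DT    = 1e-4   # simulation time step (s)

def worst_drop_after_line_step(v_jump, t_end=50e-3):
    """Peak voltage across the pass element after the switcher output
    is suddenly shoved up by v_jump volts."""
    v_sw = V_SET + v_jump   # instant after the disturbance
    worst = 0.0
    t = 0.0
    while t < t_end:
        worst = max(worst, v_sw - V_OUT)
        # the slow loop pulls the switcher back toward its setpoint
        v_sw += (V_SET - v_sw) * (DT / TAU)
        t += DT
    return worst

for jump in (5.0, 30.0):
    drop = worst_drop_after_line_step(jump)
    verdict = "OK" if drop < 30.0 else "TROUBLE"
    print(f"{jump:4.1f} V step -> peak drop {drop:5.2f} V  {verdict}")
```

The point of even a crude model like this is that the worst case happens right at the disturbance, before the slow loop can react, which is exactly why the secondary fast-response control earns its keep.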