I am not sure you are saying what you actually mean. There is a common misunderstanding about power supplies that newbies run into which is not an actual problem, but other parts of your post seem to describe a different, real problem. It is not clear to me which one you are talking about.
Are you talking about a current source or a voltage source? Do you actually need a fixed output current (not the same thing as current limiting) to produce a signal? Or are you just trying to power the PLC, which happens to draw between 4 and 20 mA?
You can fix either the current or the voltage, and the other will vary to suit the load. You cannot arbitrarily choose both to be any value for any load.
A voltage source forces a voltage across the circuit, and the circuit draws however much current it needs. The current rating listed for a voltage source is its CAPACITY. As long as the current drawn is less than what the source is rated for, you will be fine; the source will not try to force that amount of current into the circuit. This is what you use to power almost anything. It does not matter if you use a 1000 A voltage source to power a device that draws 1 mA, because the 1000 A is a capacity, not a current that will be forced into the circuit.
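To make the "capacity, not forced current" point concrete, here is a minimal sketch with made-up example numbers (the 24 V supply, 24 kΩ load, and 1000 A rating are purely illustrative):

```python
def load_current(v_supply, r_load):
    """Current an ideal voltage source's load draws, per Ohm's law: I = V / R."""
    return v_supply / r_load

def within_capacity(v_supply, r_load, i_rated):
    """True if the load draws no more than the supply's rated current."""
    return load_current(v_supply, r_load) <= i_rated

# A 24 V supply rated for 1000 A driving a 24 kOhm load: the load draws
# only 1 mA, and the enormous rating is simply unused headroom.
i = load_current(24.0, 24_000.0)            # 0.001 A, i.e. 1 mA
ok = within_capacity(24.0, 24_000.0, 1000.0)  # True
```

The load, not the supply rating, decides the current.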
A current source WILL force a set amount of current through the circuit, and its voltage will rise to whatever it needs to be to push that current through. Of course, the voltage can only get so high, and that limits how high a resistance you can connect to the current source and still get the correct current output. This maximum voltage capacity (the compliance voltage) is analogous to the maximum current capacity: as long as the load requires less than the rated value, the output will be correct. Current sources have much more specialized uses.
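The compliance limit can be sketched the same way. The numbers below are illustrative assumptions (a 20 mA setpoint and 24 V compliance are typical of 4-20 mA loops, but check your source's datasheet):

```python
def max_load_resistance(i_set, v_compliance):
    """Largest load resistance a current source can drive while still
    forcing i_set: from V = I * R, R_max = V_compliance / I_set."""
    return v_compliance / i_set

def current_is_correct(i_set, r_load, v_compliance):
    """True if the source can actually push i_set through r_load."""
    return r_load <= max_load_resistance(i_set, v_compliance)

# A 20 mA source with 24 V of compliance: loads above 1200 Ohm cannot
# receive the full 20 mA because the source runs out of voltage.
r_max = max_load_resistance(0.020, 24.0)  # 1200.0 Ohm
```

So a larger load resistance, not a smaller one, is what defeats a current source.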