I have a pair of 5V power supplies that are each capable of providing 6.3A. I am wiring them in series so I will have a 10V, 6.3A power supply. I want to connect a few circuits to it, but I do not want more than 500mA going through each circuit. So would I put a 20 ohm, 5 watt resistor on the return side of each circuit that I only want 10V and 500mA going through?
What circuits are you powering?
Generally, you don't need to limit the current: a circuit will only draw as much current as it needs, because its resistance determines the current drawn at a given voltage; see Ohm's law.
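As a quick sanity check, here is Ohm's law worked through with hypothetical numbers matching your setup (the 20 ohm load resistance is an assumption for illustration):

```python
# Ohm's law: I = V / R. Hypothetical values for illustration only.
V = 10.0   # supply voltage, volts
R = 20.0   # load resistance, ohms
I = V / R  # current the load draws, amps
print(I)   # 0.5 A -- the load draws this on its own, no limiting needed
```

A 20 ohm load on 10V draws exactly 500mA by itself; the supply being able to deliver 6.3A doesn't mean it will push 6.3A into the circuit.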
The only time current limiting is required is to protect the power supply and wiring against a short circuit. If it's a ready-made PSU, the chances are it has over-current protection built in; if it's a battery, then a fuse can be used to limit the current.
If the resistance stays the same and the voltage drops, the current will drop as well, right?
Adding a resistor will drop a voltage proportional to the current through it; see Ohm's law again.
If the circuit needs 500mA at 10V, its resistance is 10/0.5 = 20 ohms, so if you put a 20 ohm resistor in series with it, the voltage across the circuit will drop to 5V and you might as well have used a 5V power supply in the first place; look up "potential divider".
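The potential-divider arithmetic above can be sketched like this (a minimal sketch, assuming a purely resistive 20 ohm load):

```python
# Potential (voltage) divider: a series resistor R_series feeding a
# resistive load R_load on supply V_supply.
V_supply = 10.0
R_series = 20.0  # the proposed series resistor
R_load = 20.0    # resistance of a load rated 10 V / 500 mA

# Same current flows through both resistors; voltage splits by resistance.
I = V_supply / (R_series + R_load)          # total current, amps
V_load = V_supply * R_load / (R_series + R_load)  # voltage across the load
print(V_load, I)  # 5.0 V across the load, 0.25 A
```

Note the current also falls to 250mA, since the total resistance seen by the supply has doubled; the load gets neither the 10V nor the 500mA it was designed for.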
This is theoretical, because some loads are active, which means their resistance changes; e.g. the resistance presented by a radio varies with the volume and the sound, in which case adding a series resistor can cause it to malfunction.