If you're building microcontroller projects that require a maximum of 500 mA or so (for the entire circuit), then there is no problem using a 1 A power supply or a 500 A power supply.
If the circuit is designed correctly it will work exactly the same.
Now if you're developing breadboard circuits, it's advisable to have a current-limited supply, i.e. one where you can set a trip point at, say, 0.5 A. If you make a mistake or a component goes short circuit, then it will get warm/hot and maybe release a small amount of "magic smoke". If you whack a full 500 A through a short, then things tend to go bang/pop/kaboom/catch fire, etc.
I regularly use a 10A 60V adjustable power supply with my prototyping but have the current regulation set to 0.1A. If I make a mistake or something goes wrong, the power supply just cuts out.
It's more fun when designing stuff for cars that takes 30-40 amps at 13.8 V. Mistakes become much costlier...
Don't confuse what current your circuit will draw with what your power supply will provide. Current isn't like voltage in this respect.
For example, if you had a bulb rated at 0.5 A / 12 V, hook it up to a 12 V power supply rated at 100 amps and it will work fine. It will still draw only 0.5 A.
However, hook it up to a 24 V power supply and it won't somehow take only 12 V; the full 24 V will appear across it and it will burn out.
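As a rough sketch of why that happens (treating the bulb as a fixed resistor, which a real filament isn't quite, since its resistance rises with temperature), Ohm's law shows the oversized supply is harmless while the overvoltage isn't:

```python
# Rough Ohm's-law sketch: model the 0.5 A / 12 V bulb as a fixed resistor.
# (A real filament's resistance rises with temperature, so these numbers
# are only illustrative.)

rated_current = 0.5       # A
rated_voltage = 12.0      # V
resistance = rated_voltage / rated_current   # 24 ohms

for supply_voltage in (12.0, 24.0):
    current = supply_voltage / resistance    # the bulb sets the current...
    power = supply_voltage * current         # ...not the supply's amp rating
    print(f"{supply_voltage:4.0f} V supply -> {current:.2f} A, {power:.0f} W")

# At 12 V it draws its rated 0.5 A (6 W) no matter how many amps the
# supply can deliver. At 24 V it draws 1 A and dissipates 24 W, four
# times its rating, which is why it burns out.
```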