First of all, you need to know the intended rail voltage and current requirements. For instance, a 100 watt per channel stereo amplifier needs roughly +/-50 V @ 12 A. For that you would want a transformer with a center-tapped secondary of about 71 Vac @ 12 A. You would need to add a bridge rectifier rated at about 200 V and 25 to 40 A (when the power supply is first switched on, the bridge rectifier has to charge up the filter capacitors, and that inrush current can be a lot more than what the amplifier itself uses), plus a pair of electrolytic capacitors with a minimum value of 4700 uF @ 85 V. Increasing the capacitance will decrease your ripple, but it also increases the inrush current when the amplifier is turned on.
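If it helps to see where numbers like those come from, here's a rough back-of-the-envelope sketch in Python. The headroom figure, diode drop, and mains frequency in it are my own assumptions for illustration, not values from the example above, and the ripple formula is only a first approximation for a full-wave rectified, capacitor-filtered supply.

```python
import math

# Rough sizing sketch for an unregulated split-rail supply.
# Headroom, diode drop, and mains frequency are assumed values.

def rail_voltage(p_watts, load_ohms, headroom_v=5.0):
    """DC rail needed (per rail): peak output swing for p_watts into
    load_ohms, plus some headroom for output-stage losses."""
    return math.sqrt(2.0 * p_watts * load_ohms) + headroom_v

def secondary_vac_per_half(v_rail_dc, diode_drop=1.0):
    """RMS voltage of each half of the center-tapped secondary that
    charges the caps to roughly v_rail_dc (regulation ignored)."""
    return (v_rail_dc + diode_drop) / math.sqrt(2.0)

def ripple_vpp(i_load_amps, cap_farads, mains_hz=60.0):
    """Approximate peak-to-peak ripple: with full-wave rectification
    the caps get topped up twice per mains cycle."""
    return i_load_amps / (2.0 * mains_hz * cap_farads)

if __name__ == "__main__":
    v_rail = rail_voltage(100.0, 8.0)       # ~45 V, so +/-45..50 V rails
    v_half = secondary_vac_per_half(50.0)   # ~36 Vac per half, ~72 VCT total
    ripple = ripple_vpp(12.0, 4700e-6)      # worst case at a continuous 12 A
    print(f"rails: +/-{v_rail:.0f} V, secondary: "
          f"{v_half:.0f}-0-{v_half:.0f} Vac, ripple: {ripple:.1f} Vpp")
```

The worst-case ripple at a continuous 12 A looks large, but music rarely demands full power continuously; that's also why adding more capacitance is a trade-off against the inrush current mentioned above.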
This is just one example; it all depends on what the power supply is for. For a small audio device using op amps, you might only need +/-12 V @ 1 A. For a big amplifier, you might need to go in the direction I used in the example above.
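For scale, the same helpers can be pointed at the small op-amp case (this reuses the functions from the sketch above; the 2200 uF value is just an illustration, and for op amps the rails are chosen directly rather than derived from output power, so only the transformer and ripple estimates apply):

```python
# Small +/-12 V, 1 A supply, using the helpers defined in the sketch above.
v_half = secondary_vac_per_half(12.0)   # ~9 Vac per half, about an 18 VCT secondary
ripple = ripple_vpp(1.0, 2200e-6)       # ~3.8 Vpp at a full 1 A draw
print(f"secondary: {v_half:.1f}-0-{v_half:.1f} Vac, ripple: {ripple:.1f} Vpp")
```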