Hello there,
In any meter design it is also customary to use two back-to-back diodes across the input to protect against overvoltages of either polarity.
Since the voltage divider resistors are there already in most cases, the current into the diodes will already be limited. If not, a series resistor can always be added to limit the diode current.
A 1k resistor at 1000 volts will draw 1 amp, so that's probably not good enough. A 10k resistor will draw 0.1 amps at 1000 volts, so that's probably better even if you don't intend to measure 1000 volts, but that's also 100 watts in the resistor; then again, the voltage divider would have a higher resistance anyway. 100k at 1000 volts would draw 0.01 amps at a power of 10 watts, so that may be doable. At 500 volts it's only 2.5 watts, and at 250 volts it's 0.625 watts.
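If you want to play with other resistor values, here is a quick Python sketch (the resistor and voltage values are just the illustrative ones from above) that reproduces those figures straight from Ohm's law, assuming the worst case where the full input voltage appears across the series resistor:

```python
# Worst-case current and dissipation in a series protection resistor:
# I = V / R, P = V^2 / R (full input voltage across the resistor).
def series_resistor_stats(volts, ohms):
    current = volts / ohms       # amps flowing toward the clamp diodes
    power = volts ** 2 / ohms    # watts dissipated in the resistor
    return current, power

for r in (1e3, 10e3, 100e3):     # 1k, 10k, 100k candidates
    for v in (250, 500, 1000):
        i, p = series_resistor_stats(v, r)
        print(f"{r / 1e3:>5.0f}k at {v:>4} V -> {i * 1000:8.2f} mA, {p:8.3f} W")
```

Running it confirms, for example, that 100k at 1000 volts gives 10 mA and 10 watts, dropping to 2.5 watts at 500 volts.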
I'll leave all these details to you though.
In current meters it is nice to use as low a shunt resistance as possible, but there is also an argument for going a little higher on the low current ranges, because even larger resistor values don't drop much voltage at very low currents. A 1 ohm resistor only drops 1mv measuring a 1ma current, and even a 10 ohm resistor only drops 10mv. I'll leave the further details to you again.