You haven't stated the range of distances you're attempting to measure. The speed-of-light figures below (taken from a Wikipedia table) will help put things in perspective.
Light travels 1 meter in ~3.3 nanoseconds (3.3 x 10^-9 seconds). So to measure 1 meter, you must transmit a pulse, have the receiving end respond by transmitting a return pulse after some fixed, known delay, sense the returned pulse, and measure Δt with nanosecond resolution.
A microprocessor running at 20 MHz has a clock cycle of 5 x 10^-8 seconds....
Looks like a problem.
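A quick back-of-envelope sketch makes the mismatch concrete (illustrative Python only, not MCU code; the 20 MHz clock is the example figure above):

```python
# Back-of-envelope check: light round-trip time vs. a 20 MHz MCU clock.
C = 299_792_458          # speed of light, m/s (vacuum value; air is close enough)
CLOCK_HZ = 20e6          # example microprocessor clock from above
cycle_s = 1 / CLOCK_HZ   # one clock cycle: 5e-8 s = 50 ns

for d in (1, 10, 100):   # one-way distances in metres
    round_trip = 2 * d / C
    print(f"{d:>3} m: round trip {round_trip * 1e9:6.1f} ns "
          f"= {round_trip / cycle_s:5.2f} clock cycles")
```

At 1 m the entire echo comes and goes in about a seventh of a single clock cycle, which is why a bare 20 MHz processor can't time it directly.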
The speed of sound in air is 343 meters/second, roughly a million times slower than the speed of light. A bit more practical to measure, I should think.
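The same sketch for sound shows why ultrasonic ranging is the usual hobbyist approach (again illustrative Python, assuming 343 m/s, i.e. air at roughly 20 °C):

```python
# Same check for sound: echo-ranging timescales are ~1e6 times longer.
V_SOUND = 343.0          # speed of sound in air at ~20 C, m/s

for d in (0.1, 1.0, 10.0):   # one-way distances in metres
    round_trip = 2 * d / V_SOUND
    print(f"{d:>4} m: echo returns after {round_trip * 1e3:7.2f} ms")
```

Milliseconds instead of nanoseconds: easily measured with an ordinary microcontroller timer.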
First quantify what you're trying to measure then worry about the hardware.
What distance are you trying to measure?
What is the needed resolution?
Will the proposed method work for this? What resolution do you need on the time measurement? Is this practical to measure using whatever skills you have?
Only after the above questions are answered, start to select the hardware.
I think you're going to find that this approach isn't entirely practical. Maybe it is, but you need to understand the theory first.
To put things in perspective, the round-trip time for an RF signal to/from a target at 100 m is ~600 ns. To measure that sort of distance to the nearest metre would require a time resolution of ~6 ns; i.e. you would need to be able to distinguish between two events only 6 ns apart.
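The required timer resolution follows directly from Δt = 2·Δd / c, independent of the target range (a small illustrative Python sketch, not a specific implementation):

```python
# Required timer resolution for a given distance resolution (RF round trip).
C = 299_792_458  # speed of light, m/s

def time_resolution_ns(distance_resolution_m):
    # Delta-t = 2 * delta-d / c: the echo covers the distance twice.
    return 2 * distance_resolution_m / C * 1e9

print(time_resolution_ns(1.0))  # ~6.7 ns to resolve 1 m, at any range
```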