Assuming you are using an analog input to the controller and that 0 volts is a valid in-range value, a typical way to detect a 'loss of sensor' is to bias the input through a very high resistance toward a 'top of scale' voltage, so-called 'upscale burnout', and have the software logic treat any reading above full scale as an invalid measurement. That means you will have to define in your software what the highest valid value is and bias the input to something higher than that, but still within the A/D measurement span. While the sensor is connected, its relatively low output impedance holds the input at the true measurement; when the sensor opens, the bias resistor pulls the input above the valid range. You will have to research the input impedance of the microcontroller's pin and the sensor's output impedance to determine the bias resistor's value.