This is my first post. Wow.
I am using an IR LED whose beam radiates outwards, reflects off an object, and returns to a phototransistor. The LED is square-wave pulsed at 600µs intervals for 60µs. The transistor collector is tied to Vs = 3.6V, and its emitter is tied to a 68mH inductor (with 200Ω series resistance) and in turn to ground. This prevents direct ambient light from saturating Vce, while giving a nice ringdown pulse across the inductor (the capacitance across the CE region forms a series LCR circuit). My problem is this:
Ambient light (sunlight, incandescent or torch) amplifies this pulse quite significantly. I don't mean 50Hz hum from the incandescent light or DC drift from the torch light; I mean the amplitude of the 60µs pulse itself increases. I don't want this. Possible explanations:
1- The detector is contained within a non-reflective, open-ended enclosure, and I think the ambient light is making the enclosure surfaces more reflective.
2- The ambient light introduces a quiescent current in the transistor of about 200µA, while the IR pulse introduces about 50µA. Fiddling with the 2nd-order differential equations showed that it is the change from the quiescent current to the quiescent-plus-IR current that produces the voltage across the inductor; the initial ambient current should have no bearing on the matter.
3- The small-signal CE resistance (Ic vs Vce) does not change significantly across the different ambient Ic values involved.
4- Some weird physical phenomenon is causing some of the ambient light beams to additively interfere with the IR beams, increasing their amplitude, or the photons are turning into dark energy, instantly appearing at the other end of the universe, and instantly increasing each photon's energy here?
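To sanity-check point 2, here is a rough numerical sketch of the linear series-LCR model. The 68mH and 200Ω are the real values; the 100pF parasitic CE capacitance is just a guess, since I haven't measured it. It steps the photocurrent by 50µA on top of two different quiescent levels and records the peak ringdown at the emitter node:

```python
# Quick numerical check of point 2: in a *linear* series L-C-R model,
# the ringdown amplitude depends only on the step in photocurrent, not
# on the quiescent (ambient) current underneath it.
# L and R are the real circuit values; C is a guessed parasitic value.

L = 68e-3     # inductor, H
R = 200.0     # inductor series resistance, ohms
C = 100e-12   # assumed parasitic CE capacitance, F (hypothetical)

def peak_ring(i_quiescent, i_step, t_pulse=60e-6, t_end=200e-6, dt=2e-9):
    """Forward-Euler integration of the emitter-node voltage.

    The photocurrent source feeds the node; the node is loaded by C to
    ground in parallel with the series R-L branch. Returns the peak
    deviation of the node voltage from its DC baseline."""
    i_l = i_quiescent        # DC equilibrium: all current through R-L
    v = R * i_quiescent      # inductor drops nothing at DC
    baseline = v
    peak = 0.0
    t = 0.0
    while t < t_end:
        i_src = i_quiescent + (i_step if t < t_pulse else 0.0)
        dv = (i_src - i_l) / C     # KCL at the emitter node
        di = (v - R * i_l) / L     # KVL around the R-L branch
        v += dv * dt
        i_l += di * dt
        peak = max(peak, abs(v - baseline))
        t += dt
    return peak

dark = peak_ring(0.0, 50e-6)      # no ambient light
lit = peak_ring(200e-6, 50e-6)    # 200uA ambient quiescent current
print(f"dark: {dark:.3f} V   lit: {lit:.3f} V")
```

In this model the two peaks come out identical, so if the maths is right, whatever is amplifying the real pulse must be something the linear model leaves out (e.g. explanation 1 or 3 above).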
Any help would be much appreciated, thanks.