I started a thread on another forum, but the community hasn't been able to help, so I'm pasting one of my posts below. It's about a differentiator circuit I need help with for detecting harmonic distortion:
Okay, after a few days of experimenting and trying to dial in components, I simulated a differentiator last night. It's a simple one; the schematic is shown in the attachment below.
Let me go over my idea again as clearly as possible, but be specific and rule things out:
First, I will not consider comparing the output to the input at all. Why? Because that is simple and has been done over and over; my concept is different. I will be designing a product to sell on the market. Another guy created a product that only requires connecting alligator clips to the output terminals of a car audio amplifier and running test tones. I want mine to only need a connection to the amp output as well, so it needs to detect harmonics.
How do I want to do it? As I said in the original post, I will be using a differentiator to give me an output that depends on frequency. Only harmonics will be high enough in frequency to trigger it. I am trying to design it so that 20 kHz (or close) and below will not produce output from the differentiator. I will be creating about 4 LED channels to detect harmonics from the head unit and 3 other amplifiers, since the average car audio setup can use up to 3 amps for the different frequency ranges, such as a subwoofer amp, a mid amp, and a treble-range amp.
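To sanity-check component values: an ideal inverting op-amp differentiator has a gain magnitude of |Vout/Vin| = 2*pi*f*R*C, so its output rises linearly with frequency. Here is a quick Python sketch (the R and C values are hypothetical, just to illustrate the idea) comparing the response at a 20 kHz fundamental with the response at the clipping harmonics above it:

```python
import math

def diff_gain(f, r_fb, c_in):
    """Ideal inverting differentiator gain magnitude: |Vout/Vin| = 2*pi*f*R*C."""
    return 2 * math.pi * f * r_fb * c_in

r_fb = 10e3    # feedback resistor (hypothetical value)
c_in = 100e-12 # input capacitor (hypothetical value)

# Gain at the test tones vs. at harmonics of a clipped 20 kHz tone.
for f in (100, 1e3, 20e3, 60e3, 100e3):
    print(f"{f/1e3:6.1f} kHz -> gain {diff_gain(f, r_fb, c_in):.4f}")
```

With these example values the gain at 20 kHz is only about 0.13, while the 3rd harmonic of a clipped 20 kHz tone (60 kHz) sees three times that, which is the separation the comparator stage would exploit.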
My circuit will use VCC and GND for the op-amp power supply so that all of the op-amp's output will be above GND to be friendly with an LED. Well, I will probably send the op-amp output to a comparator to compare the output voltage against a reference voltage, since the circuit won't be perfect and will still produce a little voltage even at 20 kHz. That comparator output will also saturate, so it may work well. I can drive the LED from that comparator. I may use a peak detector (a diode and a capacitor) to turn the AC output from the first op-amp into a DC voltage.
Right now I am just testing and testing to try to get my real-world results close to my simulation results. If I can get the differentiator to saturate, or get close, due to harmonics measured on my oscilloscope, then I will be happy!
A couple more details:
I am first testing on my 80 W x 2 car audio amplifier. I am simply paralleling my circuit onto one of the two channels. I am using a voltage divider to bring the maximum amp output before clipping down to around 800 mV to 1 V. My amp's output before clipping is about 17 V.
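The divider ratio works out like this; the 18 kΩ / 1 kΩ pair is just one hypothetical choice that lands in that 800 mV to 1 V window:

```python
def divider_out(vin, r_top, r_bottom):
    """Unloaded resistive divider output voltage."""
    return vin * r_bottom / (r_top + r_bottom)

# ~17 V before clipping, scaled down through an 18k/1k divider (hypothetical values):
v = divider_out(17.0, 18e3, 1e3)
print(f"{v:.3f} V")  # ~0.895 V
```

Note the divider should be stiff (low enough impedance) relative to the differentiator's input capacitor at the highest frequency of interest, or the loading will shift the ratio.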
My op-amp power supply is 12 V from my head unit's remote turn-on (REM) lead, which doesn't worry me because this circuit will draw practically no power. Given that the op-amp will swing about 9 V, the virtual ground created by my single-supply op-amp configuration should sit at about 4.5 V with "no" differentiator output.
I am testing with a 100 Hz tone, a 1 kHz tone, and a 20 kHz tone. The first two tones should yield about 4.5 V, since the frequency is too low to give any differentiator output. 20 kHz should be the same or close, maybe a tad higher. Since I will have multiple LEDs and sections to detect different frequency ranges, my design limitations open up a little, I think. Also, the 200-300 V maximum requirement I stated in the original post has changed, since not all frequency ranges will be anywhere near that. The higher "treble" range LED section will have a maximum possible power of about 1000 W rms or so, so the voltage swing isn't anywhere near the original requirement. (I know that sounds ridiculous, but I want the device to be compatible, and there are people who actually use that much power just for their tweeters, because they use a whole lot of them in show vehicles.)
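For sizing each channel's input divider, the unclipped sine voltage follows from V_rms = sqrt(P * R_load). Assuming a 4 Ω load (an assumption; tweeter arrays vary a lot), a quick check, which also lines up with the ~17 V seen on the 80 W amp:

```python
import math

def rms_voltage(p_watts, r_ohms):
    """Unclipped sine-wave RMS voltage for a given power into a given load."""
    return math.sqrt(p_watts * r_ohms)

# 4-ohm load assumed for both the 80 W test amp and the 1000 W treble case.
for p in (80, 1000):
    v = rms_voltage(p, 4.0)
    print(f"{p:5d} W -> {v:5.1f} V rms, divider ratio for ~0.9 V: {v/0.9:5.1f}:1")
```

So the 1000 W treble channel needs roughly a 70:1 divider instead of the ~19:1 used on the small amp, but the rest of the circuit can stay identical per channel.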
One last thing: I will want the device to work during music play, unlike the other guy's product that I was referring to earlier.
For now I will be focusing on only the treble LED channel, and I could use some advice for the circuit. The circuit I am testing now is very simple; it is shown below.