Issues with ML Pattern Recognition After Bandpass Filtering


TarekSY

New Member
Hello everyone,

We've been working on a machine learning project for pattern recognition, using time-domain features such as kurtosis, mean, standard deviation, variance, skewness, and peak-to-peak values.
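
For reference, the per-window feature extraction looks roughly like this (a simplified sketch; the actual windowing and data loading are omitted):

Code:
import numpy as np
from scipy.stats import kurtosis, skew

def time_domain_features(window):
    """Compute the time-domain features fed to the classifier.

    `window` is a 1-D NumPy array holding one segment of the signal.
    """
    return {
        "mean": np.mean(window),
        "std": np.std(window),
        "variance": np.var(window),
        "kurtosis": kurtosis(window),   # Fisher (excess) kurtosis by default
        "skewness": skew(window),
        "peak_to_peak": np.ptp(window),
    }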

Background:

Initially, we trained our model on data that had been high-pass filtered at 1 kHz. The results were satisfactory.
Upon performing a spectral analysis last week, we discovered that our region of interest lay between 1 kHz and 3 kHz.
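
For concreteness, the kind of spectral check we mean is a PSD estimate along these lines, sketched here with Welch's method; the sampling rate `fs` below is only a placeholder for our actual rate:

Code:
import matplotlib.pyplot as plt
from scipy.signal import welch

fs = 25_000  # placeholder: our actual sampling rate in Hz

def show_psd(x, fs=fs):
    """Welch PSD estimate of one recording, plotted on a log scale."""
    f, pxx = welch(x, fs=fs, nperseg=4096)
    plt.semilogy(f, pxx)
    plt.xlabel("Frequency (Hz)")
    plt.ylabel("PSD")
    plt.show()
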
Issue:
When testing our pattern recognition system this week, the model's performance deteriorated significantly. Analyzing the data revealed a strong signal component at 8 kHz, which passes the 1 kHz high-pass filter even though it lies well outside our 1 kHz to 3 kHz region of interest.

Steps Taken:

We decided to apply a bandpass filter between 1 kHz and 3 kHz to focus on our identified region of interest, expecting the time-domain features to be more relevant (a sketch of the filtering step follows this list).
We trained a new model using the bandpass-filtered data.
However, the new model's recognition performance was still unsatisfactory.
As an additional experiment:

We applied the 1 kHz to 3 kHz bandpass filter to the dataset that had originally been high-pass filtered at 1 kHz (the data used to train the first model).
Yet again, we ran into recognition performance issues.
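
For completeness, the filtering step is roughly the following (a sketch: the sampling rate, the filter order, and the choice of a zero-phase Butterworth filter stand in for our actual settings). The same filter is applied to every recording before the feature extraction shown above.

Code:
from scipy.signal import butter, sosfiltfilt

fs = 25_000  # placeholder: our actual sampling rate in Hz

# 4th-order Butterworth bandpass between 1 kHz and 3 kHz,
# applied forward-backward for zero phase distortion
sos = butter(4, [1_000, 3_000], btype="bandpass", fs=fs, output="sos")

def bandpass(x):
    """Apply the 1-3 kHz bandpass to one recording before feature extraction."""
    return sosfiltfilt(sos, x)
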
We're somewhat puzzled as to why our ML system is underperforming after these filtering operations. Any insights or suggestions would be highly appreciated.
 