I have a set of signals (preprocessed trials of an sEMG signal, to be precise), each stored in a vector. My goal is to detect and remove any outlier signals from this set.
Moreover, I don't know a priori any model that represents the trend of this set; I can only estimate it from the trials I have.
Consider now an example of 6 trials (see image below):
I would like the trial in orange to be marked as an outlier.
So far I have worked in Matlab, using the pointwise median of all trials plus or minus the mean absolute deviation: I compute the median across the signals point by point with the Matlab function median, and then the mean absolute deviation with the Matlab function mad. A signal is then considered an outlier if more than, say, 50% of its samples fall outside the "safe zone" formed by the median plus or minus one or two times the mean absolute deviation (see image below).
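For reference, here is a sketch of that procedure in Python/NumPy (the function name `flag_outlier_trials`, the deviation taken about the median rather than the mean, and the default thresholds are my own choices, not part of the original Matlab code):

```python
import numpy as np

def flag_outlier_trials(trials, k=2.0, frac=0.5):
    """Flag trials as outliers using a pointwise median +/- k * MAD band.

    trials : 2-D array, shape (n_trials, n_samples)
    k      : band half-width, in units of the mean absolute deviation
    frac   : a trial is flagged if more than this fraction of its samples
             falls outside the band
    """
    trials = np.asarray(trials, dtype=float)
    med = np.median(trials, axis=0)              # pointwise median across trials
    mad = np.mean(np.abs(trials - med), axis=0)  # mean absolute deviation (about the median here)
    outside = np.abs(trials - med) > k * mad     # samples outside the "safe zone"
    return outside.mean(axis=1) > frac           # boolean mask, one entry per trial
```

Note that Matlab's `mad(X)` by default measures deviation about the mean (pass `1` as the second argument for deviation about the median); the sketch above uses the median as the centre in both steps.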
Do you know a better method for this task?
EDIT:
A further refinement of the method above is to use the Matlab function alignsignals to align the 6 trials before computing the median, which improves the precision of the detection (example in the image below):
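A rough Python stand-in for that alignment step, under the assumption that alignsignals is doing cross-correlation-based delay estimation (the function name `align_to_reference`, the choice of the first trial as reference, and the zero-padding of shifted samples are all my own assumptions):

```python
import numpy as np
from scipy.signal import correlate

def align_to_reference(trials, ref_index=0):
    """Align each trial to a reference trial by the lag that maximises
    their cross-correlation, shifting with zero padding.

    trials : 2-D array, shape (n_trials, n_samples)
    """
    trials = np.asarray(trials, dtype=float)
    ref = trials[ref_index]
    n = trials.shape[1]
    aligned = np.zeros_like(trials)
    for i, tr in enumerate(trials):
        xc = correlate(ref, tr, mode="full")
        lag = int(np.argmax(xc)) - (n - 1)  # positive lag: trial leads the reference
        if lag >= 0:
            aligned[i, lag:] = tr[:n - lag]  # shift trial right by `lag` samples
        else:
            aligned[i, :n + lag] = tr[-lag:]  # shift trial left by `-lag` samples
    return aligned
```

After alignment, the same median-plus-MAD test can be applied to the aligned trials, which should tighten the safe zone when the trials differ mainly by a time shift.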