I have time series data measured in milliseconds (ms) (69300 rows), and I want to apply low-pass, high-pass, and band-pass Butterworth filters.
My approach is the following:
- Convert the record length in ms into a frequency in Hz
- Find the Nyquist frequency, which is sample rate/2 (I take the converted Hz value as the sample rate)
- Generate a sinusoid plus noise
- Calculate the cutoff frequencies for the low-pass and high-pass filters (0.1 of the total Hz for the low-pass and 0.25 of the total Hz for the high-pass, each divided by the Nyquist value)
- For the band-pass filter, use the difference of the two cutoff frequencies
- Build the nth-order filters
- Pass the sinusoid-plus-noise signal through the filters
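To make the unit bookkeeping in these steps concrete, here is a minimal sketch of the frequency arithmetic using the same numbers as the snippet below (the variable names `f_total`, `W_low`, and `W_high` are mine, not from the original code):

```r
# Frequency bookkeeping for a 69300 ms record, following the steps above.
total_ms  <- 69300
total_s   <- total_ms / 1000        # 69.3 s record length
f_total   <- 1 / total_s            # "total Hz" of the record: ~0.01443 Hz
nyquist   <- f_total / 2            # treating f_total as the sample rate
f_low_hz  <- 0.10 * f_total         # low-pass cutoff in Hz
f_high_hz <- 0.25 * f_total         # high-pass cutoff in Hz
W_low     <- f_low_hz  / nyquist    # normalized cutoff for butter(): 0.2
W_high    <- f_high_hz / nyquist    # normalized cutoff for butter(): 0.5
```

Note that `butter()` in the `signal` package expects cutoffs normalized so that 1 corresponds to the Nyquist frequency, which is what the division by `nyquist` achieves.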
Below is a code snippet I made using R:
library(signal)

# 69300 ms = 69.3 s, and 1/69.3 s = 0.014430014430014 Hz
x <- 1:69300
nyquist <- 0.014430014430014/2  # sampling rate/2
# 0.014430014430014 Hz sinusoid plus noise; RF is the time series metric
x1 <- sin(2*pi*RF*0.014430014430014) + 0.25*rnorm(length(RF))
f_low <- 0.001443001/nyquist    # 0.1 of total Hz divided by Nyquist
f_high <- 0.003607504/nyquist   # 0.25 of total Hz divided by Nyquist
bf_low <- butter(4, f_low, type = "low")
bf_high <- butter(4, f_high, type = "high")
# band edges must be passed as a vector c(f_low, f_high), not as f_high - f_low
bf_pass <- butter(4, c(f_low, f_high), type = "pass")
b <- filter(bf_low, x1)
b1 <- filter(bf_high, x1)
b2 <- filter(bf_pass, x1)
Is this the correct approach? And should I apply the filters to the metric itself instead of to the sinusoid plus noise?