I have a signal consisting of samples timestamped in microseconds. The signal's frequency is 200 Hz. In the time domain, the signal looks like this:
and the spectrum (computed with a Fourier transform) looks like this (the figure shows the spectrum only up to 3000 Hz for clarity):
The dominant peak should be at 200 Hz, but because my signal has a nonlinear trend in the time domain, the dominant frequency shows up at 10 Hz, which is wrong. So, how can I detrend this kind of signal before passing it through the Fourier transform? I tried simple things such as subtracting subsequent samples (first differencing) and removing a fitted trend with numpy's polyfit, but it didn't help. Thanks!
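For reference, here is roughly the polyfit detrending I attempted, as a minimal sketch. Synthetic data stands in for my real signal, and the sample rate, variable names, and polynomial degree are placeholders; on my actual data this did not recover the 200 Hz peak:

    import numpy as np

    # Synthetic stand-in: a 200 Hz sine on top of a slow nonlinear trend,
    # with timestamps in microseconds (sample rate assumed to be 2 kHz here)
    fs = 2000.0                                   # Hz (placeholder)
    t_us = np.arange(0, 2_000_000, 1e6 / fs)      # timestamps in microseconds
    t_s = t_us * 1e-6                             # convert to seconds
    x = np.sin(2 * np.pi * 200 * t_s) + 5 * t_s**2 + 3 * t_s  # sine + trend

    # Detrend: fit a low-order polynomial and subtract it (degree is a guess)
    coeffs = np.polyfit(t_s, x, deg=2)
    x_detrended = x - np.polyval(coeffs, t_s)

    # Spectrum of the detrended signal
    spectrum = np.abs(np.fft.rfft(x_detrended))
    freqs = np.fft.rfftfreq(len(x_detrended), d=1.0 / fs)
    print(f"Dominant frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")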