
My dataset (patient No., time in milliseconds, x, y, z, label):

1,15,70,39,-970,0
1,31,70,39,-970,0
1,46,60,49,-960,0
1,62,60,49,-960,0
1,78,50,39,-960,0
1,93,50,39,-960,0
.
.
.

I am trying to use the spectrogram of the x-axis signal in the preprocessing stage, so that it can then be used as the input data for a machine learning model instead of the original raw x-axis data.

Here is what I tried:

import matplotlib.pyplot as plt
import numpy as np

dt = 0.0005
t = np.arange(0.0, 20.0, dt)

data = np.loadtxt("trainingdataset.txt", delimiter=",")
x = data[:, 2]    # select the x-axis column

NFFT = 1024       # the length of the windowing segments
Fs = int(1.0/dt)  # the sampling frequency

ax1 = plt.subplot(211)
plt.plot(x)
plt.subplot(212, sharex=ax1)
Pxx, freqs, bins, im = plt.specgram(x, NFFT=NFFT, Fs=Fs, noverlap=900)
plt.show()

It gives me the following warning:

Warning (from warnings module):
  File "C:\Users\hadeer.elziaat\AppData\Local\Programs\Python\Python36\lib\site-packages\matplotlib\axes\_axes.py", line 7221
    Z = 10. * np.log10(spec)
RuntimeWarning: divide by zero encountered in log10
Hadeer El-Zayat

1 Answer


If x is your signal, and you can estimate your sampling rate from the time/millisecond column (e.g. from the mean time step), then you can probably use the librosa library to compute a mel-spectrogram with librosa.feature.melspectrogram; librosa also provides other utilities for computing signal-related features.
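Here is a minimal sketch of that idea, assuming the column layout from the question (patient No., time in milliseconds, x, y, z, label) and a sampling rate estimated from the median time step; the n_fft, hop_length and n_mels values are assumptions you would tune for your data.

import numpy as np
import librosa

data = np.loadtxt("trainingdataset.txt", delimiter=",")
t_ms = data[:, 1]   # time in milliseconds
x = data[:, 2]      # x-axis signal

# estimate the sampling rate in Hz from the median time step (ms)
sr = 1000.0 / np.median(np.diff(t_ms))

# mel-spectrogram of the x-axis signal (window and hop sizes are assumptions)
S = librosa.feature.melspectrogram(y=x, sr=sr, n_fft=256, hop_length=64, n_mels=32)

# convert power to dB; power_to_db clamps tiny values before taking log10,
# which also avoids the divide-by-zero warning you get from plt.specgram
S_db = librosa.power_to_db(S, ref=np.max)

print(S_db.shape)   # (n_mels, frames) matrix to feed into your model

The resulting S_db matrix (mel bands × time frames) can be flattened or used directly as the input features for your model.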

Diego Aguado