
I am trying to normalize my sensor data set. I would get perfect results if I had the min and max values of the data set:

(sensor-sensor.min())/(sensor.max()-sensor.min())

However, my sensor data arrives in real time, so it is not possible to know the min and max of the full data set in advance. I haven't had much luck finding a pre-existing algorithm (in Python) for this, but perhaps I'm just not looking in the right places. Does anyone know of one? Or have any ideas?
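
For reference, here is the batch formula above as a runnable snippet; this is a minimal sketch that assumes sensor is a pandas Series (the .min()/.max() calls suggest pandas or NumPy), with made-up example readings:

import pandas as pd

# Batch min-max normalization: requires the whole data set up front.
sensor = pd.Series([412, 508, 397, 623, 455])
normalized = (sensor - sensor.min()) / (sensor.max() - sensor.min())
print(normalized)  # every value scaled into [0, 1]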

  • How about normalizing it using the min and max of the data received so far? The accuracy will improve over time. You could also normalize using the sensor's own range, i.e. some sensors read integers in the range [0, 1023]. – maciek97x Sep 30 '22 at 12:05
  • I have already tried this method; the range of the sensor values keeps changing over time. – Jason Oct 01 '22 at 21:43
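
A minimal sketch of the running-min/max approach from the first comment, plus a sliding-window variant that lets the range adapt when it drifts, as Jason describes. The class names and window size are illustrative, not from any existing library:

from collections import deque

class StreamingMinMax:
    # Normalize each sample using the min/max observed so far.
    def __init__(self):
        self.lo = float('inf')
        self.hi = float('-inf')

    def update(self, x):
        self.lo = min(self.lo, x)
        self.hi = max(self.hi, x)
        if self.hi == self.lo:
            return 0.0  # range is undefined until two distinct values arrive
        return (x - self.lo) / (self.hi - self.lo)

class WindowedMinMax:
    # Same idea, but min/max over only the most recent samples,
    # so the scale can follow a drifting sensor range.
    def __init__(self, size=100):
        self.window = deque(maxlen=size)

    def update(self, x):
        self.window.append(x)
        lo, hi = min(self.window), max(self.window)
        return 0.0 if hi == lo else (x - lo) / (hi - lo)

norm = StreamingMinMax()
for reading in [412, 508, 397, 623, 455]:
    print(norm.update(reading))

Note the trade-off: early outputs from the running version are rough because the observed range is still narrow, and with the windowed variant the same raw value can map to different normalized values as the window moves.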

0 Answers