
I am using a Polar Equine heart beat detector along with a microcontroller. The Polar device emits one pulse per heartbeat, but its output is not stable. If the electrode is not placed perfectly, the output pulses erratically at a high rate, producing an inflated heart-rate reading. The erroneous pulsing falls within the normal beats-per-minute range, so it is difficult to tell whether a reading is genuine or not. These erroneous pulses may also settle down to the correct reading after some time.

However, that can take around 10 s, and we can't wait that long every time, since in most cases we get a correct reading right from the start.

Is there a good algorithm for computing beats per minute that can intelligently reject the erroneous pulses, one that locks onto a consistent pulse rhythm rather than blindly counting every pulse?
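For reference, the naive baseline just converts each inter-beat interval into an instantaneous BPM value, with no filtering at all. A minimal sketch (the function name and the use of timestamps in seconds are my own choices, not from the device's API):

```python
def instantaneous_bpm(pulse_times_s):
    """Convert a list of pulse timestamps (in seconds) into one
    instantaneous BPM value per inter-beat interval.
    No error rejection: a spurious short interval produces a
    spuriously high BPM, which is exactly the problem described."""
    bpms = []
    for prev, cur in zip(pulse_times_s, pulse_times_s[1:]):
        interval = cur - prev
        if interval > 0:  # guard against duplicate/out-of-order pulses
            bpms.append(60.0 / interval)
    return bpms
```

For example, pulses at 0 s, 1 s, and 2 s give two intervals of 1 s each, i.e. 60 BPM for both; a noisy 0.3 s interval would instead show up as 200 BPM.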

0xAB1E
  • Sounds like you want some type of smoothing, or throwing out values that change too quickly. One way would be to keep a running average. So if you have one minute of 90 bpm (i.e. 3 pulses every 2 seconds) and all of a sudden you start getting 3 pulses per second, you probably have an error. But the running average will attenuate that. You'll still get an increase, but not a huge spike. – Jim Mischel Dec 12 '14 at 04:52
  • I am using one like that now: I wait for the first 5 beats and then do continuous averaging for each pulse received. But if the initial values contain an error, it takes around 30 beats for the average to return to the actual value, and until then the displayed value is wrong. If we could eliminate the erroneous initial values completely and then restart the running average, it wouldn't take that long. A good algorithm for that would be very useful – 0xAB1E Dec 12 '14 at 05:16
  • The problem is that there's no reliable way to eliminate the error values if you have no past data to compare it with. Whereas you can probably throw out values that are above 220 or so, and also values that are below 30 in most cases, there's no way, without additional information, that you can discard an initial value of 200. You could *potentially* get rid of the erroneous values if they're significantly larger than the following values. So if the first 5 beats are 200 and then the thing settles down to give you values around 100, it's likely that you can safely discard the initial values. – Jim Mischel Dec 15 '14 at 16:55
  • Is there any standard algorithm that aligns to a stable rhythm instead of just averaging, and that gives more weight to the last stable rhythm? For example, if the last 5 beats, as suggested by Jim Mischel, are found to be stable, that rhythm would be selected automatically. Is there a general standard algorithm for that? – 0xAB1E Dec 16 '14 at 08:15
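The comments converge on two ideas: reject intervals that disagree with the recent rhythm, and only report a rate once the rhythm is stable. Below is a hedged sketch combining both with a median filter over recent inter-beat intervals. This is not a standard named algorithm; the window size (5 beats), the 20% tolerance, and the class name are illustrative assumptions that would need tuning against real data:

```python
from collections import deque
from statistics import median

class RhythmLock:
    """Report BPM only when recent inter-beat intervals agree with
    their median; otherwise stay silent rather than show a bad value.
    A sketch under assumed parameters, not a validated medical filter."""

    def __init__(self, window=5, tolerance=0.2):
        self.intervals = deque(maxlen=window)  # most recent intervals (s)
        self.window = window
        self.tolerance = tolerance             # fractional deviation allowed

    def add_interval(self, interval_s):
        """Feed one inter-beat interval in seconds.
        Returns a BPM estimate, or None while no stable rhythm exists."""
        self.intervals.append(interval_s)
        if len(self.intervals) < self.window:
            return None  # not enough history yet
        med = median(self.intervals)
        # Keep only intervals close to the median; a loose electrode
        # scatters intervals away from the true rhythm, so they drop out.
        good = [i for i in self.intervals
                if abs(i - med) <= self.tolerance * med]
        if len(good) < self.window - 1:
            return None  # rhythm not stable enough to trust
        return 60.0 / (sum(good) / len(good))
```

With a steady 1.0 s interval stream this locks to 60 BPM after 5 beats, and a single spurious 0.3 s interval is excluded from the average instead of spiking the reading; because the median tracks the majority of intervals, the filter also re-locks automatically if the true rhythm changes.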

0 Answers