
I am thinking about an efficient way of implementing a threshold check (with optional hysteresis and delay).

Requirement/situation:

I want to check a non-negative value (e.g. an absolute pressure) against an upper threshold and a lower threshold, resulting in an over pressure error bit or an under pressure error bit respectively. There are going to be multiple set and reset thresholds. (I want to organize these in an array; a rough sketch of that follows after the alternatives below.)

Considering hysteresis, there shall be different set and reset values: e.g. for the over pressure case a set threshold p_set (the error bit gets set) and a reset threshold p_reset <= p_set. The same applies for the under pressure case, but here p_reset >= p_set, which causes the comparison operator to invert:

// Over pressure detection
if (pressure > overPressureSetThld){
  // Over pressure -> set error bit
}
else if (pressure < overPressureResetThld){
  // Reset over pressure error bit
}

// Under pressure detection
if (pressure < underPressureSetThld){              // Inverted comparison operator
  // Under pressure -> set error bit
}
else if (pressure > underPressureResetThld){       // Inverted comparison operator
  // Reset under pressure error bit
}

Alternatives:

Thinking about this, I see two alternatives:

  1. Implement it straightforward like above -> bigger code size / "duplicate" code (especially with a delay considered)
  2. Compare relative values (subtraction and abs; requires a reference pressure) -> reduced code size because only one if-else-if is needed, but higher runtime load, e.g.:

    if (abs(pressure - REFERENCE_PRESSURE) > relSetThld){        // Threshold relative to reference and positive
      // Set over/under pressure error bit
    }
    else if (abs(pressure - REFERENCE_PRESSURE) < relResetThld){ // Threshold relative to reference and positive
      // Reset over/under pressure error bit
    }
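
Independent of which alternative I pick, this is roughly how I picture organizing several such checks (with hysteresis and a simple delay counter) in an array. This is only a sketch; all struct, field and function names as well as the numbers are placeholders:

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

// Placeholder type - one entry per monitored threshold pair
typedef struct {
  uint16_t setThld;     // Error bit gets set when the set comparison triggers
  uint16_t resetThld;   // Error bit gets reset when the reset comparison triggers
  uint16_t delayTicks;  // Number of consecutive calls the set condition must hold
  uint16_t counter;     // Internal delay counter
  bool     checkAbove;  // true: over pressure style (>), false: under pressure style (<)
  bool     errorBit;    // Current error bit = hysteresis state
} ThresholdCheck_t;

static void ThresholdCheck_Update(ThresholdCheck_t *chk, uint16_t pressure)
{
  bool setCond, resetCond;

  if (chk->checkAbove) {
    setCond   = (pressure > chk->setThld);
    resetCond = (pressure < chk->resetThld);
  } else {                                   // Inverted comparisons for the under pressure case
    setCond   = (pressure < chk->setThld);
    resetCond = (pressure > chk->resetThld);
  }

  if (setCond) {
    if (chk->counter < chk->delayTicks) {
      chk->counter++;                        // Condition present but delay not yet elapsed
    } else {
      chk->errorBit = true;                  // Condition persisted long enough -> set error bit
    }
  } else {
    chk->counter = 0u;                       // Set condition gone -> restart delay
    if (resetCond) {
      chk->errorBit = false;                 // Reset comparison triggered -> clear error bit
    }
  }
}

// Example table - numbers are made up
static ThresholdCheck_t pressureChecks[] = {
  { .setThld = 1200u, .resetThld = 1150u, .delayTicks = 10u, .checkAbove = true  },  // over pressure
  { .setThld =  800u, .resetThld =  850u, .delayTicks = 10u, .checkAbove = false },  // under pressure
};

// Called cyclically with the latest pressure sample
void CheckAllPressureThresholds(uint16_t pressure)
{
  for (size_t i = 0u; i < sizeof pressureChecks / sizeof pressureChecks[0]; i++) {
    ThresholdCheck_Update(&pressureChecks[i], pressure);
  }
}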

Question:

I tend to use alternative 2, but I'm asking myself (and you) if there is a better way to do this. Suggestions welcome!

Best Christoph

  • Is code size really that critical? We are talking about +/- 20 bytes or something. That shouldn't matter in 99% of all applications. – Lundin Sep 27 '19 at 07:47
  • Have you enabled optimizations? Enabling even the lower optimization levels will likely have a far bigger impact than a manual attempt at optimizing if statements. – user694733 Sep 27 '19 at 08:25
  • @Lundin: Kind of yes, because there might be multiple thresholds plus multiple pressures. – ElvisIsAlive Sep 27 '19 at 08:33
  • @user694733: You're right, but I was wondering about the C code itself. – ElvisIsAlive Sep 27 '19 at 08:33
  • Hysteresis cycles imply saving the triggering state, so you apply only the comparison that belongs to the current state and not the one corresponding to the non-triggered state. – Luis Colorado Oct 02 '19 at 06:08
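
For illustration, a minimal sketch of the state-based check described in that last comment, shown for the over pressure case only (the flag name and the example threshold values are assumptions, not from the question):

#include <stdbool.h>
#include <stdint.h>

static const uint16_t overPressureSetThld   = 1200u;  // Example values only
static const uint16_t overPressureResetThld = 1150u;

static bool overPressureError = false;                // Saved trigger state (name assumed)

void UpdateOverPressure(uint16_t pressure)
{
  if (!overPressureError) {
    // Not triggered: only the set threshold is relevant
    if (pressure > overPressureSetThld) {
      overPressureError = true;
    }
  } else {
    // Triggered: only the reset threshold is relevant
    if (pressure < overPressureResetThld) {
      overPressureError = false;
    }
  }
}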

1 Answer


The most efficient way to implement something like checking several levels, considering execution speed, is to form a "binary search"/BST out of the various levels. Which in practice means writing an if-else chain:

if(val < level_mid) {
  if(val < level_low) {
    // ...
  } else {
    // ...
  }
} else {
  if(val < level_high) {
    // ...
  } else {
    // ...
  }
}

You can't really beat the above in terms of performance and branches. In terms of readability/maintenance, the optimal code would rather look like:

if(val < level_lowest) {
  // ...
} else if(val < level_low) {
  // ...
} else if(val < level_mid) {
  // ...
} else if(val < level_high) {
  // ...
}

This code is fairly effective too, but much more readable/maintainable than the "binary search" alternative. As always, disassemble and see for yourself.

But then of course, manually optimizing code without a specific system in mind isn't very sensible. Suppose for example that you are using an 8 or 16 bit CPU. All that will matter in terms of performance in that case is the size of the integer types involved. Similarly, using floating point types on an MCU without an FPU will lead to highly inefficient code.
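
As a sketch of that point (not from the answer itself): on such a target the thresholds can be pre-converted to raw ADC counts once, so every cyclic check stays in 16-bit integer arithmetic instead of converting each sample to a float in physical units.

#include <stdbool.h>
#include <stdint.h>

// Thresholds pre-converted to raw ADC counts at compile time (values assumed)
#define OVER_PRESSURE_SET_RAW    ((uint16_t)3000u)
#define OVER_PRESSURE_RESET_RAW  ((uint16_t)2900u)

static bool overPressureError;

void CheckOverPressureRaw(uint16_t pressureRaw)
{
  // Pure 16-bit integer comparisons - cheap even on an 8/16 bit CPU without an FPU
  if (pressureRaw > OVER_PRESSURE_SET_RAW) {
    overPressureError = true;
  } else if (pressureRaw < OVER_PRESSURE_RESET_RAW) {
    overPressureError = false;
  }
}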

If optimizing for code size, you should be looking at entirely different aspects than the amount of branches. Getting rid of bloated library calls being the #1 thing (I'm looking at you, stdio.h).

Doing something like taking absolute values just for the sake of reducing code size is nonsensical - it is not at all obvious that it will reduce code size. What it will certainly do however, is to increase complexity. Which in turn tends to lead to larger code and more bugs. Apply the KISS principle.

Lundin
  • Good points! Considering the binary search, it will not be applicable in my case because there will be multiple thresholds, e.g. for over pressure, resulting in 4 independent thresholds to be considered. The reset threshold of range 2 might be smaller than the reset threshold of range 1, so the cascade you've suggested won't work here. Regarding KISS you're probably right ^^ I figured it might be "overengineered" and the difference in outcome might not be significant. I'm just curious. – ElvisIsAlive Sep 27 '19 at 08:42
  • @ElvisIsAlive Then, obviously, you will need to run different code for range 1 and 2. Or the same code with different parameters, if applicable. – Lundin Sep 27 '19 at 08:45
  • The comparison only has five possible outcomes (six if you allow the middle thresholds to overlap); the two bits only have three possible states. Enumerate, and construct an action table (eight? possible outcomes) [overengineered?] – joop Sep 27 '19 at 13:37