In the book 'Realtime Programming in Java' there is an example which I need help with:
To avoid an explosion, there is a deadline by which the pump must be switched off once the methane level exceeds a critical threshold. The deadline (D) is related to:
- (T) methane sampling period
- (R) rate at which methane can accumulate
- (M) margin of safety between the level of methane regarded as critical and the level at which it explodes.

These quantities must satisfy R(D+T) < M.
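
If it helps, here is a small check I wrote to try the constraint out. Only T = 80ms and the 200ms deadline come from the book's example; the values for R and M are made-up placeholders, since the text does not give them here:

```java
/**
 * Sanity check of the safety constraint R * (D + T) < M.
 * R and M below are placeholder values; only T = 80 ms and the
 * overall deadline requirement D = 200 ms come from the example.
 */
public class MethaneDeadlineCheck {
    public static void main(String[] args) {
        double samplingPeriodMs = 80.0;   // T: methane sampling period
        double deadlineMs = 200.0;        // D: deadline to switch the pump off
        double accumulationRate = 0.01;   // R: assumed rise in level per ms (placeholder)
        double safetyMargin = 5.0;        // M: assumed margin between critical and explosive (placeholder)

        // Worst case: methane crosses the critical level just after a sample,
        // so up to D + T ms may pass before the pump is actually off.
        double worstCaseRise = accumulationRate * (deadlineMs + samplingPeriodMs);

        System.out.printf("Worst-case rise R*(D+T) = %.2f, margin M = %.2f -> %s%n",
                worstCaseRise, safetyMargin,
                worstCaseRise < safetyMargin ? "safe" : "NOT safe");
    }
}
```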
Because methane pockets may cause levels to rise rapidly, a deadline requirement of 200ms is assumed. This can be met by setting the period of the methane sensor to 80ms, with a deadline of 30ms. The displacement between two consecutive readings is then at least 50ms.
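
Here is my attempt at the arithmetic behind the 50ms figure, assuming each sample may complete anywhere between its release and its 30ms deadline; please correct me if I have misread it:

```java
/**
 * My reading of why consecutive readings can be only 50 ms apart:
 * each sampling job is released every 80 ms and may finish
 * anywhere up to 30 ms after its release.
 */
public class ReadingSeparation {
    public static void main(String[] args) {
        long periodMs = 80;         // T: sensor sampling period
        long sensorDeadlineMs = 30; // deadline of the sensor task itself

        // Closest case: reading n completes at its deadline (30 ms after release),
        // reading n+1 completes immediately at its release (80 ms after the previous release).
        long minSeparation = periodMs - sensorDeadlineMs;   // 50 ms

        // Farthest case: reading n completes immediately,
        // reading n+1 completes at its deadline.
        long maxSeparation = periodMs + sensorDeadlineMs;   // 110 ms

        System.out.println("Separation between readings: "
                + minSeparation + " ms to " + maxSeparation + " ms");
    }
}
```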
Can someone explain this to me, please?