I am trying to write some code to determine the size of boundary layers in my data.
The data is arranged in a table like this (but there are many more rows):
| Depth (um) | Replicate 1 (O2 Sat %) | Replicate 2 (O2 Sat %) | Replicate 3 (O2 Sat %) |
|---|---|---|---|
| 0 | 10 | 11 | 11 |
| -100 | 11 | 11 | 12 |
| -200 | 13 | 12 | 11 |
| -300 | 14 | 13 | 14 |
| -400 | 15 | 15 | 15 |
| -500 | 16 | 16 | 16 |
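For reference, this is roughly how I have the data loaded at the moment (I am using pandas; the column names are just what I happen to have in my file):

```python
import pandas as pd

# The example profile from the table above (depths in um, values in % O2
# saturation); my real table has many more rows and the same column layout.
df = pd.DataFrame({
    "Depth (um)": [0, -100, -200, -300, -400, -500],
    "Replicate 1 (O2 Sat %)": [10, 11, 13, 14, 15, 16],
    "Replicate 2 (O2 Sat %)": [11, 11, 12, 13, 15, 16],
    "Replicate 3 (O2 Sat %)": [11, 12, 11, 14, 15, 16],
})
```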
For each replicate I want to find the size of the boundary layer. I am defining the boundary layer as the distance above the surface (0 um) over which the change between subsequent measurements is <5% per 100 um. So I need to find the depth of the first row at which this definition is no longer met. The check also needs to move through the rows as a sliding window (rows 1-4, then 2-5, then 3-6, and so on), not in non-overlapping blocks (rows 1-4, then 5-8, etc.), so that I catch the first time the definition fails. I would like to detect this for each replicate separately.
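To make the intended logic clearer, here is a rough sketch of what I imagine the check looking like (plain Python; the function name, the 4-row window size, and the way the 5% rule is applied between subsequent rows inside each window are all just my guesses at how to express the definition above, so please correct me if this is not a sensible way to set it up):

```python
def first_break_depth(depths, values, window=4, threshold=5.0):
    """Return the depth of the first row where the <5%-per-100um rule fails.

    Slides a window of `window` consecutive rows (rows 1-4, then 2-5, then
    3-6, ...) and checks the percent change between each pair of subsequent
    measurements inside the window. Returns None if the rule never fails.
    """
    for start in range(len(values) - window + 1):
        for i in range(start, start + window - 1):
            pct_change = abs(values[i + 1] - values[i]) / values[i] * 100.0
            if pct_change >= threshold:
                # First row at which the boundary layer definition is not met
                return depths[i + 1]
    return None

# e.g. Replicate 1 from the table above
depths = [0, -100, -200, -300, -400, -500]
rep1 = [10, 11, 13, 14, 15, 16]
print(first_break_depth(depths, rep1))  # prints -100 for this toy data
```

The overlapping windows deliberately re-check some pairs; that is my way of making sure the scan steps one row at a time (1-4, 2-5, 3-6, ...) instead of jumping in blocks (1-4, 5-8, ...). I would then run this once per replicate, e.g. `for col in df.columns[1:]: print(col, first_break_depth(df["Depth (um)"].tolist(), df[col].tolist()))` with the DataFrame above.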
Beyond this, I have tried searching for ways to approach the problem, but I am not sure I am asking the right questions because I do not know which kind of function to start with. I assume it is some type of threshold or cutoff function, but my searching has not gotten me anywhere, so I thought I would ask for ideas on how to proceed.
I would appreciate any advice or ideas on how to get started. Thank you in advance for your time.