I have an Arduino sketch that reads values from a sensor once every sample period T (in this example, T is approximately 10 ms). After every read, I want to do a bunch of computations on the sensor values. Before designing an algorithm, I want to get a realistic feel for my effective computational budget. Here is the code outline:
void loop() {
  // read sensor measurements into variables
  // do computations
  delay(SAMPLE_PERIOD);
}
What I want to do is insert some sort of test that could realistically assess how many multiplications, adds, and loads I could afford to run in the time period before the next loop starts. My first idea was to insert something like:
for (int i = 0; i < N; i++) {
  // load operation
  // multiply operation
  // add operation
  // store operation
  // or some mix of the above
}
I would increase N to see how many operations fit in one sample period, and use a counter/timer to check whether the loop took longer than T.
What do you think?
For reference, the microcontroller board is the Teensy 3.1: https://www.pjrc.com/teensy/teensy31.html#specs