
I think the answer may be CPU clock speed, but the only way I can really frame this question is with an odd scenario I thought about while on holiday.

Let's say there is a Raspberry Pi on the side of the road counting how many cars have passed: a focused IR or laser beam is broken every time a car (or any opaque mass) passes by.

If the code reads something like:

```python
while lightNotBroken():
    pass          # beam intact: keep polling
carCount += 1     # beam broken: a car passed
```
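A minimal runnable sketch of that polling approach, assuming the sensor is wired to BCM pin 17 and reads HIGH while the beam is intact (the pin number and polarity are illustrative and depend on the actual sensor circuit):

```python
import RPi.GPIO as GPIO

SENSOR_PIN = 17  # hypothetical BCM pin; depends on your wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(SENSOR_PIN, GPIO.IN)

carCount = 0
try:
    while True:
        # Busy-wait while the beam is intact (HIGH here, by assumption).
        while GPIO.input(SENSOR_PIN) == GPIO.HIGH:
            pass
        carCount += 1
        # Wait for the beam to be restored so one car isn't counted twice.
        while GPIO.input(SENSOR_PIN) == GPIO.LOW:
            pass
finally:
    GPIO.cleanup()
```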

The loop happily repeats itself very fast, great. But surely there is a theoretical speed (unrealistically fast) at which a car could cheat the program and not get counted, because the loop wasn't executed fast enough to notice the broken beam of light.
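Roughly quantifying the idea: the beam stays blocked for about (car length) / (car speed), and the loop must sample the sensor at least once inside that window. A back-of-the-envelope check, with purely illustrative numbers:

```python
car_length = 4.0     # metres (typical car, illustrative)
loop_period = 1e-5   # seconds per polling iteration (illustrative; the real
                     # value depends on the CPU, GPIO access time, and what
                     # else Linux happens to schedule on the core)

# The beam must stay blocked for at least one loop period to be seen,
# so a car would have to exceed this speed to slip through unnoticed:
threshold_speed = car_length / loop_period
print(f"{threshold_speed:.0f} m/s")  # 400000 m/s -- far beyond any car
```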

What determines this speed?

TGH101
  • On a Pi, the biggest impact is Linux multitasking, which might interrupt your program at critical moments. Other than that, there is the speed of the CPU, memory, GPIO pins, and of course the analog circuit for the light sensor. Google "real time": there are operating systems and also processors dedicated to allowing predictable (and fast) program execution, since this is very important for many control applications - e.g. the brakes and ABS in your car should apply rather quickly even if controlled by software. – Erlkoenig May 11 '18 at 05:28
  • Ultimately, the speed in your example is limited by the sensor polling frequency. – Iłya Bursov May 11 '18 at 05:30
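As Erlkoenig's comment suggests, one way to reduce the chance of a preempted polling loop missing a short pulse is to let the kernel latch the edge instead. A sketch using RPi.GPIO's edge detection (pin number and falling-edge polarity are the same assumptions as in the polling sketch above):

```python
import time
import RPi.GPIO as GPIO

SENSOR_PIN = 17  # hypothetical BCM pin, as before

GPIO.setmode(GPIO.BCM)
GPIO.setup(SENSOR_PIN, GPIO.IN)

carCount = 0

def on_beam_broken(channel):
    global carCount
    carCount += 1

# The kernel records the falling edge even if this Python process is not
# scheduled at that instant; the callback then runs on a helper thread.
# bouncetime ignores repeat edges within 200 ms (one car, one count).
GPIO.add_event_detect(SENSOR_PIN, GPIO.FALLING,
                      callback=on_beam_broken, bouncetime=200)

try:
    while True:
        time.sleep(1)  # main thread is free to do other work
finally:
    GPIO.cleanup()
```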

0 Answers