
We are using LabVIEW Real-Time with the PXI-8110 controller, and I am facing the following problem: I have a loop with a 500 µs period (a Timed Loop) and no other task. On each iteration I write the current time into RAM, and I save the data afterwards. It is necessary that the period be exact, but I see 500 µs ± 25 µs. The clock for the Timed Loop is 1 MHz.

How is it possible to get 500 µs − 25 µs? I would understand getting 500 µs + xx µs if my computation were too heavy, but so far I only do an addition, nothing more.

So does anyone have a clue what is going wrong? I thought a resolution of 1 µs would be possible, as NI advertises (when the computation is light).

Thanks.

steffenmauch

1 Answer


You may need to check which thread the code is running in. An easier approach is to use the Timed Loop, as it will try to correct for overruns. Also pre-allocate the array you are storing the data into, and then use Replace Array Subset to write each new value; you should see a massive improvement this way. If you display that value while running in development mode, you will see extra jitter, because everything is being reported back to the host. Build an executable and the jitter will shrink again.

MikeB