
I have CODESYS Development SW version "3.5 SP15 patch 1" running on a Windows PC and the "CODESYS Control for Linux SL" soft PLC, version 3.5.15.10, running on Ubuntu 16.04.6 LTS in demo mode. For my learning task I need to get the current RTC time with nanosecond resolution, similar to what the following C code stores in 'timestamp_now':

#include <stdint.h>
#include <time.h>

struct timespec time_now;
clock_gettime(CLOCK_REALTIME, &time_now);
/* Note: 1e9 is a floating-point constant and is not valid inside the
   UINT64_C macro; the integer literal must be used instead. */
const uint64_t timestamp_now =
    UINT64_C(1000000000) * time_now.tv_sec + time_now.tv_nsec;

I checked the standard libraries included with CODESYS and could not find what I need. I found only: 1) SysTimeRtcHighResGet: the current RTC time with millisecond resolution, which is not enough for my task; 2) SysTimeGetNs: looks like uptime with nanosecond resolution, so I cannot use this value directly.

Is there any free library (ideally one usable in demo mode) that can provide the current RTC time with nanosecond resolution on my soft PLC?

  • I do not think it is possible. A PLC is not designed for tasks that are measured in nanoseconds. – Sergey Romanov Nov 20 '19 at 05:35
  • Thank you very much! In the "Device Reader" for "CODESYS Control for Linux SL" running in demo mode I can see "C-Integration": "NO". Does this mean it is NOT possible to create a working module (shared or static library) written in C and make it work with my runtime/soft PLC, so that its API can be invoked from my PLC code? – Alexander Chumakov Nov 29 '19 at 08:22
  • You can create a C script, but the PLC cycle time will not change. The PLC would have to support fast inputs; then you could make calculations independent of the main process. But even then we are talking about microseconds at best, not nanoseconds. – Sergey Romanov Dec 02 '19 at 10:32
  • It is not even physically possible. Imagine an 800 MHz CPU: it runs 800,000,000 cycles per second, but there are 1,000,000,000 nanoseconds in a second, so the CPU cannot complete even one cycle per nanosecond. How could it measure in nanoseconds? You also have to take into account that the routine you write will take about 1 ms to run, which is only 1,000 executions per second. Even if you wrote a routine that took just 1 µs, it could run at most 1,000,000 times a second, so its timestamps would still be no closer than 1,000 nanoseconds apart. – Sergey Romanov Dec 02 '19 at 10:32
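The arithmetic behind the last comment can be written out as two one-line helpers (purely illustrative, not CODESYS code): at 800 MHz one cycle lasts 1e9 / 800e6 = 1.25 ns, and a routine invoked 1,000,000 times per second can stamp events no closer than 1e9 / 1e6 = 1,000 ns apart.

```c
/* Duration of one CPU cycle, in nanoseconds, at a given clock rate.
 * At 800e6 Hz this is 1.25 ns: more than one nanosecond per cycle. */
static double cycle_ns(double freq_hz) {
    return 1e9 / freq_hz;
}

/* Best-case timestamp granularity, in nanoseconds, for a routine that
 * can be invoked 'calls_per_sec' times per second. A 1 us routine
 * (1,000,000 calls/s) yields at best 1,000 ns granularity. */
static double resolution_ns(double calls_per_sec) {
    return 1e9 / calls_per_sec;
}
```

This is the core of the objection: even ignoring PLC cycle times entirely, the sampling rate of any software clock read bounds the usable resolution from below.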

0 Answers