
I'm measuring Node.js event loop latency using process.hrtime(), and I compute percentiles over the measured intervals. Here is my simple benchmark example.
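A minimal sketch of this kind of measurement follows (the 1 ms timer interval and the 1000-sample window are illustrative values, not necessarily the exact ones from my run):

```javascript
// Schedule a repeating timer, measure how much later than requested it
// actually fires using process.hrtime(), and report the 90th percentile
// of that latency over each window of samples.

const INTERVAL_MS = 1;    // requested timer interval (illustrative)
const WINDOW_SIZE = 1000; // samples per percentile window (illustrative)

let samples = [];
let last = process.hrtime();

setInterval(() => {
  const [sec, nsec] = process.hrtime(last);
  last = process.hrtime();

  // Latency = actual elapsed time minus the requested interval, in µs.
  const elapsedUs = sec * 1e6 + nsec / 1e3;
  samples.push(elapsedUs - INTERVAL_MS * 1000);

  if (samples.length >= WINDOW_SIZE) {
    samples.sort((a, b) => a - b);
    const p90 = samples[Math.floor(samples.length * 0.9)];
    console.log(`p90 latency: ${p90.toFixed(1)} µs`);
    samples = [];
  }
}, INTERVAL_MS);
```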

I ran this test on a PC with an Intel Core i7-4770 CPU.

This is how the graph looks when the script runs on Windows 7 (the same type of graph appears on OS X):

[Graph omitted. X axis (horizontal): time. Y axis (vertical): 90th-percentile latency in µs (microseconds).]

And this is how the graph looks when the script runs on CentOS 7/6:

[Graph omitted. X axis (horizontal): time. Y axis (vertical): 90th-percentile latency in µs (microseconds). The wavelength of the periodic pattern is about 2 minutes.]

As you can see, latency on Linux is generally higher than on Windows. But the real problem is the periodic peaks, during which both the average latency and the number of high-latency samples increase.

I'm not sure whether this is an OS, V8, or libuv issue. Is there a good explanation for this behavior? Or how can it be fixed?
