We use JMeter for load testing our application. If I understand correctly, JMeter uses simple logic to measure how long a request takes: it records the timestamp when the request starts and when it ends, and takes the difference.
So if the processor on the load-generator machine is very busy, the measured times increase, even though the server under test runs on a different machine.
To test this, I wrote a program that spawns a lot of threads, each doing some calculations. Then I ran the JMeter tests twice: once on an idle system, and once with my program running.
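The load program was roughly like this (a simplified sketch, not my exact code; the thread count and the arithmetic are arbitrary, just enough to keep every core busy):

    // CpuBurner.java - spin up many threads doing pointless math
    // to saturate the CPU on the load-generator machine.
    public class CpuBurner {
        public static void main(String[] args) {
            // Arbitrary: oversubscribe the cores by 4x.
            int threads = Runtime.getRuntime().availableProcessors() * 4;
            for (int i = 0; i < threads; i++) {
                new Thread(() -> {
                    double x = 0;
                    while (true) {
                        // Meaningless work that the JIT cannot easily remove.
                        x += Math.sqrt(x + 1);
                    }
                }).start();
            }
        }
    }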
The response times shown in the Aggregate Graph differ significantly between the two runs.
Is there a solution for this? Or should we rewrite our tests in another tool (like Gatling)?