
I have a web application that polls the server every second for data to update its display. I see a gradual (overnight) increase in the browser's CPU usage from 6% to 30%, with no app interaction or change in behavior.

The problem is easily reproduced with the code below running in Chrome, where I reduced the polling interval to 100 ms to make the effect more noticeable:

<html>
<body>
<script>
var i = 0;
var xhr = new XMLHttpRequest();
xhr.onload = function() {
    console.log("response", i++);
    // Schedule the next request only after this response arrives,
    // so requests never overlap.
    setTimeout(send, 100);
};
function send() {
    xhr.open("GET", "/", true);
    xhr.send();
}
send();
</script>
</body>
</html>

This code can be served by any web server, for example:

python -m SimpleHTTPServer 8888
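
(On Python 3, the equivalent built-in server is python -m http.server 8888.)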

With this example, CPU usage increases very quickly for no apparent reason. I do no processing, and I use setTimeout rather than setInterval, so I never have overlapping requests.
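
For comparison, here is a sketch of why setInterval could overlap (illustrative only, not part of the original repro):

// setInterval fires on a fixed clock: if a response takes longer
// than 100 ms, the next request starts before the previous one
// finishes, so requests can pile up and overlap.
setInterval(send, 100);

// The chained setTimeout above instead schedules each request from
// the onload handler, so at most one request is ever in flight.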

I'm testing with Chrome (and Safari) and see a very fast increase in CPU usage in both. Any ideas why?

Avner
  • I see no increase in CPU usage unless I open dev-tools: http://jsfiddle.net/YuJQ6/ – c69 Aug 18 '13 at 16:02
  • c69, you are right! Although my original problem was on an embedded system using WebKit through Qt, so there are no dev tools there. I'll try the code above on that system and see. – Avner Aug 18 '13 at 19:39
  • Ah, in that case the embedded WebKit might just leak. Remember: all those crappy embedded WebKits are **not** Chrome, by far. We had a lot of trouble with Awesomium (for .NET), and eventually switched to a normal browser (in our case that was acceptable). – c69 Aug 19 '13 at 21:23

1 Answer


Because you are filling the console output with a new line every 100 milliseconds ;)
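
One way to test this (a sketch added here, not from the original answer) is to replace the console.log with a DOM update, so the console buffer never grows; if the console is the cause, CPU usage should stay flat:

<html>
<body>
<div id="count">0</div>
<script>
var i = 0;
var xhr = new XMLHttpRequest();
xhr.onload = function() {
    // Write the counter into the page instead of the console,
    // so no console history accumulates over time.
    document.getElementById("count").textContent = ++i;
    setTimeout(send, 100);
};
function send() {
    xhr.open("GET", "/", true);
    xhr.send();
}
send();
</script>
</body>
</html>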

c69
  • The console output is there to account for some CPU activity that would normally take place when updating the UI. Anyway, it's there mainly to speed up an effect that happens regardless. It does not account for the **increase** of CPU usage over time. – Avner Aug 18 '13 at 17:47