
I've built a web crawler, and it always gets stuck after a while: it simply stops making progress, even though the script doesn't crash or throw an error.

var async = require('async');
var request = require('request');

async.eachSeries(links, function (item, callback) {
  request({
    url: item.uri,
    encoding: null,
    headers: {
      'Cookie': 'SESSIONID=7ik26oegkbhbtsmsdrjvfaerk4'
    }
  }, function (e, r, b) {
    // stops appearing after roughly the 1000th iteration
    console.log("won't display this approx. after 1000th iteration");
    callback();
  });
});

Node version is 0.10.32

What could be causing this?

Sasha Davydenko
  • Are your links valid? Maybe some are not and are responding very slowly. Could you try with the same known-good link over and over? You could also try adding a shorter timeout to your request options. – chriskelly Sep 24 '15 at 21:18
  • They are valid; after a restart it runs smoothly. – Sasha Davydenko Sep 24 '15 at 21:22
  • Did you try setting `pool: false` in your request options? – mscdex Sep 24 '15 at 21:25
  • Yes, I did set this via `require('request').defaults()`, but the freezes continued to happen. Setting `timeout` did help, though: now the script throws an error instead of just freezing. – Sasha Davydenko Oct 02 '15 at 07:40
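
For reference, here is a minimal sketch combining the two options suggested in the comments (`timeout` and `pool: false`), assuming the same `links` array and session cookie as in the question; the 10-second timeout is an arbitrary example value, not a recommendation:

var async = require('async');
var request = require('request').defaults({
  timeout: 10000, // fail a request that stalls, instead of hanging forever
  pool: false     // opt out of request's shared connection pool
});

async.eachSeries(links, function (item, callback) {
  request({
    url: item.uri,
    encoding: null,
    headers: {
      'Cookie': 'SESSIONID=7ik26oegkbhbtsmsdrjvfaerk4'
    }
  }, function (e, r, b) {
    if (e) {
      // with a timeout set, a stalled request surfaces here as an error
      // (e.g. e.code === 'ETIMEDOUT') rather than hanging silently
      console.log('request failed:', e.code);
    }
    callback(); // always advance to the next link
  });
});

With a per-request timeout, the response callback fires even for a stalled request, which matches the behaviour reported in the last comment: the script errors out instead of freezing.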

0 Answers