
I was testing node-fetch and decided to use it in a project. In this project I repeatedly fetch a resource, up to 600 times over the course of a couple of minutes. However, while testing node-fetch I discovered some odd behavior: when I looped the fetch, each request took increasingly longer to complete. I verified this by logging a timestamp (see them here) next to the iteration number i. This is my test code:

const fetch = require('node-fetch');

for (let i = 0; i < 600; i++) {
    const start_time = new Date().getTime();
    fetch('http://localhost:3000').then(a => {
        const time = (new Date().getTime() - start_time) + 'ms';
        console.log(time + ' - ' + i);
    });
}

Is there a way to work around this? I think it has something to do with threading; is there a work-around using the http/https modules?

Also, is there a reason why the i iteration numbers are out of order?

Edit:

I realized that the fetch was being drastically rate-limited by Postman, which was causing the intense wait. However, when I hosted a simple localhost app and ramped the iterations up to 60000, it still took much, much longer than expected, and it dropped off at about 30000 iterations. Is there a reason for this? Here are the end logs:

15059ms - 15136
15059ms - 15137
15060ms - 15138
15060ms - 15140
15060ms - 15139
15061ms - 15142
15061ms - 15141
15061ms - 15143
15061ms - 15144
15062ms - 15145
15062ms - 15147
15062ms - 15146
15063ms - 15148
15063ms - 15149
15063ms - 15150
15064ms - 15152
15064ms - 15151
15064ms - 15153
15066ms - 15155
15066ms - 15154
15067ms - 15156
15067ms - 15157
15067ms - 15158
15067ms - 15159
15070ms - 15160
15073ms - 15161
15074ms - 15162
15074ms - 15163
15074ms - 15164
15074ms - 15165
15075ms - 15166
15075ms - 15167
15076ms - 15168
15076ms - 15169
15076ms - 15170
15077ms - 15171
15077ms - 15172
15077ms - 15173
15077ms - 15174
15078ms - 15175
15078ms - 15176
15078ms - 15177
15081ms - 15178
15081ms - 15179
15082ms - 15180
15082ms - 15181
15082ms - 15182
15083ms - 15183
15083ms - 15184
15083ms - 15185
15085ms - 15186
15086ms - 15187
15086ms - 15188
15086ms - 15189
15087ms - 34758

Is there a reason why: 1) the number of iterations was drastically under what was expected, and 2) the times in ms are drastically higher than expected? Here's the code for the edited test:

for (let i = 0; i < 60000; i++) {
    const start_time = new Date().getTime();
    get('http://localhost:3000').text().then(a => {
        const time = (new Date().getTime() - start_time) + 'ms';
        console.log(time + ' - ' + i);
    });
}

Thanks!

imaginate
  • I would think this is rate limiting on the server side – Dominik Dec 28 '20 at 22:06
  • @Dominik ah, thanks so much! I will test this locally. Thanks – imaginate Dec 28 '20 at 22:07
  • @Dominik can you answer this with your comment? I just tested locally and confirmed that what you said was true; my API was limiting the calls (I honestly don't know why I set that to true). Thanks so much, this cleared up a big mystery – imaginate Dec 28 '20 at 22:08

2 Answers


As mentioned in the comments this is likely due to rate limiting on the server side. Running the same test locally will likely yield a different result.

Edit: Your localhost server will fill up with requests after a while and will have to work off the backlog, which is why it gets slower and slower over time. One rule of thumb to help you here: HTTP requests are expensive and are the first bottleneck you should avoid.
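
If you do need to hammer your own endpoint in a test, here is a minimal sketch of one way to keep the pressure bounded: send the requests in fixed-size batches rather than all at once. It assumes node-fetch, and the URL, total count, and batch size of 50 are arbitrary example values:

const fetch = require('node-fetch');

// Sketch: only `batchSize` requests are in flight at any one time.
async function run(total, batchSize) {
    for (let start = 0; start < total; start += batchSize) {
        const batch = [];
        for (let i = start; i < Math.min(start + batchSize, total); i++) {
            const start_time = Date.now();
            batch.push(
                fetch('http://localhost:3000').then(() => {
                    console.log((Date.now() - start_time) + 'ms - ' + i);
                })
            );
        }
        // Wait for the whole batch before starting the next one.
        await Promise.all(batch);
    }
}

run(600, 50);

Libraries such as p-limit do the same thing with a sliding concurrency window instead of discrete batches.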

Another way to deal with this is to add a load balancer in front of your server so you can scale out to multiple machines.

In the end you need to decide between a different way of getting that data from the server (maybe a socket connection that only sends data when something has changed on the server side) and scaling your servers.
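
As a rough illustration of that push-based option, here is a sketch using the ws package; the package choice, the port 3001, the message shape, and the 5-second trigger are all placeholders, not something from the original post:

// server.js (sketch) – push data to clients only when something changes
const { WebSocketServer } = require('ws');
const wss = new WebSocketServer({ port: 3001 });

wss.on('connection', (socket) => {
    // Placeholder trigger: in a real app this would be your data source
    const timer = setInterval(() => {
        socket.send(JSON.stringify({ updatedAt: Date.now() }));
    }, 5000);
    socket.on('close', () => clearInterval(timer));
});

// client.js (sketch) – one long-lived connection instead of thousands of fetches
const WebSocket = require('ws');
const client = new WebSocket('ws://localhost:3001');
client.on('message', (data) => console.log('update:', data.toString()));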

Dominik

This is a lot of questions in one, but...

Why is this happening?

The code looks fine; most likely the website you are hammering has rate limiting (you can only hit it x times per minute before it intentionally starts slowing down your requests, to prevent DoS attacks).

Is there a work around?

If your slowdown is due to rate limiting on the website (try a different one with no rate limit; maybe something like google.com would work), then no.

Is there a reason why the iteration numbers (i) are out of order?

Yes. You are sending all of the requests out asynchronously. The order in which your output is logged depends on when the responses are received. If they are received out of order, your i values will be logged out of order.
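
If you want the log lines in iteration order, one option (a sketch using async/await with node-fetch) is to wait for each response before sending the next request, at the cost of losing the parallelism:

const fetch = require('node-fetch');

(async () => {
    for (let i = 0; i < 600; i++) {
        const start_time = Date.now();
        // The next request starts only after this one resolves.
        await fetch('http://localhost:3000');
        console.log((Date.now() - start_time) + 'ms - ' + i);
    }
})();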

Answer to edited question:

It's hard to know without knowing more about your local setup. A number of things could be causing the slowdown.

Possible bottlenecks:

  • Slow server response. Maybe the server (the endpoint of your HTTP request) is taking a long time to reply. You can determine the server response time by consulting the server logs, or try a web service that is not rate limited. A local server like nginx serving static files should perform quite well.
  • OS slowdowns. There is a limit to the maximum number of simultaneous TCP connections that an operating system can handle. As you are opening all of these connections more or less in parallel, your 60k requests are nearing the maximum of 65535 open connections. If you're running a local server on the same machine, then you're also using up a socket to respond to each TCP connection - you'd have half the number of available sockets of the theoretical maximum.
  • Hitting the Node libuv thread pool limit. Asynchronous tasks that perform I/O in Node are run from a thread pool. The default size of that thread pool is 4, meaning only 4 I/O tasks can run at once. Try increasing this limit by setting the UV_THREADPOOL_SIZE environment variable to a much larger number (say, 100 or 400) and see if you get a proportional increase in the number of requests that can be sent with no slowdown, or a reduction in the amount of slowdown; see the sketch after this list.
  • Memory garbage collection. Your code doesn't contain many variables, but the library you're using (get) might. If that is the case, then changing the library or using the Node built-ins for HTTP fetching will cause a noticeable change in the slowdown.
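
For the thread-pool point above, a sketch of how the size might be raised; the value 128 is an arbitrary example, and since the variable must be set before the pool is first used, launching Node with it already in the environment (for example UV_THREADPOOL_SIZE=128 node test.js) is the most reliable route:

// Sketch: raise libuv's thread pool size (default 4). This has to run before
// any work touches the pool, so it goes at the very top of the entry script.
process.env.UV_THREADPOOL_SIZE = '128'; // 128 is an arbitrary example value

// ...the fetch loop from the question would follow below this line.
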
Codebling