
My node application is making requests to two servers, A and B. To server A, it waits for one request to finish before making the next one. To server B, it makes 20 requests a second without waiting. While I'm making the requests to server B, the requests to server A take a very long time; when I'm not making the requests to server B, they go quickly. The requests to server B pile up, but there are never more than a few hundred in flight simultaneously.

I've run the exact same application, with the same node version, on a Joyent smartos instance and I don't have this problem, so I assume it's an issue with the limits the operating system sets, and not with the limits that node sets. In node I do have maxSockets set to 10000 as explained here: http://markdawson.tumblr.com/post/17525116003/node

I'm running my application with upstart, though I don't know if I'd have the problem without it (that would be my next test). In my upstart config file I have limit nofile 90000 90000. There are some other limits I can raise, as documented here: http://upstart.ubuntu.com/wiki/Stanzas#limit, but I don't know what they do. Could one of these be causing the problem? Where else might my Ubuntu machine's limits be set?
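For reference, the relevant part of my upstart job file looks roughly like this (the job name and paths are placeholders):

```
# /etc/init/myapp.conf
description "node application"

# raise the open-file limit (soft limit, hard limit)
limit nofile 90000 90000

exec /usr/bin/node /opt/myapp/app.js
```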

I should add that I'm launching the upstart program via Monit in case that's relevant.

user1585789
  • Perhaps you could try asking this question on Server Fault? You may reach a more specific group of peers who could help you with this issue. I've never had a problem similar to this; I've used Ubuntu, Debian, and CentOS to host node applications without any issues. Does your application use a lot of memory? Are you certain the issue you're illustrating is actually the root cause? What exactly are you doing? The problem could be something unrelated to everything you just described. I doubt the OS has anything to do with it. If you're using lots of memory, keep in mind node has its limitations. – tsturzl Nov 18 '14 at 02:28
  • Will do. Thanks for the suggestion. And I'm using a gig and a half of memory on the 64 bit version which isn't a problem these days. And I don't experience any slowness other than with the requests. – user1585789 Nov 18 '14 at 08:25
  • The problem isn't "slowness": node.js (or V8, rather) has a limit of about 1.5GB for memory usage, which it seems you're creeping up on. Are you running node.js as a single process, or are you forking it to every processor on the machine and load balancing? As for the memory limitations, I'm not quite sure how they work, nor am I sure if they are static between each user, and it could also depend on which version of V8 you're using with node.js. Check this out: https://code.google.com/p/v8/issues/detail?id=847 – tsturzl Nov 18 '14 at 18:08
  • The issue is slowness. My requests to the server are taking longer to complete even though everything is running fine. Also, the V8 version packed into Node 0.10 has higher limits than you speak of; certainly the 64-bit version does. I've successfully used over 8gb in it, and the limit can be raised quite high in the 64-bit version with the max_old_space_size argument. There are other crazy things that happen, though, if you use too much of a particular type of memory. For example, I had an issue where creating a really large single object caused an out-of-memory error. – user1585789 Nov 18 '14 at 18:36
  • And remember, the request slowness doesn't happen on the exact same node version on a Joyent smartos instance. – user1585789 Nov 18 '14 at 18:36
  • Are there any other external services that could be the bottleneck, like a DB or message broker? Can you provide any source code? Are you installing node.js from a package manager on either host, or are you installing them from the same source on each? I've had problems in the past using the latest stable kernel; which kernel version are you using? ~3.17.x had given me connection problems in the past; though this is likely not your problem, it's worth noting. – tsturzl Nov 19 '14 at 21:48
  • This could also be a network configuration issue, are you using UDP for any of this? – Lumi Nov 22 '14 at 17:19

1 Answer


You don't mention how you are talking to ServerA or ServerB, but Node's HTTP library has a default limit of six connections per host (protocol/server/port) combination. You can increase this with http.globalAgent.maxSockets = 20; or whatever you would like the maximum to be.

Other issues could be related to open file/socket limits in your OS. Beyond the per-process nofile limit you've set in upstart, you want to look at the system-wide /proc/sys/fs/file-max as well.

From recent linux/Documentation/sysctl/fs.txt:

file-max & file-nr:

The kernel allocates file handles dynamically, but as yet it doesn't free them again.

The value in file-max denotes the maximum number of file-handles that the Linux kernel will allocate. When you get lots of error messages about running out of file handles, you might want to increase this limit.

Historically, the three values in file-nr denoted the number of allocated file handles, the number of allocated but unused file handles, and the maximum number of file handles. Linux 2.6 always reports 0 as the number of free file handles -- this is not an error, it just means that the number of allocated file handles exactly matches the number of used file handles.

Attempts to allocate more file descriptors than file-max are reported with printk, look for "VFS: file-max limit reached".


Specific to Ubuntu: if you have a lot of ufw (firewall) and/or iptables rules in place, this can affect things too.

Tracker1