
I have an issue where I am running out of ports when using RSH to start a script remotely.

I have a script that I need to run that has been pushed out to every server, and I have a list of servers (hostfilelist).

Basically, I have a simple loop that will run them in parallel:

for host in `cat hostfilelist`; do
    rsh $host ksh script.ksh &
done

The problem is that there are about 2,000 servers and I am hitting a limit of 512 (I assume the port range for rsh is 512-1023, based on documents I have read).

How can I get around this?

nitrobass24

1 Answer


With your code you would not only run into the "secure" port limitation of rsh; you might also hit a file descriptor limit (check with ulimit -n), since each network connection consumes a file descriptor as well.
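For example, a quick way to check before launching (a minimal sketch; the /proc path is Linux-specific and may differ on other Unixes):

# Show the per-process open file descriptor limit
ulimit -n

# Count the descriptors this shell currently has open (Linux-specific)
ls /proc/$$/fd | wc -l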

Your code goes through hostfilelist and, for each host, runs an rsh command that the ampersand puts in the background on the source server. Each of these connections stays open until the script on the remote host finishes.

You are much better off putting the execution of the script in the background on each remote host, so that the rsh command returns immediately after starting the remote job and thus frees up the network connection (and port) again. To do so, rewrite the second line of your code as

rsh $host "ksh script.ksh &"
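Note that even with the remote ampersand, rsh typically does not return until the remote command's stdout and stderr are closed, so you may also need to redirect them to fully detach (a sketch; -n redirects rsh's own stdin from /dev/null):

rsh -n $host "nohup ksh script.ksh >/dev/null 2>&1 &"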

However, you may still run into port-reuse issues (look for sockets in the TIME_WAIT state in netstat output) if things happen too fast.
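If you do, one simple workaround is to pace the launches, for example (a sketch; the batch size of 100 and the 5-second pause are arbitrary values to tune):

n=0
for host in `cat hostfilelist`; do
    rsh -n $host "nohup ksh script.ksh >/dev/null 2>&1 &"
    n=$((n+1))
    # Pause periodically so sockets in TIME_WAIT can drain
    if [ $((n % 100)) -eq 0 ]; then
        sleep 5
    fi
done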

And I'd strongly recommend letting go of rsh and using ssh instead.
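The equivalent ssh invocation would look like this (a sketch, assuming key-based authentication is already set up on the hosts; BatchMode=yes prevents password prompts from hanging the loop):

ssh -n -o BatchMode=yes $host "nohup ksh script.ksh >/dev/null 2>&1 &"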

kkeller
  • This script I am running is a discovery script to identify all other scripts/jobs in the Unix environment that rely on R protocols, so we can upgrade them to use SSH, then turn off the R protocols. Ideally I would use ssh, but all the backend infrastructure is not quite in place yet for this environment. – nitrobass24 Oct 09 '12 at 16:38