
My scripts use wget to retrieve data from the Internet. When many users run this script I get a very high load (about 20.00) because of disk I/O. wget is started automatically every hour by cron. I would like to limit each customer to one wget at a time. How can I do this?

I'm using CentOS 5.7 64-bit.

Spacedust
  • Does the solution have to be secured against malicious local users? Do you expect that they would try to circumvent this limit? – kupson Feb 13 '12 at 15:01
  • No. I just see a lot of wgets running and I have made a script which kills them if load is higher than 19, but that's not good for me. – Spacedust Feb 13 '12 at 19:30

3 Answers


You could add a file lock to your script, for example with the flock command (package util-linux on Debian):

#!/bin/sh
(
    # Try the lock without blocking; if it is already taken, say so and then wait for it
    flock --nb 200 || echo "waiting for previous script completion"
    flock 200

    # script commands here
    wget ....

) 200>$HOME/.wgetlock
kupson
  • It's a PHP script, so this might be hard to implement. – Spacedust Feb 13 '12 at 20:46
  • PHP script executed by cron? Then create a small wrapper script as shown above and call the PHP script from it (replace wget ....), or use [flock](http://php.net/flock) directly in PHP. – kupson Feb 13 '12 at 21:34
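
If the cron job is the PHP script itself, a rough sketch of the same idea using PHP's own flock() could look like the following (the lock-file path is only a placeholder; one lock file per customer keeps customers independent of each other):

<?php
// Open (or create) the per-customer lock file; the path is an assumption.
$fp = fopen('/tmp/wget-customer.lock', 'w');

// Try to take the lock without blocking; if that fails, report and then wait.
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    echo "waiting for previous script completion\n";
    flock($fp, LOCK_EX);   // blocks until the earlier run releases the lock
}

// ... run wget / the rest of the script here ...

flock($fp, LOCK_UN);
fclose($fp);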

Check out the options available to you in /etc/security/limits.conf. You can set per-user process and memory limits there.

EEAA
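
For example, a sketch of limits.conf entries that cap one user's process count and memory (the user name and values below are only placeholders, not recommendations):

# /etc/security/limits.conf
# <domain>   <type>   <item>   <value>
# cap "customer1" at 20 simultaneous processes
customer1    hard     nproc    20
# cap "customer1" address space at 512 MB (the value is in KB)
customer1    hard     as       524288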

If it is the same request and content over and over, just cache the result for a few seconds and serve up the same content for different users.

You could use memcache to implement this easily enough.

If the content differs per user, be aware that queueing requests as you describe will cause long delays.

Matthew Ife
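
As a rough illustration of that caching idea, assuming the PHP Memcached extension and a local memcached instance (the URL, cache key, and 60-second TTL below are placeholders):

<?php
// Sketch only: share one fetched copy between users for a short time.
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);      // assumes memcached on localhost:11211

$url = 'http://example.com/data';       // placeholder upstream URL
$key = 'wget:' . md5($url);

$data = $m->get($key);
if ($data === false) {
    // Cache miss: fetch once with wget and reuse the result for 60 seconds.
    $data = shell_exec('wget -q -O - ' . escapeshellarg($url));
    $m->set($key, $data, 60);
}

echo $data;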