I need to set up a cron job in cPanel that calls a URL (on the same server) once a week. I was going to use wget, but it turns out this is disabled on the shared server being used.

Is there an alternative to wget? I've heard that curl can be used, but I don't know how to set that up in a cron command.

Also, what's the command to make the cron job stay silent on completion?

Any ideas greatly appreciated!

MikeyB
davidhyland

3 Answers

Instead of using wget, curl works like this:

curl --silent http://domain.com/cron.php

which will work in the same way as wget. If it's a PHP file you are launching, is there any reason you can't run it via the command-line PHP interpreter, like so:

php -q /path/to/cron.php

This does the same as a webserver request, but it often runs much faster and avoids certain timeout restrictions that apply when the script is called via the webserver/curl.
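For the weekly schedule asked about in the question, either command drops straight into a crontab entry. A minimal sketch, assuming a hypothetical Sunday 03:00 run time and hypothetical paths/URLs:

```shell
# Every Sunday at 03:00 -- pick whichever variant applies.
# Via curl, with output discarded so cron sends no mail:
# 0 3 * * 0  /usr/bin/curl --silent http://domain.com/cron.php >/dev/null 2>&1
# Via the PHP CLI, bypassing the webserver entirely:
# 0 3 * * 0  /usr/bin/php -q /path/to/cron.php >/dev/null 2>&1
```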

anthonysomerset
  • Calling curl that way will print the result to stdout. If you want it to behave equivalently to wget and save the output to a file named after the URL, you also want to add the --remote-name flag. – andol Aug 09 '11 at 16:34
  • Does the --silent flag mean that there's no response or output? – davidhyland Aug 09 '11 at 16:39
  • 1
    Also, calling the php script from the command line might not necessarily yield the same result. In addition to the possibility of different configuration there is also the not uncommon scenario of the script being run as a different user that way. It alls depends on the setup. – andol Aug 09 '11 at 16:43
  • Well, I got it to work using "curl --silent http://domain.com/script.php >/dev/null 2>&1", but it also works fine with "curl -o --url http://domain.com/script.php >/dev/null 2>&1". What's the difference? – davidhyland Aug 09 '11 at 17:14
  • 1
    The first command sends all output to /dev/null, the second will create a file called `-url` in the home directory of the user running the script, which contains the output of your php file, all other output goes to /dev/null. – user9517 Aug 10 '11 at 07:21
  • From the CLI things run much faster by bypassing the web server, but note that in most cases it will run as root rather than your www user, which might cause problems such as files being created with the wrong permissions. – adrianTNT May 18 '17 at 21:22
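The redirection debated in the comments above can be checked in any POSIX shell without touching curl at all. A small self-contained sketch, using a stand-in function in place of curl:

```shell
# Stand-in for curl that writes to both stdout and stderr.
emit() { echo "body"; echo "oops" >&2; }

# ">/dev/null 2>&1": stdout is sent to /dev/null first, then stderr is
# pointed at the same place -- both streams are discarded, so cron has
# nothing left to mail. The outer capture proves nothing leaks out.
captured=$( { emit >/dev/null 2>&1; } 2>&1 )
echo "leaked output: [$captured]"
```

Note that the order matters: writing `2>&1 >/dev/null` instead would duplicate stderr onto the terminal *before* stdout is redirected, so error output would still escape.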

If curl is available you could try something like

1 1 * * 0  /usr/bin/curl --silent http://example.com/some.php >/dev/null 2>&1

That should cause curl to be completely silent so you don't get any email from it on completion.

user9517
  • I would recommend using the option --show-error too, so it is silent during normal operation but will still report an error if one happens. – emerino Mar 12 '15 at 19:45
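Folding that comment into the crontab line gives a variant that stays quiet on success but lets cron mail you on failure. A sketch, using the same hypothetical URL:

```shell
# --silent hides the progress meter; --show-error re-enables error
# messages on stderr. Only stdout is discarded, so a failing curl run
# still produces output for cron to mail.
# 1 1 * * 0  /usr/bin/curl --silent --show-error http://example.com/some.php >/dev/null
```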

I'd suggest adding the "-m" parameter in addition to --silent, as it sets the maximum time allowed for the transfer. Imagine you call the cron job every minute and the script takes two minutes: this can have a bad impact on the server load, among other things.

1 1 * * 0  /usr/bin/curl -m 120 -s http://example.com/some.php >/dev/null 2>&1
Johnny Vietnam