
I am writing a script, run via cron, that fetches some important data from a URL and sends an email if the fetch failed. However, it doesn't seem to recognise whether wget failed or not when I check its return value with if:

wget_output=$(wget --tries=1 --timeout=300 -r -N -nd -np --reject=html https://myurl/ -P ../data)
if [ $? -ne 0 ]; then
    echo "send email here" >> ./test.txt
else
    echo "send email here" >> ./test2.txt
fi

It tries once (as told) and then gives up, but doesn't register whether it succeeded or failed. I presume I'm not handling the exit code correctly. Any ideas? Thanks

timhc22

1 Answer


Which version of wget are you using? The EXIT STATUS section of the man page for version 1.15 says:

In versions of Wget prior to 1.12, Wget's exit status tended to be unhelpful and inconsistent. Recursive downloads would virtually always return 0 (success), regardless of any issues encountered, and non-recursive fetches only returned the status corresponding to the most recently-attempted download.
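So with wget 1.12 or later, recursive fetches do return a meaningful exit status. A minimal sketch of how it could be handled, assuming such a version (the documented codes include 0 for success, 4 for network failure, 8 for a server error response); `classify_status` is a hypothetical helper, and the actual wget invocation is left commented out since it depends on your URL and network:

```shell
#!/bin/sh
# classify_status: hypothetical helper mapping wget's documented
# exit codes (wget >= 1.12) to a short message.
classify_status() {
    case "$1" in
        0) echo "success" ;;
        4) echo "network failure" ;;
        8) echo "server error response" ;;
        *) echo "failed with code $1" ;;
    esac
}

# Real usage would be: run wget, then capture $? immediately,
# before any other command overwrites it:
#   wget --tries=1 --timeout=300 -r -N -nd -np --reject=html \
#        https://myurl/ -P ../data
#   status=$?
#   classify_status "$status"
```

The key point is saving `$?` into a variable right after the command, since even an `echo` in between would replace it.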

jil
  • interesting, although I'm surprised that `test2.txt` is not created in that case (neither file is created) – timhc22 Apr 26 '16 at 20:34
  • As you are running this in a cron job, you should use absolute paths for these files: the working directory might not be what you expect. Also make sure the cron user has write permission on the target directory. It may also be worth checking whether the shell option `errexit` has been set. – jil Apr 26 '16 at 20:48
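Building on that comment, here is a sketch of a cron-safe wrapper. All paths are hypothetical, and the wget call is replaced by a `run_fetch` stand-in (overridable via `FETCH_CMD`) so the success/failure logic can be exercised without network access; cron runs with a minimal environment and its own working directory, so every path is absolute:

```shell
#!/bin/sh
# Sketch of a cron-safe wrapper (hypothetical paths throughout).

LOGDIR=${LOGDIR:-/tmp/fetch-logs}   # hypothetical log location; override as needed
mkdir -p "$LOGDIR"

# The real fetch would be:
#   wget --tries=1 --timeout=300 -r -N -nd -np --reject=html \
#        https://myurl/ -P /home/me/data
# Stand-in so the wrapper logic is testable without the network:
run_fetch() { "${FETCH_CMD:-true}"; }

run_fetch
status=$?   # capture immediately, before any other command overwrites $?

if [ "$status" -ne 0 ]; then
    echo "fetch failed (exit $status)" >> "$LOGDIR/test.txt"
else
    echo "fetch ok" >> "$LOGDIR/test2.txt"
fi

# Matching crontab entry (hourly), with stray output logged:
# 0 * * * * /home/me/bin/fetch-data.sh >> /home/me/logs/cron.log 2>&1
```

Redirecting cron's stdout/stderr to a log file, as in the commented crontab line, also makes failures like permission errors visible instead of silently disappearing.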