
I have a cron job on a server which once a day uses wget to download "Earth Orientation Parameters" and leapsecond data from a NASA server. Specifically:

wget https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp -O .eops
wget https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/ut1ls.dat -O .ut1ls

This works fine. However, when the server is unavailable, wget clobbers my local files (leaving them at zero bytes). Is there any way to tell wget to abort if the server is not available and leave the local file untouched? (The files contain predictions for a couple of months ahead, so missing the update for a few days until the server comes back is not a problem.)

Chris
    Script something a little more resilient in bash using curl, perhaps? – ceejayoz Mar 10 '17 at 04:11
  • It would be nice if whoever down-voted this would comment on why they did so – Chris Mar 10 '17 at 20:47
  • It wasn't me, but I'm on the fence about this being on-topic here. Might be more suited for StackOverflow, and that might be the reason for the downvote. It's not required to explain a downvote in the SE network, and given the number of people around here I wouldn't sweat a single drive-by one. – ceejayoz Mar 10 '17 at 20:50
  • Thanks ceejayoz. I chose ServerFault over StackOverflow because a quick google indicated more wget questions here! – Chris Mar 10 '17 at 20:55

1 Answer


That's the documented behavior of `-O`: it opens (and truncates) the output file before the download even begins, so you shouldn't be using it if this is not the behavior you want.

By default, wget names the downloaded file using the name given by the server (version-dependent), or, if none was given, the basename of the URL. Since you want a different local name anyway, you can take advantage of this.

For example, you can download the file under its remote name and then move it over the existing file only if the download succeeded.

wget https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp && \
mv usno_finals.erp .eops

If wget times out (or otherwise fails), no usno_finals.erp is created, wget returns a non-zero exit code, and mv is never called.

When someone at Goddard gets their head out of their ... whatever ... and fixes their server, you'll be able to see that the file gets created as expected.
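For reference, wget reports failure through its exit status (0 on success; non-zero otherwise, e.g. 4 for a network failure), and the `&&` in the command above runs `mv` only when that status is 0. A minimal illustration of the short-circuit:

```shell
#!/bin/sh
# The second command in an && chain runs only if the first exits 0.
true  && echo "runs"          # first command succeeds, so this prints
false && echo "never runs"    # first command fails, so echo is skipped
echo "status after failed chain: $?"   # prints 1, from the failed chain
```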

Michael Hampton
  • Thanks Michael, I will test later and get back to you. Apparently it's a firewall problem; the servers themselves are OK. It won't be fixed soon, as it's their night time. – Chris Mar 10 '17 at 08:13
  • This is confirmed to work. Using: `\rm -f usno_finals.erp; wget http://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp && \mv -f usno_finals.erp .eops` The explicit `rm` is needed because otherwise, if the file already exists, wget downloads to `usno_finals.erp.1`, and the "wrong" (stale) file would be renamed to .eops – Chris Mar 10 '17 at 20:48
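Putting the thread's pieces together, here is one way the whole cron job could be wrapped. The `fetch_to` helper name and the temp-file approach are my own sketch, not from the thread: downloading into a `mktemp` file sidesteps both the clobbering problem and the `.1` rename, because a failed wget only ever touches the scratch copy.

```shell
#!/bin/sh
# Sketch (assumed helper, not from the thread): download to a temp file,
# and replace the live file only if wget reported success.
fetch_to() {
    url=$1; target=$2
    tmp=$(mktemp) || return 1
    if wget -q -O "$tmp" "$url"; then
        mv "$tmp" "$target"          # replace the live file on success
    else
        rm -f "$tmp"                 # failure: live file is left untouched
        return 1
    fi
}

# Cron usage (the URLs from the question):
# fetch_to https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp .eops
# fetch_to https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/ut1ls.dat .ut1ls
```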