
Sorry if this seems like a very simple question, but I have been searching for hours without finding a solution and have exhausted my very limited knowledge of Linux. :-)

I need to request the following URL from a CentOS Linux command line.

http://www.example.com/?sm_command=build&sm_key=kdfs7kj6dgo3sigj34df1

I have attempted to use wget and curl with various options, and with single quotes and double quotes around the URL. Every attempt pulled the home page and dropped everything from the question mark onward. The URL runs successfully in a browser and simply displays "DONE."

Ultimately, this will end up in a cron job that runs every so often.

Please advise me on the proper syntax to accomplish this.

Thanks so much in advance for the assistance.

Scott
  • Scott, a word of warning, though: if this (build?) server of yours is open to the 'Net, please consider switching to HTTPS as soon as possible to close the vulnerability - the request over HTTP may be sniffed at a dozen likely and unlikely places, putting your infrastructure at risk. – Deer Hunter Apr 07 '13 at 09:50

1 Answer


It should work with single quotes. If not, escape the & with \&

The exact behavior depends on the shell you are using, so YMMV.
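Not from the answer itself, but a quick way to see what the quoting changes (using echo as a stand-in for wget): an unquoted & is the shell's background operator, so everything after it never reaches the command, while single quotes pass the query string through intact.

```shell
#!/bin/sh
# Single quotes stop the shell from treating '&' as the background
# operator, so the full query string is handed to the command:
url='http://www.example.com/?sm_command=build&sm_key=kdfs7kj6dgo3sigj34df1'
echo "$url"
```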

Without quotes, bad request:

david@atl:~$ wget http://www.example.com/?sm_command=build&sm_key=kdfs7kj6dgo3sigj34df1
2013-04-07 01:54:10 (68.1 MB/s) - `index.html?sm_command=build' saved [1111]

With single quotes, successful request:

david@atl:~$ wget 'http://www.example.com/?sm_command=build&sm_key=kdfs7kj6dgo3sigj34df1'
2013-04-07 01:54:24 (99.7 MB/s) - `index.html?sm_command=build&sm_key=kdfs7kj6dgo3sigj34df1' saved [1111]

Without quotes, escaped ampersand, successful request:

david@atl:~$ wget http://www.example.com/?sm_command=build\&sm_key=kdfs7kj6dgo3sigj34df1
2013-04-07 01:57:31 (102 MB/s) - `index.html?sm_command=build&sm_key=kdfs7kj6dgo3sigj34df1.2' saved [1111]
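Since the question mentions running this from cron: the same single-quoting works inside a crontab entry. A sketch only, where the every-15-minutes schedule and the -q / -O /dev/null flags (to suppress output and discard the saved "DONE" page) are my assumptions, not part of the original answer:

```shell
# crontab entry (minute hour day-of-month month day-of-week command):
# fetch the URL quietly every 15 minutes and discard the response body
*/15 * * * * wget -q -O /dev/null 'http://www.example.com/?sm_command=build&sm_key=kdfs7kj6dgo3sigj34df1'
```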
David Houde
  • Thanks for the quick response. I am getting no joy on either. Both are pulling 36K of data. Any other ideas? – Scott Apr 07 '13 at 06:16
  • Sorry, I fat fingered the URL. The one that worked was wget single quotes and the ampersand not escaped with a \ – Scott Apr 07 '13 at 06:21