
I'm using Wget to make HTTP requests to a fresh web server. I'm doing this to warm the MySQL cache. I don't want to save the files after they are served.

wget -nv -do-not-save-file $url

Can I do something like -do-not-save-file with wget?

T. Brian Jones
  • This is a better question for Super User or Unix.SE. – Matt Ball Mar 13 '12 at 20:19
  • As a developer I want to test a server with wget, and then this question is seriously on-topic. As the best answer got 142 upvotes, this question should be re-enabled as on-topic. – Mladen Adamovic Sep 30 '16 at 17:43
  • Alternatively to -q and -O, use the flags -nd (no directory) and --delete-after. – Suzana Sep 25 '18 at 09:36
  • According to the help doc (wget -h), you can use the --spider option to skip the download (version 1.14). – rocky qi May 09 '19 at 04:34

3 Answers


Use the -q flag for quiet mode, tell wget to write to stdout with -O- (uppercase O), and redirect to /dev/null to discard the output:

wget -qO- $url &> /dev/null

> redirects a command's standard output (to a file). If > is preceded by an ampersand (&>), the shell redirects both outputs (error and normal) to the file on the right of >. Without the ampersand, only normal output is redirected.

./app &>  file # redirect error and standard output to file
./app >   file # redirect standard output to file
./app 2>  file # redirect error output to file

If the file is /dev/null, then everything is discarded.
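As a quick sketch of the three forms, with a { …; } command group standing in for ./app (it writes one line to stdout and one to stderr):

```shell
# A stand-in for ./app that writes one line to each stream.
{ echo normal; echo error >&2; } >  out.log  2>/dev/null   # out.log gets "normal"
{ echo normal; echo error >&2; } 2> err.log  >/dev/null    # err.log gets "error"
{ echo normal; echo error >&2; } >  both.log 2>&1          # both.log gets both lines
```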

This works as well, and is simpler:

wget -O/dev/null -q $url
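For warming a cache across many URLs, the one-liner extends naturally to a loop. A minimal sketch, assuming a file with one URL per line (the filename urls.txt and the function name warm_cache are inventions of this example):

```shell
#!/bin/sh
# Fetch every URL listed in a file and discard the response bodies.
# -q silences wget's log output; -O /dev/null sends each body to /dev/null.
warm_cache() {
  while IFS= read -r url; do
    wget -q -O /dev/null "$url" || echo "failed: $url" >&2
  done < "$1"
}

# usage: warm_cache urls.txt
```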
perreal
  • what do the ampersand and greater than do in this command? can you just pipe the output to /dev/null ... wget -qO- | /dev/null ... ? – T. Brian Jones Mar 13 '12 at 20:22
  • Uh, I tried this and it didn't work for me. From what I found out, the ampersand should come after the greater than symbol. – Tiago Espinha Jul 12 '13 at 10:32
  • For Windows users: wget -q -O NUL http://... Turns off logging and routes the download to NUL (same as /dev/null) – vidario Nov 07 '13 at 11:49
  • wget --spider $url will do it. – thebugfinder May 26 '14 at 00:48
  • Try this to simply throw the file away without going through stdin: `wget -P /dev/null` – Marcus Downing Dec 10 '14 at 17:49
  • wget -O/dev/null $url – Storm Sep 16 '15 at 08:58
  • Googlers! If you are keen on seeing header and redirect stuff (i.e. to verify your 301's, 403's etc., or well, in the OPs scenario: ensure you get 200's back…), thus not total quiet-ness but you don't want to produce meaningless files, this is for you: `wget http://example.com/foo/testing -O- 2>&1 | grep -E 'Location|\s20|\s30|\s40|\s50'` – Frank N Dec 19 '17 at 14:11
  • You don't really need all that complicated stuff. [`wget`'s man page](https://linux.die.net/man/1/wget) clearly states that using `-O` is the same as redirecting output, so there's no need to go through `stdin` and redirect the output manually, you can simply do `wget -qO /dev/null $url`. – Marco Bonelli Jan 26 '18 at 18:25
  • @MarcoBonelli, but then you still need to handle stderr? – perreal Jan 27 '18 at 10:07
  • @perreal no you don't, `-q` already takes care of that. Nothing will be printed to screen, errors included, so the only thing you need to throw to `/dev/null` is the downloaded file. – Marco Bonelli Jan 27 '18 at 12:26
  • Unfortunately this doesn't work with `-p`, i.e., you can't visit and fetch all page requisites and not save them. – Ayush Goel Jul 29 '22 at 18:01

You can use -O- (uppercase O) to redirect content to stdout (standard output) or to a file (even special files like /dev/null, /dev/stderr, or /dev/stdout):

wget -O- http://yourdomain.com

Or:

wget -O- http://yourdomain.com > /dev/null

Or (same result as the last command):

wget -O/dev/null http://yourdomain.com
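If you also want to confirm the response code while still discarding the body, -S (--server-response) prints the server's headers on wget's log stream (stderr). A small sketch (check_url is a name invented for this example):

```shell
# Print only the HTTP status line(s) while discarding the body.
# -S (--server-response) writes the server's headers to stderr;
# -O /dev/null discards the body. Note: -q would suppress the -S output too.
check_url() {
  wget -O /dev/null -S "$1" 2>&1 | grep 'HTTP/'
}

# usage: check_url http://yourdomain.com
```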
Marco Biscaro

curl does this by default, without any parameters or flags; I would use it for your purposes:

curl $url > /dev/null 2>&1

Based on this comparison, curl is more about streams and wget is more about copying sites.
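A sketch of the same cache-warming idea with curl, printing just the status code so you can confirm you get 200s back (warm_one is a name invented for this example):

```shell
#!/bin/sh
# -s silences the progress meter, -o /dev/null discards the body,
# and -w '%{http_code}\n' prints only the HTTP status code.
warm_one() {
  curl -s -o /dev/null -w '%{http_code}\n' "$1"
}

# usage: warm_one "$url"
```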

Oleg Mikheev
  • Could you explain what 2>&1 means? I get > but why the end? – apscience Sep 07 '13 at 11:01
  • @gladoscc first `>/dev/null` redirects [std out](http://en.wikipedia.org/wiki/Standard_streams#Standard_output_.28stdout.29) to /dev/null (`>/dev/null` is short for `1>/dev/null`), the second redirects [std err](http://en.wikipedia.org/wiki/Standard_streams#Standard_error_.28stderr.29) to std out. – Oleg Mikheev Sep 09 '13 at 22:33
  • @apscience Just to add. The file descriptor for std out is 1. Shells assume std out when you leave off the descriptor from a redirect. So `>foo` is interpreted as `1>foo`. When `&` follows a redirect, it instructs the shell to redirect the former file descriptor to the _same output_ as the latter. In this case, `2>&1` says redirect file descriptor 2 (std err) to the same place as file descriptor 1 (std out). Since std out is already redirected to /dev/null, std err will also be redirected there. You could also write: `1>/dev/null 2>/dev/null` or `2>/dev/null >&2`. – ktbiz Mar 01 '17 at 04:27
  • This answer is correct, but then includes the same output redirection as the chosen answer. Simply running `curl http://www.example.com` is sufficient. – Aaron Cicali Jul 11 '17 at 21:23