Questions tagged [wget]

GNU Wget (or just Wget, formerly Geturl) is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get. It supports downloading via HTTP, HTTPS, and FTP protocols.

Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the Web's boom in popularity, which led to its wide use among Unix users and its inclusion in most major GNU/Linux distributions. Written in portable C, Wget can be easily installed on any Unix-like system.

Source: Wikipedia

Man page

290 questions
4
votes
2 answers

What is the default rate unit in wget?

When I use wget to download a large file, it shows a rate like "56M/s". I want to know whether that means 56 megabits per second or 56 megabytes per second. Thanks! Like this: 231,997,440 61.4M/s in 4.0s
larry
  • 4,037
  • 9
  • 36
  • 42
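The sample numbers in the excerpt settle the question with quick arithmetic: 231,997,440 bytes in roughly 4 seconds is about 55 MiB/s, in the same ballpark as wget's "61.4M/s" only if "M" means mebibytes; read as megabits, the rate would have to display as well over 400. A minimal check:

```shell
# wget's progress line: "231,997,440  61.4M/s  in 4.0s"
bytes=231997440
secs=4
bps=$((bytes / secs))        # 57999360 bytes per second
mibps=$((bps / 1048576))     # same number in MiB/s (integer)
echo "${mibps} MiB/s"        # prints "55 MiB/s" -- the scale wget's "M" uses
```

(The small gap between 55 and 61.4 is rounding: wget's "in 4.0s" is itself a rounded elapsed time.)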
4
votes
2 answers

Retrieve FTP directory tree from the command line

What would be the fastest way to recursively retrieve an entire directory listing from an FTP server using wget/curl/whatever? I don't need to download any files, just directory and file names. Basically what ls -R does.
user187291
  • 141
  • 1
  • 2
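One common approach is wget's recursive spider mode, which walks the FTP tree and logs names without keeping file contents. A sketch (the exact log format, and whether --spider suppresses every transfer, can vary between wget versions; the host is a hypothetical example):

```shell
# -r: recurse; -l 0: unlimited depth; --spider: check/list, don't save;
# -nv: terse log, roughly one line per entry.
wget -r -l 0 --spider -nv ftp://ftp.example.com/pub/ 2>&1 |
  grep -o 'ftp://[^ ]*'     # keep just the URLs, similar in spirit to `ls -R`
```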
3
votes
2 answers

HTTP downloads stop after some time, resuming is not possible

When I try to download a file via HTTP, the download sometimes stops after around 30 MB. The download rate drops to 0 B/s and no more data arrives. When I stop the download and resume it, it still hangs. But when I redownload it…
cdauth
  • 941
  • 1
  • 10
  • 19
3
votes
3 answers

nginx block curl and wget

I have an nginx web server hosting a rich-content site, and I found that some malicious bots are trying to crawl my content. I blocked any curl or wget coming to my site like this: if ($http_user_agent ~* (curl|wget)) { return 301…
Alaa Alomari
  • 638
  • 6
  • 19
  • 37
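For reference, a complete form of the snippet in the question might look like this (a sketch; returning 403 is more conventional than 301 for an outright block, and the check is trivially defeated by any client that spoofs its User-Agent):

```nginx
# Inside the server {} block: reject requests whose User-Agent
# mentions curl or wget, case-insensitively.
if ($http_user_agent ~* (curl|wget)) {
    return 403;
}
```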
3
votes
2 answers

Using curl with cookies

I hope someone can help me out on this one. I have been trying this for a while now, with all sorts of variations, and cannot seem to get it to work. I am trying to script a way to log in to https://login.three.ie/ to check my account balance from…
user1718443
  • 31
  • 1
  • 1
  • 2
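A typical pattern for this kind of scripted login is a cookie jar shared across requests. A sketch only: the form field names and the /login and /balance paths below are assumptions, not taken from login.three.ie; the real ones have to be read out of the site's login form.

```shell
# 1) Fetch the login page and store any session cookies it sets.
curl -s -c cookies.txt https://login.three.ie/ -o /dev/null

# 2) POST credentials, sending and updating the cookie jar.
#    'username'/'password' are hypothetical field names; check the
#    actual form with your browser's developer tools.
curl -s -b cookies.txt -c cookies.txt \
     -d 'username=me&password=secret' \
     https://login.three.ie/login -o /dev/null

# 3) Request the balance page with the authenticated session.
curl -s -b cookies.txt https://login.three.ie/balance
```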
3
votes
1 answer

Multiple POST requests using wget and the same base URL

wget has a nice option that lets you download multiple files from the same location (I mean the combination of --base and --input-file). The advantage of this is that, where possible, wget tries to reuse the open socket/connection. I was wondering if it's…
GiM
  • 131
  • 1
  • 1
  • 2
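The combination the question refers to looks like this (a sketch; example.com is a placeholder): relative paths in the input file are resolved against --base, and wget keeps the HTTP connection alive between requests where the server allows it.

```shell
# urls.txt holds relative paths, one per line.
printf 'page1.php\npage2.php\n' > urls.txt

# --base supplies the common URL prefix for every entry in the list;
# a --post-data option, if added, would be sent with each request.
wget --base=http://example.com/ --input-file=urls.txt
```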
3
votes
1 answer

How to make wget trust my self-signed certificate (without using --no-check-certificate)?

Ubuntu 12.04, OpenSSL 1.0.1, Wget 1.13.4. My setup: create our own CA (our_own_ca.crt); generate a certificate signed with the above CA (graphite.local.crt); concatenate that cert and the CA cert into a bundle file. Nginx…
quanta
  • 51,413
  • 19
  • 159
  • 217
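wget can be pointed at the CA directly, which avoids --no-check-certificate entirely. A sketch using the file names from the question:

```shell
# Per-invocation: trust the private CA for this download only.
wget --ca-certificate=our_own_ca.crt https://graphite.local/

# Or system-wide on Ubuntu: install the CA into the trust store so
# wget (and other OpenSSL-based clients) pick it up automatically.
sudo cp our_own_ca.crt /usr/local/share/ca-certificates/
sudo update-ca-certificates
```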
3
votes
1 answer

Cron Daemon Wget blocked

I've got nginx on Debian 7 without cPanel. I am setting up my crontab like this: */45 * * * * wget "http://example.com/cron-url.php" >/dev/null 2>&1 The above cron job is being blocked with 403 Forbidden: --2014-12-10 05:40:01-- …
Neel
  • 1,441
  • 7
  • 21
  • 35
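A 403 on a cron-driven wget is often a server-side rule matching wget's default User-Agent. One common workaround (a sketch; fixing the server rule is cleaner than spoofing around it) is to present a browser-like identity in the crontab entry:

```shell
# -q: quiet; -O /dev/null: discard the page body;
# --user-agent: send a browser-like string instead of "Wget/...".
*/45 * * * * wget -q -O /dev/null --user-agent="Mozilla/5.0" "http://example.com/cron-url.php"
```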
3
votes
3 answers

Not able to wget from an FTP server

I am trying to wget from an FTP server on a remote machine. The command is not getting past 'Logging in as anonymous'. This is what I am getting: wget ftp://hgdownload.cse.ucsc.edu/goldenPath/hg19/chromosomes/chr1.fa.gz --2013-09-29 22:07:53-- …
Ashwin
  • 133
  • 1
  • 3
3
votes
4 answers

Download a file from the shell when button click is required

I have a link to a file that I would like to download from the shell. Unfortunately, the URL redirects to a software license agreement page which requires clicking an accept button. This is fine when I'm in a standard browser but when I'm in the…
Brett
  • 165
  • 1
  • 10
3
votes
3 answers

Reboot if tomcat7 service is not responding

I am running a web app on a Tomcat server. There is a hard-to-detect problem in the server code that causes it to crash once or twice every day. I will dig in and correct it when I have time. But until that day, in a problematic case restarting…
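A wget-based watchdog along these lines is a common stopgap (a sketch; the probe URL and service name are assumptions). Run it from cron every few minutes; restarting the service is usually preferable to rebooting the whole machine:

```shell
#!/bin/sh
# Restart tomcat7 if the app stops answering HTTP.
# --tries=1 and --timeout bound how long the probe can hang.
if ! wget -q --tries=1 --timeout=10 -O /dev/null http://localhost:8080/; then
    service tomcat7 restart
fi
```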
3
votes
3 answers

Why can't I download the JDK from Oracle's web site directly, without AuthParam?

That is, downloading with the following command, why does it fail to fetch the file? wget http://download.oracle.com/otn-pub/java/jdk/6u35-b10/jdk-6u35-linux-i586.bin The following command works, but that AuthParam may not work after a while. Why?…
giantforest
  • 239
  • 1
  • 4
  • 15
3
votes
3 answers

Download Git Zipball in Unix

I'm trying to download a zipball of a git repo, e.g. wget https://github.com/zeromq/jzmq/zipball/master This works fine in a web browser, but on Unix the file gets a weird name... How do I do this?
DD.
  • 3,114
  • 11
  • 35
  • 50
3
votes
2 answers

Can someone explain to me what "wget -O - -q icanhazip.com" means?

I've used this command to find the IP of the server. Can someone explain what the command means? I want to learn. So far I know: "wget" is a free utility for non-interactive download of files from the Web…
user101699
  • 153
  • 1
  • 4
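For reference, the two flags break down as follows: -O - writes the downloaded document to standard output (the lone - names stdout instead of a file), and -q suppresses wget's progress and status messages. What remains is just the page body, which for icanhazip.com is the caller's public IP address. The long-option spelling of the same command:

```shell
# Identical to: wget -O - -q icanhazip.com
wget --output-document=- --quiet icanhazip.com
```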
3
votes
2 answers

Curl always returns same 404 page

No matter what URL I specify, curl always returns the same HTML 404 error page. If I use the --verbose option, it looks like curl always connects to the same IP address. $ curl --verbose http://www.edgeoftheweb.co.uk * About to connect() to…
Jon
  • 161
  • 2
  • 3
  • 11