Questions tagged [wget]

GNU Wget (or just Wget, formerly Geturl) is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get. It supports downloading via HTTP, HTTPS, and FTP protocols.

Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the boom of popularity of the Web, causing its wide use among Unix users and distribution with most major GNU/Linux based distributions. Written in portable C, Wget can be easily installed on any Unix-like system.

Source: Wikipedia

Man page

290 questions
3
votes
2 answers

Why is this bash command not echoing into a variable and what can I do to improve?

I've got a bash script I'm working to improve, and have put in a great fix thanks to Dennis Williamson. Unfortunately, one of the lines no longer echoes into a variable I can manipulate, but rather dumps the output directly. I'll be good to go if I fix…
editor
  • 383
  • 2
  • 5
  • 21
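A common cause of this symptom is that `$(...)` captures stdout only, while wget writes its progress output to stderr. A minimal sketch of the pattern, using a local command as a stand-in for wget (the real invocation would be `wget -qO- URL`, which is an assumption about the asker's setup):

```shell
# $(...) captures stdout only; anything written to stderr still reaches the
# terminal. wget prints its progress meter to stderr, so use `wget -qO- URL`
# to put the document body on stdout. A local stand-in command is used here.
body=$(sh -c 'echo "page content"; echo "progress noise" >&2' 2>/dev/null)
echo "captured: $body"
```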
3
votes
2 answers

pgrep wget: how do I get the details of the process ID?

When typing pgrep wget, it shows process ID 10144, but how can I find out the details of this process ID?
kopeklan
  • 169
  • 2
  • 2
  • 6
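The usual follow-up is to feed the PID from pgrep into ps. A small sketch, using a background `sleep` as a stand-in for the running wget process (with a real wget you would write `ps -fp "$(pgrep wget)"`):

```shell
# Show the PID and full command line for a given process ID.
# A background sleep stands in for wget here so the sketch is self-contained.
sleep 30 &
pid=$!
details=$(ps -o pid=,args= -p "$pid")
echo "$details"
kill "$pid"
```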
2
votes
2 answers

`wget`-ting a website for "local" browsing on a different domain

I need to mirror a website and deploy the copy under a different domain name. The mirroring procedure should be fully automatic, so that I can update the copy on a regular basis with cron. The mirror MUST NOT be a real mirror, but it MUST be static…
Lucio Crusca
  • 420
  • 3
  • 12
  • 33
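Jobs like this usually build on wget's standard mirroring flags. The sketch below only assembles the command line rather than running it (the URL is a placeholder, and whether these flags alone suffice for the asker's site is an assumption):

```shell
# --mirror enables recursion and timestamping; --convert-links rewrites links
# so the static copy browses locally (which is what makes it usable under a
# different domain); --adjust-extension adds .html where needed;
# --page-requisites fetches CSS/images; --no-parent stays inside the site.
mirror_cmd='wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com/'
echo "$mirror_cmd"
```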
2
votes
1 answer

Automated export of phpMyAdmin SQL download with wget

I'm using the following script to attempt to download an SQL export from a phpMyAdmin installation (adapted from this Stack Overflow question): read -p "Username: " USERNAME read -p "Password: "…
Jaap Joris Vens
  • 601
  • 3
  • 8
  • 20
2
votes
1 answer

wget hangs/freezes when downloading file into NFS

I'm running some experiments with Amazon EFS (General Purpose) and EC2, and I have the issue that EFS seems to be unstable. Commands that involve the mounted file system hang or freeze. For example, wget (with an 8 GB file) downloads a file at…
mitchkman
  • 159
  • 1
  • 5
2
votes
1 answer

How to set index.html to redirect a wget query to another page/file?

If my server is at example.com, running $ wget example.com will only download the file index.html. How can I make wget either download another file instead of index.html, or download this file along with index.html? I have tried all the redirection I could find…
user123456
  • 563
  • 1
  • 7
  • 20
2
votes
1 answer

How can I download web pages from the command line in Windows?

Well, I'm trying to do some automation using a batch file to speed up my daily routine at work, and I need to solve these little questions: I want to install the wget command using only the pure Windows Command Prompt, and I want to download some things…
DaMonki
  • 121
  • 6
2
votes
1 answer

Download an SSL certificate from a remote website through a proxy

I want a script that checks my server's certificate. There are very good answers around here on how to do this with openssl s_client or gnutls-cli, which work fine... but NOT if you are behind a proxy! Doh! I could not find a way to tell these…
avh
  • 41
  • 1
  • 3
2
votes
1 answer

APT-GET behind a proxy with Digest Authentication

I'd like to use apt-get (and other Ubuntu tools) to download software and keep it updated. Unfortunately, my company has set up a Squid proxy that accepts digest authentication only. I've seen somewhere (can't find the link again) that apt-get uses wget,…
Victor Ribeiro
  • 161
  • 1
  • 6
2
votes
0 answers

Recursively get files over FTP with explicit SSL

Is there a way to recursively get a whole directory from another server using FTP with explicit SSL on Linux? I have tried a number of wget combinations, including with and without the passive flag, different SSLv/TLS flags, etc., and none seem to work…
James
  • 183
  • 1
  • 2
  • 13
2
votes
2 answers

nginx can't execute many requests

I have a question about nginx tuning. I have an application which I want to execute 200 times every second. I created a bash file and used wget with the -bqO switches to execute it. But there is a problem: when the number of requests is greater than 100, nginx does not…
2
votes
1 answer

ShellShock test shows wget and curl access

I've fixed the Shellshock bug on my Debian 6 server, and while testing on http://shellshock.brandonpotter.com/ I get "No Vulnerabilities Found", which is OK, but they also check other things, and in the test log I get: URL mydomain.net (Root URL) (Header…
Odin
  • 23
  • 3
2
votes
1 answer

Under what scenarios would Apache allow a mismatched HTTP_HOST to pass through?

Someone has managed to pass an undefined HTTP_HOST server variable to my application script, triggering a series of errors. I am quite perturbed but am unable to replicate this behaviour. My httpd server uses name-based virtual hosting with the…
Question Overflow
  • 2,103
  • 7
  • 30
  • 45
2
votes
2 answers

Cron getting a timeout when attempting to access a URL (wget)

I'm sure this is simple and I've been digging, but no answer is quite as specific as I need it to be. The goal is easy: have cron hit a URL on my server every 5 minutes. All of that is set up and functions fine; the issue is that it times out when…
user2543853
  • 19
  • 1
  • 2
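Timeouts in cron-driven fetches are often handled with wget's own timeout and retry flags. The sketch below only assembles a candidate crontab command (the URL is a placeholder, and whether these flags address the asker's underlying delay is an assumption):

```shell
# -q silences output so cron does not mail on every run, -T 30 caps each
# DNS/connect/read phase at 30 seconds, -t 2 limits retries, and
# -O /dev/null discards the response body. The URL is a placeholder.
cron_cmd='wget -q -T 30 -t 2 -O /dev/null http://example.com/cron-endpoint'
echo "$cron_cmd"
```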
2
votes
1 answer

Wget and recursive downloads

A folder with directory listings enabled contains two zip files. I am trying to download only one of them (the latest). The problem is that the zip file's name is a random string. The command I have so far is wget -r -nH --cut-dirs=3 --no-parent --accept=zip…
user3227965
  • 98
  • 1
  • 7
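Once the zips have been mirrored locally, picking the newest one can be done by modification time. A sketch with stand-in files (the preceding wget step, e.g. `wget -r -np -nH -A zip URL`, and the directory layout are assumptions, and this relies on the server reporting usable timestamps):

```shell
# Create two stand-in zips with different timestamps, then pick the newest.
# In the real case the files would come from a recursive wget download.
tmpdir=$(mktemp -d)
touch -t 202001010000 "$tmpdir/old.zip"   # backdated file
touch "$tmpdir/new.zip"                   # current timestamp
latest=$(ls -t "$tmpdir"/*.zip | head -n 1)
echo "latest: $latest"
rm -rf "$tmpdir"
```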