Questions tagged [wget]

GNU Wget (or just Wget, formerly Geturl) is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get. It supports downloading via HTTP, HTTPS, and FTP protocols.

Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the boom of popularity of the Web, causing its wide use among Unix users and distribution with most major GNU/Linux based distributions. Written in portable C, Wget can be easily installed on any Unix-like system.

Source: Wikipedia

Man page

290 questions
0
votes
1 answer

Get a URL with variables via Linux Command Line

Sorry if this seems to be a very simple command, but I have been searching for hours without finding a solution and have exhausted my very limited knowledge of Linux. :-) I need to use some command to process the following URL from a CentOS Linux command…
Scott
  • 3
  • 1
  • 2
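A frequent gotcha with URLs that carry query-string variables is that an unquoted & sends the command to the background; a minimal sketch, with a placeholder URL and parameters:

    # Quote the whole URL so the shell does not interpret & and ?
    wget -q -O response.html 'http://example.com/process.php?user=scott&id=42'

    # Equivalent request with curl, body written to stdout
    curl -s 'http://example.com/process.php?user=scott&id=42'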
0
votes
1 answer

How to check an IP blacklist using a curl/wget request to mxtoolbox?

I would like to add a cron job to check whether the server IP is on a blacklist, something like curl 'http://mxtoolbox.com/SuperTool.aspx?action=blacklist%3a142.11.193.83' | grep 'you are on a blacklist.' wget…
Alex
  • 105
  • 1
  • 3
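A sketch of what such a cron job could look like, assuming the quoted phrase really appears in the returned page when the IP is listed (the alert address is a placeholder); since the mxtoolbox page may be rendered by JavaScript, a direct DNSBL lookup is shown as the sturdier alternative:

    #!/bin/sh
    # Fetch the lookup page quietly and alert if the phrase is found
    if wget -q -O- 'http://mxtoolbox.com/SuperTool.aspx?action=blacklist%3a142.11.193.83' \
        | grep -q 'you are on a blacklist'; then
        echo '142.11.193.83 is listed' | mail -s 'blacklist alert' admin@example.com
    fi

    # Direct DNSBL query: reverse the octets and prepend them to the zone;
    # any answer (typically 127.0.0.x) means the IP is listed
    dig +short 83.193.11.142.zen.spamhaus.org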
0
votes
2 answers

What could block the output of a command?

Sometimes I'll attempt a (2>&1) redirection and some or all of the resulting output appears to be silenced, e.g. wget -O- http://localhost/test.txt 2>&1. I would expect to see a merge of the contents of test.txt and the output of the transfer, but instead…
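wget writes the transfer log to stderr and, with -O-, the document body to stdout, so the two streams can be separated to see which one went missing; a small sketch with placeholder paths:

    # Capture body and transfer log separately
    wget -O- http://localhost/test.txt >body.txt 2>log.txt

    # Merge both into one file; the order matters, 2>&1 must follow >file
    wget -O- http://localhost/test.txt >combined.txt 2>&1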
0
votes
3 answers

Input wget exclusions from a file?

I'm crawling a forum and I keep stumbling across certain threads that have been going on for ten years. I can certainly exclude these using the wget option -X…
mcwizard
  • 103
  • 2
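-X/--exclude-directories expects a comma-separated list rather than a file, but the list can be built from a file in the shell; a hedged sketch, assuming exclude.txt holds one directory per line and a placeholder forum URL:

    # Join the lines of exclude.txt with commas and hand them to -X
    wget -r -X "$(paste -sd, exclude.txt)" http://forum.example.com/

Note that -X matches directories; for file-name patterns the analogous option is -R/--reject.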
0
votes
1 answer

Prevent webpages from being downloaded by download managers with NGINX

I want to create a site that offers themes. People will be able to see the themes and use a demo, but I want to prevent cheapskates from using wget or any other download manager to download the demo sites in an automated fashion without having to…
Stofke
  • 173
  • 1
  • 2
  • 11
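User-agent filtering only stops naive clients, since any tool can spoof a browser string, but it does cover stock wget; a minimal nginx sketch (the agent list is illustrative, not exhaustive), placed inside the relevant server or location block:

    # Return 403 to clients that identify as common download tools
    if ($http_user_agent ~* (wget|curl|httrack|webzip)) {
        return 403;
    }

Rate limiting with limit_req is the sturdier complement, since it also slows down clients that spoof a browser.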
0
votes
2 answers

What are the possible problems when wget returns code 500 but the same request works in normal browsers?

What should I be looking for when wget returns 500 but the same URL works fine in my web browser? I don't see any access_log entries that seem to be related to the error. DEBUG output created by Wget 1.14 on linux-gnu.
markus
  • 602
  • 2
  • 9
  • 18
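A 500 that only wget triggers often comes from server-side logic keyed on request headers; a hedged first step is to replay the request with browser-like headers and watch the full exchange (URL and agent string are placeholders):

    # --debug prints the headers sent and received;
    # -U overrides wget's default User-Agent
    wget --debug -U 'Mozilla/5.0 (X11; Linux x86_64)' \
         --header='Accept: text/html' -O- http://example.com/page

Comparing that output with curl -v against the same URL usually pinpoints the header the server objects to.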
0
votes
2 answers

Fast (non-blocking) way to transfer many files to another server

Possible Duplicate: What is the fastest and most reliable way of transferring a lot of files? I am currently attempting to transfer over 1 million files from one server to another. Using wget, it seems to be extremely slow, probably because it…
Nyxynyx
  • 1,459
  • 11
  • 39
  • 49
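wget pays per-file protocol overhead, which dominates with a million small files; a common alternative, with placeholder hosts and paths, is one tar stream over ssh, or rsync when resumability matters:

    # Single tar stream over ssh: one connection, no per-file handshake
    tar -C /srv/data -cf - . | ssh user@dest 'tar -C /srv/data -xf -'

    # rsync skips files already present and can resume after interruption
    rsync -a --partial /srv/data/ user@dest:/srv/data/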
0
votes
1 answer

wget does not mirror if an IP is provided instead of a domain name

I am trying to mirror a site with wget and came across some strange behavior. Say I am mirroring an internal site named www.example.com in the following way; all seems to be working fine: wget -mkE http://www.example.com. However, as I need to…
Tzury Bar Yochay
  • 727
  • 11
  • 24
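Mirroring by IP usually breaks name-based virtual hosting, since the server never sees the expected Host header; two hedged workarounds, with a placeholder address:

    # Option 1: pin the name to the IP locally and keep mirroring by name
    echo '10.0.0.5 www.example.com' | sudo tee -a /etc/hosts
    wget -mkE http://www.example.com

    # Option 2: request the IP but present the expected virtual host
    wget -mkE --header='Host: www.example.com' http://10.0.0.5/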
0
votes
2 answers

wget works for all sites on the web but not the one hosted on that server

I currently have 2 Ubuntu 12.04 servers which are load balanced. If I log in to either one of them from the shell and type: wget stackoverflow.com, the page is fetched into index.html. However, assuming the site hosted on those servers is called…
Thierry Lam
  • 6,261
  • 10
  • 27
  • 24
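When a server can fetch every site except its own, the usual suspect is a DNS answer pointing at a public IP that is unreachable from inside (hairpin NAT); a hedged diagnostic, with a placeholder domain:

    # Compare what this host resolves with what DNS publishes
    getent hosts www.example.com    # resolution path wget uses
    dig +short www.example.com      # answer straight from DNS

    # If the public IP is unreachable from inside, map the name locally
    echo '127.0.0.1 www.example.com' | sudo tee -a /etc/hosts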
0
votes
2 answers

curl and wget return different response codes

I am trying to wget a link; this wget works fine on my local machine but it doesn't on the server. I tried to check the response header and I got the following on my local laptop: curl -I…
Alaa Alomari
  • 638
  • 6
  • 19
  • 37
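curl and wget send different default headers, User-Agent above all, so servers may legitimately answer them differently; a hedged way to capture exactly what each one sends and receives (placeholder URL):

    # curl: -v shows both request and response headers, -I sends HEAD
    curl -v -I http://example.com/link

    # wget: -S/--server-response prints response headers,
    # --spider probes without saving a body
    wget -S --spider http://example.com/link

Aligning the user agents with curl -A and wget -U isolates whether that header is the variable.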
0
votes
2 answers

Use number of downloaded file in wget as variable in bash script

Can I get the number of downloaded files with wget -r and use that as a variable? I want to write a script that runs multiple wget commands (using -q so I can control the output) and then, at the end, add up the number of downloaded files and echo…
E Steven
  • 3
  • 2
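wget does not expose a file counter to the shell, but its log can be counted; a hedged sketch with placeholder URLs. Note that the -q from the question would suppress even the log, so -nv (one summary line per retrieved file) is used instead:

    #!/bin/bash
    total=0
    for url in http://example.com/a/ http://example.com/b/; do
        wget -r -nv -o wget.log -P downloads "$url"
        # each retrieved file produces one 'URL ... -> "path"' log line
        n=$(grep -c ' -> ' wget.log)
        total=$((total + n))
    done
    echo "downloaded $total files"

Counting the files on disk with find downloads -type f | wc -l is the version-proof fallback, since the log format can shift between wget releases.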
0
votes
1 answer

How to download files that are dynamically served

I've run into this a few times when a site serves a variable file I try to download: wget http://trac-hacks.org/changeset/latest/tracajaxcommentsplugin?old_path=/&filename=tracajaxcommentsplugin&format=zip Expected…
Moak
  • 734
  • 3
  • 10
  • 31
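When the real file name arrives in a Content-Disposition response header, wget has to be told to honor it; a sketch using the URL from the question (the flag is marked experimental in older wget releases):

    # --content-disposition adopts the server-suggested file name;
    # single quotes keep ? and & away from the shell
    wget --content-disposition \
        'http://trac-hacks.org/changeset/latest/tracajaxcommentsplugin?old_path=/&filename=tracajaxcommentsplugin&format=zip'

Failing that, -O plugin.zip simply pins the output name by hand.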
0
votes
2 answers

Host resolution extremely slow using wget on local dev machine, fast using browser

On my local Ubuntu 10.04 dev machine, if I do a wget for a web address such as "wget http://www.google.com", it gets stuck on "Resolving www.google.com" for up to 30 seconds before the response is received. If I type www.google.com into a browser,…
Michael B
  • 341
  • 1
  • 2
  • 9
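A 30-second stall before "Resolving" completes is classic resolver-timeout behavior, often a stalled IPv6 or mDNS lookup in the NSS chain rather than DNS itself (browsers hide this behind their own caches); a hedged way to narrow it down:

    # Time the resolver path wget actually uses
    time getent hosts www.google.com

    # Time plain DNS for comparison
    time dig +short www.google.com

    # If AAAA lookups are the stall, forcing IPv4 sidesteps them
    wget -4 http://www.google.com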
0
votes
1 answer

Wget not working

wget is not working properly on my Debian server. If I wget http://www.google.com, it says it cannot resolve the host; when I ping google.com, I receive a successful pong, and when I wget http://www.debian.org it works. I cannot wget anything else…
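When only some names fail to resolve, comparing the system resolver with the configured DNS server usually shows which layer is lying; a hedged diagnostic:

    # Does the system resolver (the path wget uses) know the name?
    getent hosts www.google.com

    # Ask the first configured DNS server directly
    dig www.google.com @"$(awk '/^nameserver/ {print $2; exit}' /etc/resolv.conf)"

    # Stale overrides here break some names while leaving others fine
    cat /etc/hosts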
0
votes
1 answer

Am I doing Ubuntu 10.04 software installation correctly (packages and non-packages)?

Background: I first installed everything using a terminal and "sudo su", i.e. as root. Then I installed Jenkins, and then the jenkins user could not run any Android SDK tools in the /root/androidsdk/tools folder due to "Permission denied". I want to…
user27465
  • 259
  • 2
  • 7
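The "Permission denied" follows from /root being unreadable to other users; a hedged fix, using the paths from the question and assuming a jenkins user and group exist, is to relocate the SDK somewhere shared:

    # Move the SDK out of /root and hand it to the jenkins user
    sudo mv /root/androidsdk /opt/androidsdk
    sudo chown -R jenkins:jenkins /opt/androidsdk

    # Point builds at the new location (commonly via ANDROID_HOME)
    export ANDROID_HOME=/opt/androidsdk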