Questions tagged [wget]

GNU Wget (or just Wget, formerly Geturl) is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get. It supports downloading via HTTP, HTTPS, and FTP protocols.

Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the boom in the Web's popularity, which led to its wide use among Unix users and its inclusion in most major GNU/Linux distributions. Written in portable C, Wget can be easily installed on any Unix-like system.
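As a rough illustration of the recursive-download and link-conversion features, mirroring a small site for offline viewing might look like this (the URL and depth are placeholders):

    # Mirror two levels of example.com, pull in images/CSS needed by the pages,
    # and rewrite links so the local copy can be browsed offline.
    wget --recursive --level=2 --page-requisites --convert-links \
         --no-parent https://example.com/docs/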

Source: Wikipedia

Man page

290 questions
1
vote
3 answers

Cannot download some files from a given server

I'm working on Fedora Core release 6, and whenever I try to download a given file from an FTP site, the connection is refused: $ curl --ftp-pasv "ftp://ftp.ensembl.org/pub/current_mysql/vega_mart_56/CHECKSUMS.gz" curl: (7) couldn't connect…
Pierre
  • 429
  • 1
  • 5
  • 14
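For reference, wget defaults to passive FTP, so the equivalent fetch with wget is simply the URL; this is only a sketch and does not by itself explain why the connection is refused:

    # wget uses passive FTP by default (--no-passive-ftp would force active mode).
    wget "ftp://ftp.ensembl.org/pub/current_mysql/vega_mart_56/CHECKSUMS.gz"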
1
vote
1 answer

wget giving Segmentation fault

I recently did apt-get update followed by apt-get upgrade on my Debian VPS server. Suddenly (at least) wget stopped working with the error Segmentation fault. It's not even trying to do anything. So after some googling I installed gdb and tried to…
MaRmAR
  • 11
  • 1
  • 2
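To get a usable backtrace out of a segfaulting wget, a common non-interactive gdb invocation looks like this (the URL is a placeholder; debug symbols for wget and libc help a lot):

    # Run wget under gdb and dump a backtrace as soon as it crashes.
    gdb -batch -ex run -ex bt --args wget http://example.com/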
1
vote
0 answers

Server socket sitting in FIN_WAIT_1 whilst network doesn't see its traffic

We've been trying to get a grip on a really weird problem where we can wget a page from Apache 2.2.19 on Solaris 10 and some permutations of requests reliably take various fixed lengths of time to respond. It looks to be based around the closing of…
Chris Phillips
  • 254
  • 4
  • 15
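A quick way to watch the affected sockets from the server side is to count connections stuck in that state (netstat output format differs slightly between Solaris and Linux, so treat this as a sketch):

    # Count TCP connections currently sitting in FIN_WAIT_1.
    netstat -an | grep -c FIN_WAIT_1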
1
vote
0 answers

Why is a wget to http://graph.facebook.com resolving to my server's IP?

This makes absolutely no sense and I spent 3 hours trying to figure this out. Suddenly all my scripts connecting to Facebook stopped working, so I tried to do this: root@s01 [~]# wget http://graph.facebook.com --2015-11-18 15:27:47-- …
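When a public hostname suddenly resolves to the local machine, the resolver configuration and /etc/hosts are the usual suspects; a hedged sketch of the first checks:

    # Compare the system resolver's answer with an external DNS server,
    # and look for a stray override in /etc/hosts.
    getent hosts graph.facebook.com
    dig +short graph.facebook.com @8.8.8.8
    grep -i facebook /etc/hosts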
1
vote
1 answer

Which Ubuntu package has the CA for debian.neo4j.org (godaddy)?

The neo4j site shows that you should get their key using wget -q -O - http://debian.neo4j.org/neotechnology.gpg.key This, of course, leaves the key open to tampering in transit. So, I really should use…
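A hedged sketch of the variant the question is reaching for, fetching the key over TLS instead of plain HTTP; whether debian.neo4j.org actually serves HTTPS, and which CA signed its certificate, is exactly what the question is trying to establish:

    # The HTTPS URL is an assumption, not something confirmed by the question.
    wget -q -O - https://debian.neo4j.org/neotechnology.gpg.key | sudo apt-key add -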
1
vote
1 answer

wget and pipe to tar in the background

I am trying to download a large .tar.gz file (> 2 GB in size). The bash script also does lots of other things, so I want to kick off the download and then continue processing other commands in the bash script. I am using wget and piping…
Justin
  • 5,328
  • 19
  • 64
  • 84
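A minimal sketch of the pattern being described: start the download-and-extract pipeline in the background, keep working, and wait for it before touching the extracted files (URL and paths are placeholders):

    # Download and unpack in the background; $! captures the pipeline's last PID.
    wget -qO- "https://example.com/big-archive.tar.gz" | tar -xz &
    pipeline_pid=$!

    # ... other commands run here while the download continues ...

    # Block until the background pipeline has finished.
    wait "$pipeline_pid"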
1
vote
2 answers

wget shows 404 not found while the page is live

This looks simple but I don't know how to fix it. Open the following URL in your browser: https://ecomm.sella.it/gestpay/gestpayws/WSCryptDecrypt.asmx?WSDL This seems to work and will download an XML WSDL in your browser. But on our Linux server…
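Two common causes of a 404 from wget when the browser works are the unquoted ? being mangled by the shell and servers that vary their response on the User-Agent; a hedged sketch of ruling both out:

    # Quote the URL so the shell leaves ?WSDL alone, and send a browser-like
    # User-Agent in case the server filters on it.
    wget --user-agent="Mozilla/5.0" \
         "https://ecomm.sella.it/gestpay/gestpayws/WSCryptDecrypt.asmx?WSDL"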
1
vote
2 answers

Why does downloading a large file with wget seem to consume nearly all available RAM?

I am trying to download a file of 580 MB with the following simple wget command: wget http://example.com/file.ext The server has 16 GB of installed RAM, and during the entire download the memory usage increases from 10% up to 99%. How is it…
markjfekjfe
  • 29
  • 1
  • 2
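The memory in question is usually the kernel's page cache rather than wget's own allocation; a quick check is to look at how much of the "used" figure is cache that can be reclaimed at any time:

    # "buff/cache" (or "-/+ buffers/cache" on older versions of free)
    # is reclaimable; only the remainder is really held by processes.
    free -m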
1
vote
3 answers

Downloading php files from python simple http server

I started python -m SimpleHTTPServer on one computer on the LAN and used wget to download php files from it to another. As far as I can see, they seem to be downloaded correctly - I got PHP sources instead of rendered HTML. Why? Is this because this server…
Phil
  • 1,969
  • 6
  • 29
  • 33
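SimpleHTTPServer is a plain static file server with no PHP interpreter, so it returns the .php source bytes verbatim; contrast it with PHP's built-in development server, which actually executes the scripts (the port is a placeholder):

    # Static server: clients receive the raw .php source.
    python -m SimpleHTTPServer 8000

    # PHP's built-in server (PHP >= 5.4) runs the scripts instead.
    php -S 0.0.0.0:8000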
1
vote
1 answer

Ghost running on docker container does not respond to http requests from other container

I am setting up a ghost blog instance inside a docker container. I bound the ghost server to host 0.0.0.0 and exposed the port on which it's listening (port 2368). If I run wget to the ghost container's IP and ghost's port from the outside, the…
alejo.90
  • 11
  • 2
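A hedged sketch of the container-to-container case being described, assuming a user-defined bridge network; the network and container names are placeholders, and 2368 is Ghost's default port:

    # Containers on the same user-defined network can reach each other by name.
    docker network create blognet
    docker run -d --name ghost-blog --network blognet -p 2368:2368 ghost
    docker run --rm --network blognet busybox wget -qO- http://ghost-blog:2368/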
1
vote
1 answer

Use tmux for managing multiple downloaders as a supervisorctl service?

I have many data servers I need to download data from via http as soon as it is available. For each server I start a bash "while true" loop, and within it a wget to poll the server for new data. To start all the bash loops I created a tmux config…
AME
  • 135
  • 6
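A hedged sketch of one poller of the kind described; supervisord can manage a script like this directly as a program, one instance per data server, without tmux in between (the URL and interval are placeholders):

    #!/bin/bash
    # Poll one data server; -N only re-downloads files that are newer upstream.
    while true; do
        wget -q -N "http://data-server.example/export/"
        sleep 60
    done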
1
vote
2 answers

Allow connection from localhost to the HTTPS/Port 443 Protocol

We've recently set up a new development server and have been experiencing problems when trying to connect to an API via an instance of SoapClient. I think I've narrowed the issue down to the fact that the new server can't establish a connection to…
Ryan
  • 111
  • 1
  • 3
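Two quick checks that separate "this host cannot reach port 443 at all" from a SoapClient-specific problem (the API host is a placeholder):

    # Fetch the endpoint over TLS with wget ...
    wget -O /dev/null https://api.example.com/

    # ... and inspect the TLS handshake directly.
    openssl s_client -connect api.example.com:443 < /dev/null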
1
vote
1 answer

On Debian server, downloads stall out after one or two seconds using apt-get

I have a Debian server that is used primarily as a very low-traffic web server. Recently it began having trouble with its network connection. I am able to use SFTP to upload a file to this server from an outside connection and the speed is…
norova
  • 21
  • 1
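A common first step is to reproduce the stall outside apt, for example with a plain wget against a Debian mirror, to see whether it is apt-specific or affects every HTTP download from the host (the mirror URL is only an example):

    # Pull one file from a mirror and watch whether the transfer stalls.
    wget -O /dev/null http://deb.debian.org/debian/dists/stable/Release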
1
vote
1 answer

Command line tools to monitor POST/GET requests of a loaded website

When I load a website with Firefox and open up the Network Profiler, I can see all the POST and GET requests the site is making. I would like this functionality from the command line. Does anyone know of a command line tool which I could use to…
jaynp
  • 235
  • 3
  • 9
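One command-line option is a packet capture filtered to HTTP; mitmproxy or ngrep are alternatives. A hedged sketch (the interface name is an assumption, and this only sees plain-HTTP traffic, not HTTPS):

    # Print HTTP request lines (GET/POST ...) passing over eth0.
    sudo tcpdump -i eth0 -A -s 0 'tcp port 80'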
1
vote
1 answer

Why does wget from FTP sites sometimes create a few empty files when it downloads the rest of the files without problems?

I have downloaded 80,000+ image files stored on an FTP site using wget(1), and most of the files were downloaded without problems, but when I did a small check (using file(1)) I realized that a few of the files were empty!?! I just used this…
Andrew Rump
  • 63
  • 11
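A hedged sketch of the usual cleanup: locate the zero-byte files and fetch just those again (the local directory, remote path, and layout are placeholders, so this is only an outline of the idea):

    # List zero-byte downloads ...
    find ./images -type f -size 0

    # ... and re-fetch each one individually.
    find ./images -type f -size 0 -printf '%P\n' | while read -r f; do
        wget -O "images/$f" "ftp://ftp.example.org/images/$f"
    done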