Questions tagged [wget]

GNU Wget (or just Wget, formerly Geturl) is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get. It supports downloading via HTTP, HTTPS, and FTP protocols.

Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the boom of popularity of the Web, causing its wide use among Unix users and distribution with most major GNU/Linux based distributions. Written in portable C, Wget can be easily installed on any Unix-like system.

Source: Wikipedia

Man page
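
A typical invocation exercising the features mentioned above (recursive download, page requisites, link conversion for offline viewing) might look like this sketch; example.com is a placeholder:

    # Mirror a site for offline browsing
    #   -r  / --recursive        follow links
    #   -p  / --page-requisites  also fetch the CSS, images, and scripts needed to render pages
    #   -k  / --convert-links    rewrite links so the local copy works offline
    #   -np / --no-parent        never ascend above the starting directory
    wget -r -p -k -np https://example.com/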

290 questions
0
votes
1 answer

wget failed: Connection timed out

I have the following command to copy a website; when it tried to hit sun.com it got a connection timeout. I would like wget to exclude sun.com so that it proceeds to the next item. Existing issue: $ wget --recursive --page-requisites…
bal
  • 1
  • 1
  • 2
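
For the question above, wget's --exclude-domains option keeps a recursive crawl away from hosts such as sun.com, and --timeout/--tries bound how long a dead host can stall it. A minimal sketch; the start URL is a placeholder:

    # Recursive mirror that skips sun.com and gives up quickly on unresponsive hosts
    wget --recursive --page-requisites \
         --exclude-domains sun.com \
         --timeout=15 --tries=2 \
         https://example.com/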
0
votes
1 answer

Troubleshooting nginx requests not reaching server

When I browse from other servers using wget -U "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)" --spider http://server_ip/page --no-hsts, this is the response: Spider mode enabled. Check if remote file exists. --2021-07-25 09:03:09-- …
Ajay Singh
  • 297
  • 1
  • 3
  • 13
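
For the troubleshooting question above, having wget print the full server response with -S while tailing the logs on the server usually shows whether the request reaches nginx at all; the log paths below assume the default Debian/Ubuntu layout:

    # Client side: spider the page and dump the response headers
    wget -S --spider --no-hsts \
         -U "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)" \
         http://server_ip/page

    # Server side: watch whether the request shows up at all
    tail -f /var/log/nginx/access.log /var/log/nginx/error.log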
0
votes
1 answer

Download with Wget only if new version

Good morning, I have custom software that updates with a custom script.sh. Part of the file goes something like this: if [[ $software == A ]] then echo "downloading package..." rm -rf test.zip >> SoftwareUpdate.log wget --user admin…
Perovic
  • 21
  • 2
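
For the update script above, wget's -N/--timestamping downloads only when the remote file is newer than the local copy (it relies on the server sending a Last-Modified header). A hedged sketch; the URL, credentials, and log handling are placeholders:

    # Fetch the package only if the server has a newer version
    if [[ $software == A ]]; then
        echo "checking for newer package..." >> SoftwareUpdate.log
        wget -N --user admin --password 'secret' \
             -a SoftwareUpdate.log \
             https://updates.example.com/test.zip
    fi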
0
votes
1 answer

wget -m: delete local files that no longer exist on the server

I use wget -m to mirror my webspace to my NAS. So far it's working great, but I noticed that when a file gets deleted on the server, it is still there locally. Is there any way to delete the local files that no longer exist on the…
jona
  • 113
  • 5
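
Regarding the question above: wget has no equivalent of rsync's --delete, so files removed on the server never disappear locally. One hedged workaround, if there is no shell access to the web space, is to mirror into a fresh directory and sync that over the NAS copy; the paths are placeholders:

    # 1. Mirror into a temporary directory
    wget -m -P /tmp/mirror-new https://example.com/

    # 2. Sync the fresh mirror over the local copy, deleting stale files
    rsync -a --delete /tmp/mirror-new/ /volume1/backup/website/

    # 3. Clean up
    rm -rf /tmp/mirror-new

If SSH access to the server is available, rsync -a --delete straight from the server is simpler and avoids re-downloading unchanged files.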
0
votes
1 answer

Can I make curl (or wget) ignore a specific TLS error? (bypassing expired certificate without disabling cert validation)

I'm working on a script which depends on a remote API endpoint which I do not control. Today my script stopped working, because the endpoint's SSL certificate expired and they haven't yet fixed it. Running curl -v, I get the following…
Wug
  • 151
  • 1
  • 5
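
For the expired-certificate question above, curl has no switch that skips only the expiry check. A hedged middle ground is to pin the server's public key with --pinnedpubkey and pass -k: normal chain validation (including the expiry date) is skipped, but the connection still fails if the key does not match the pinned one. The host is a placeholder:

    # Grab the server's current public-key hash once, over a connection you trust
    openssl s_client -connect api.example.com:443 </dev/null 2>/dev/null \
      | openssl x509 -pubkey -noout \
      | openssl pkey -pubin -outform DER \
      | openssl dgst -sha256 -binary | base64

    # Use the hash: -k disables normal validation, the pin still checks identity
    curl -k --pinnedpubkey 'sha256//PASTE_HASH_HERE=' https://api.example.com/endpoint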
0
votes
0 answers

Wget - Skip hosts that are down

I'm using wget to bulk-download a website, and it grabs files from other servers; however, some hosts are down. All wget does is hang, so I'd like it to skip these websites. Is there a flag that I can use to skip downed hosts?
Honest
  • 1
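
For the question above, wget has no single "skip dead hosts" switch, but tight timeouts and a low retry count make it move on quickly instead of hanging; the values below are arbitrary examples:

    # Give each host 10 seconds to answer, retry once, then move on
    wget -r -p --connect-timeout=10 --tries=2 --waitretry=1 https://example.com/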
0
votes
1 answer

wget website recursively from localhost without using bandwidth

I'm looking to recursively download my WordPress website into a static copy using wget. The problem is that whenever I do that, it uses too much bandwidth (3.5 GB) even though I end up downloading only 20 MB, which is weird, so I'm looking to download using…
logax
  • 129
  • 3
  • 14
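
For the question above, one hedged approach is to make the site's hostname resolve to 127.0.0.1 before crawling, so every request stays on the local machine instead of going out over the public interface; the domain is a placeholder and the /etc/hosts entry should be removed afterwards:

    # Point the public domain at the local web server (for this machine only)
    echo "127.0.0.1 example.com www.example.com" | sudo tee -a /etc/hosts

    # Crawl the WordPress site into a static, offline-browsable copy
    wget -r -p -k -np https://example.com/

    # Remember to remove the temporary /etc/hosts line when finished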
0
votes
0 answers

Read error (Connection reset by peer) in headers on Vagrant

Trying to install aaPanel into a Vagrant machine. The Vagrant box used is Scotch Box 3.5, which comes preloaded with PHP 7.0. I've updated PHP to 7.4 using Sury's PPA and it's working fine. I've followed the instructions given here, and used the Ubuntu-specific commands as…
Vishwa
  • 101
  • 1
  • 6
0
votes
0 answers

How to download all files from an FTP directory?

I am not experienced with Unix or with downloading files over FTP, HTML, etc., so I haven't been sure whether I can safely use some of the other examples/questions given here. I am looking to download files from an FTP directory onto my server, getting all the files…
DN1
  • 101
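
For the FTP question above, a recursive wget against an ftp:// URL pulls everything under a directory, and --ftp-user/--ftp-password keep the credentials out of the URL. All names below are placeholders:

    # Download every file under /pub/data/ into the current directory tree
    wget -r -np -nH --cut-dirs=1 \
         --ftp-user=myuser --ftp-password='mypassword' \
         ftp://ftp.example.com/pub/data/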
0
votes
0 answers

curl + wget cannot make https connections

My Ubuntu 16.04 system has the following problem: wget and curl hang indefinitely when connecting to a server over HTTPS. Both programs work fine with HTTP. Example for curl: curl -vv https://google.com * Rebuilt URL to: https://google.com/ * …
oarfish
  • 51
  • 4
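
For the hang-on-HTTPS question above, a few hedged checks usually narrow it down: rule out a stray proxy variable, bound the connect time so the commands fail instead of hanging, and test the TLS handshake directly:

    # Any proxy environment variables curl/wget might be honouring?
    env | grep -i proxy

    # Fail fast instead of hanging forever
    curl -v --connect-timeout 10 https://google.com/
    wget --timeout=10 --tries=1 -O /dev/null https://google.com/

    # Does a raw TLS handshake to port 443 complete at all?
    echo | openssl s_client -connect google.com:443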
0
votes
0 answers

scripting download of teradata files

I'm trying to automate the build of a server and am having trouble scripting the download of the Teradata Tools and Utilities for Linux from this site…
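
The question above is cut off, but downloads that sit behind a vendor login can often be scripted by exporting the browser's session cookies and replaying them with wget's --load-cookies. This is only a sketch under that assumption; the file names and URL are placeholders:

    # cookies.txt exported from a logged-in browser session (Netscape cookie format)
    wget --load-cookies cookies.txt --content-disposition \
         "https://downloads.example.com/teradata-tools-and-utilities.tar.gz"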
0
votes
1 answer

wget specific directory without downloading the whole path

I'm trying to download data using this command: wget -r --no-parent http://myserver/username/data/ This leaves me with three nested folders on my server, namely myserver/username/data/. How do I download only data without the whole path of folders?
user2300940
  • 121
  • 1
  • 4
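
For the question above, -nH (--no-host-directories) drops the myserver/ directory and --cut-dirs strips the leading path components, so only the contents of data/ land locally:

    # -nH          : no myserver/ directory
    # --cut-dirs=2 : strip username/ and data/ from the saved paths
    wget -r -np -nH --cut-dirs=2 http://myserver/username/data/

Using --cut-dirs=1 instead would keep a local data/ folder.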
0
votes
2 answers

Use wget on a cluster with ssh-tunnel

Normally I can securely copy files from one machine to another using > scp -oProxyJump=user@login.node.org user@main.node.org:/home/user/my_files/* . which is very slow for large data sets. I was told that the machines I am using have a very fast…
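
For the question above, wget has no ProxyJump notion of its own, but an SSH local port forward through the login node gives it a fast local endpoint to pull from. This assumes something on the main node actually serves the files over HTTP; hostnames and ports are placeholders:

    # Forward local port 8080 through the login node to a web server on the main node
    ssh -N -L 8080:main.node.org:80 user@login.node.org &

    # Pull the files through the tunnel
    wget -r -np http://localhost:8080/my_files/

    # Close the tunnel when finished
    kill %1

If nothing serves the files over HTTP, rsync over the same jump host (rsync -e "ssh -J user@login.node.org" ...) is the usual alternative.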
-1
votes
1 answer

stdout and stderr from bash execution to another machine?

Is it possible to send stderr or stdout from a Linux machine to an Ubuntu Linux machine, ideally using wget or another common package (hopefully without ssh access)? So far I can't find a way to "stream" stdout and stderr to the other machine.
eugeneK
  • 410
  • 2
  • 8
  • 18
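
For the question above, neither wget nor curl can act as a receiver, but a plain TCP listener on the Ubuntu box with the sender piping into netcat is a common no-ssh approach; the hostname, port, and exact nc flags depend on which netcat variant is installed:

    # On the receiving Ubuntu machine: listen on TCP 9999 and append to a file
    nc -l -p 9999 >> remote-output.log    # traditional netcat; OpenBSD nc uses: nc -l 9999

    # On the sending machine: run the job and stream stdout+stderr across
    ./some_job.sh 2>&1 | nc receiver.example.com 9999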
-1
votes
1 answer

Stop wget clobbering local file if server unavailable

I have a cron job on a server which once a day uses wget to download "Earth Orientation Parameters" and leapsecond data from a NASA server. Specifically: wget https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp -O .eops wget…
Chris
  • 105
  • 5
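
For the cron job above, one hedged pattern is to let wget write to a temporary file and replace the real one only when the download succeeded, so a server outage never clobbers the existing data:

    # Overwrite .eops only if wget exited successfully
    wget -q https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp \
         -O .eops.tmp && mv .eops.tmp .eops || rm -f .eops.tmp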