Questions tagged [wget]

GNU Wget (or just Wget, formerly Geturl) is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get. It supports downloading via HTTP, HTTPS, and FTP protocols.

Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the boom of popularity of the Web, causing its wide use among Unix users and distribution with most major GNU/Linux based distributions. Written in portable C, Wget can be easily installed on any Unix-like system.

Source: Wikipedia

Man page

290 questions
1 vote · 1 answer

SSL error curl/wget unknown protocol/wrong version number

I've been trying to use locally hosted https URLs in command-line/cron jobs and I get these errors; the curl output is similar. The same commands, when used on other servers, work perfectly. Default [root@tejon ~]# wget -O /dev/null…
Fábio Carneiro · 59
1 vote · 1 answer

Why would an HTTP(S) request not be logged in IIS?

I'm troubleshooting a batch file that uses Wget to send requests to a website running in IIS 7. The batch file runs Wget twice, the first time to log in to the site via a POST, the second to run some maintenance code in the site via a GET. The server…
Kenny Evitt · 209
1 vote · 2 answers

Download and save a file with proof that it was downloaded from a specific server/url on a specific date?

Assuming the file was downloaded over HTTP, is this possible? I'm guessing there could be some way to do it using a hash and a timestamp; however, I'm not sure how you could prove nothing was tampered with, including the downloaded file itself. EDIT: As…
ajw0100 · 111
1 vote · 3 answers

error while executing bash script -> command not found

I made this script, and get the error IA-exporto.sh: 13: wget: not found. I tried changing the " to ` and mixed it all together and rearranged, but it just won't work. #!/bin/bash UNAME="maximilian" PWD="password" DATE=`date…
Harrys Kavan · 402
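For the question above: `wget: not found` usually means wget is not installed, or not on the PATH of the shell running the script (cron's PATH is minimal), rather than a quoting problem. A minimal, hypothetical rework — the URL and flags are placeholders, not taken from the question:

```shell
#!/bin/bash
# Absolute path so the script also works under cron's minimal PATH.
WGET=/usr/bin/wget

UNAME="maximilian"
PASS="password"          # renamed from PWD: the shell sets PWD itself
DATE=$(date +%F)         # $(...) is easier to read than backticks

# Hypothetical export URL; credentials/flags are purely illustrative.
"$WGET" --user="$UNAME" --password="$PASS" \
        -O "export-${DATE}.xml" "https://example.com/export"
```

Renaming `PWD` matters: it is a variable the shell maintains for the current directory, so overwriting it invites confusing breakage later in the script.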
1 vote · 3 answers

Problems with wget AWS ubuntu apache2

I am trying to use wget to download from my AWS Ubuntu server with apache2. I have tried several different options but they all result in either a file by the directory name, or an index.html file. There are 3 pictures and an ogg format video in the…
mrhobbeys · 93
1 vote · 3 answers

cron job running repeatedly, why?

Why is this cron job executing repeatedly over time and what can I do to stop it? I have a cron job that is supposed to run at 4 each morning. It hits a php script that executes some daily data analysis and under normal conditions runs once (taking…
Lothar_Grimpsenbacher · 1,677
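One frequent cause of the behaviour above is a `*` left in the minute field: `* 4 * * *` fires every minute from 4:00 to 4:59. A crontab sketch of the once-a-day intent (the URL is a placeholder, not the asker's):

```shell
# m h dom mon dow  command
# Wrong: runs every minute during the 4 a.m. hour
# * 4 * * *  /usr/bin/wget -q -O /dev/null http://example.com/daily.php
# Right: runs once, at exactly 04:00
0 4 * * *  /usr/bin/wget -q -O /dev/null http://example.com/daily.php
```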
1 vote · 1 answer

SElinux stopping LVS from working with https

LVS/piranha is set up and I'm trying to get it to balance https instead of http. I set up https testing with wget - idea from this link. It works when I do it at the command prompt. With SELinux enforcing, the wget fails to run due to the lack of access to…
J Hoskins · 11
1 vote · 3 answers

Is setting wget's permissions to 755, so users other than root can execute it, a big security risk?

I read recently in blogs that by default wget on Linux is 750, so only root can execute it. I would like to allow users to run wget by changing it to 755, but I read around the web that it is a big security risk.
giorgio79 · 1,837
1 vote · 2 answers

Changed DNS, no effect locally

I know it can take up to a few days for DNS changes to take effect, but this has me baffled; maybe someone can offer a plausible explanation. $ wget http://***OLDIP***/ -O oldserver.html --2012-07-25 16:31:19-- http://***OLDIP***/ Connecting to…
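A stale local resolver cache is the usual explanation for this; comparing the local resolver's answer against a public resolver and `/etc/hosts` narrows it down. A sketch, with `example.com` standing in for the real domain:

```shell
dig +short example.com            # local resolver's answer (may be cached)
dig +short example.com @8.8.8.8   # ask a public resolver directly
getent hosts example.com          # also reflects /etc/hosts overrides
```

If the two `dig` answers differ, the change simply has not propagated to the local resolver yet; if `getent` disagrees with both, look for an old `/etc/hosts` entry.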
1 vote · 1 answer

Serve mirrored (static) web-page with original headers

I have a dynamic webpage which I want to create a "frozen" copy of. Typically I would do something like wget -m http://example.com, and then put the files in the document root of the web-server. This site however has some dynamic content, including…
aioobe · 371
1 vote · 1 answer

iptables redirect tcp to checkip.dyndns.org from localhost to 127.0.0.1:8118

I've tried several different combinations of rules and nothing seems to be working. I know that you can't use the PREROUTING chain for a request coming from localhost, so I used the OUTPUT chain, and that just returns errors when using wget. I have…
shawn · 13
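As the question notes, locally generated traffic bypasses PREROUTING; it traverses the OUTPUT chain of the `nat` table instead, where a REDIRECT target can steer it to the local proxy. A hypothetical rule (run as root; hostname and port taken from the question, everything else assumed):

```shell
# Redirect locally generated HTTP requests to checkip.dyndns.org into a
# proxy listening on 127.0.0.1:8118.
iptables -t nat -A OUTPUT -p tcp -d checkip.dyndns.org --dport 80 \
         -j REDIRECT --to-ports 8118
```

If the proxy itself then fetches the page, its own traffic must be excluded (for example with an `-m owner` match) or the rule creates a redirect loop.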
1 vote · 2 answers

Using wget and Awk to count similar expressions

I am trying to create a script that uses wget to download a data set and then awk to sort through the file and tell you the most common filter used, which is in column $14. So far I have the wget part working, as seen below: wget -O-…
kevin jack · 11
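Counting the most frequent value in column `$14` is a one-pass awk job. A sketch with inline sample data standing in for the `wget -O-` output (field values are made up for illustration):

```shell
# Tally column 14 and print the most common value with its count.
printf '%s\n' \
  'a b c d e f g h i j k l m GET' \
  'a b c d e f g h i j k l m POST' \
  'a b c d e f g h i j k l m GET' |
awk '{ n[$14]++ }
     END { for (v in n) if (n[v] > max) { max = n[v]; top = v }
           print top, max }'
# prints: GET 2
```

In the real pipeline the `printf` would be replaced by `wget -O- <url>`.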
1 vote · 2 answers

406 Not Acceptable error when running wget

I have this command in crontab: wget --quiet --delete-after http://boms.ro/admincp/cron/s/9abf0f42c1e4f55fdb87d8237cdde And when I run it with the --debug argument I get the following response: Caching boms.ro => 188.240.2.30 Created socket…
sica07 · 105
1 vote · 2 answers

How to untar a single file from an archive with randomly generated directory names?

I'm trying to write a script that will download a tarball from GitHub and extract a single file from it. However, the top-level directory inside the tarball has some random characters in it, which I think change when the repo/tarball is updated,…
bgibson · 121
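GNU tar can match the file through the unknown top-level directory and strip that directory on extraction. A self-contained sketch that fakes the tarball locally (the directory name and file are stand-ins; with a real repo, the archive would come from wget instead):

```shell
cd "$(mktemp -d)"    # scratch directory

# Stand-in for the downloaded tarball: a top-level directory with an
# unpredictable name, as GitHub produces.
mkdir repo-abc123
echo hello > repo-abc123/README.md
tar czf repo.tar.gz repo-abc123

# GNU tar: match the file through the unknown directory, then drop it.
tar xzf repo.tar.gz --wildcards --strip-components=1 '*/README.md'
cat README.md        # prints: hello
```

The key is `--strip-components=1`, which removes the leading directory from each extracted path, so the random name never needs to be known.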
1 vote · 2 answers

Cron not running commands as root

I have a file called /scripts/checkInternet and it contains: #!/bin/bash WGET="/usr/bin/wget" rm /tmp/index.google $WGET -q --tries=10 --timeout=5 http://www.google.com -O /tmp/index.google &> /dev/null if [ ! -s /tmp/index.google…
user74078
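A sketch of a sturdier version of the script above: `rm -f` avoids an error when the file is missing, and an absolute wget path sidesteps cron's minimal PATH. Whether this fixes the asker's root/cron issue is not certain; paths and URL follow the excerpt:

```shell
#!/bin/bash
WGET=/usr/bin/wget
OUT=/tmp/index.google

rm -f "$OUT"                      # -f: no error if the file is absent
"$WGET" -q --tries=10 --timeout=5 -O "$OUT" http://www.google.com

if [ -s "$OUT" ]; then            # -s: file exists and is non-empty
    echo "internet up"
else
    echo "internet down"
fi
```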