
I want to be able to retrieve the contents of a log file stored on a server in the form of:

http://[SOME SORT OF ADDRESS]/file.txt

and print it to a shell which can refresh using something like `watch`, to keep real-time track of this log from a shell rather than having to use a browser. Is there any neat and simple way of doing this (possibly later wrapped up in a Python script)?

user1228368

4 Answers


This will fetch the requested URL every 0.1 seconds and display it on the console:

watch -n 0.1  wget -qO- http://google.com
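If you later want to wrap this in Python, as the question mentions, a minimal sketch of the same poll-and-print loop could look like the following (the function names and the injectable `fetcher` parameter are illustrative, not part of any standard recipe):

```python
import time
import urllib.request

def fetch(url):
    """Fetch a URL and return its body as text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def tail_remote(url, interval=1.0, fetcher=fetch, iterations=None):
    """Repeatedly fetch `url` and print its contents, like `watch`.

    `iterations` is only there so the loop can be bounded in tests;
    leave it as None to poll forever.
    """
    n = 0
    while iterations is None or n < iterations:
        print(fetcher(url))
        time.sleep(interval)
        n += 1
```

Running `tail_remote("http://[SOME SORT OF ADDRESS]/file.txt", interval=0.1)` would then mirror the `watch`/`wget` one-liner above.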
Marwan Alsabbagh

Yes, `tail -f` works on Linux, but only on local files. If the file is remote you will have to fetch it repeatedly to keep it updated (I think).
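If re-fetching the whole file on every poll is too heavy, one option is an HTTP `Range` request that asks only for the bytes past what you already have — a rough sketch, assuming the server honors `Range` headers (the function name and error handling here are illustrative, not from this answer):

```python
import urllib.request
import urllib.error

def fetch_from(url, offset):
    """Fetch bytes from `offset` onward via an HTTP Range request.

    Returns b"" when there is nothing new past `offset`. Assumes the
    server supports Range; a server that ignores it will simply send
    the whole file again.
    """
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-" % offset})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.read()
    except urllib.error.HTTPError as e:
        if e.code == 416:  # Requested Range Not Satisfiable: no new data
            return b""
        raise
```

On each poll you would pass in the total number of bytes seen so far and append whatever comes back, which behaves much closer to `tail -f` than a full re-download.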

noel

Maybe you could try something like this:

watch "wget -N http://[SOME SORT OF ADDRESS]/file.txt &> /dev/null; cat file.txt"

Hans Then
  • If you repeatedly `wget` the same file, each output will be saved under a new name (`file.txt.?`) so in your example you'll only ever see the first version. Have `wget` output to stdout instead. – Shawn Chin Sep 28 '12 at 10:43
while sleep 60; do
    curl address
done

This will print the output once a minute.

Adjust to your liking.
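If plain re-printing is too noisy, a small helper can show only the lines added since the previous poll, making the loop behave more like `tail -f` (this helper is a sketch, not from any of the answers above):

```python
def new_lines(previous, current):
    """Return only the lines of `current` appended after `previous`.

    If the file was truncated or rotated (the new content no longer
    starts with the old content), fall back to returning everything.
    """
    if current.startswith(previous):
        return current[len(previous):].splitlines()
    return current.splitlines()
```

Each iteration of the loop would fetch the file, print `new_lines(last_seen, body)`, and then remember `body` as `last_seen` for the next round.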

minikomi