5

Is there a way to tail a resource such as http://someserver.com/logs/server.log ?

This is what I would do for a local file.

tail -F /var/logs/somefile.log

I would like something similar to that for a file accessed over HTTP.

Rythmic
  • No. Theoretically, you could hack something together using byte offsets and curl or wget, but it'd be horrible. What are you trying to achieve? Perhaps there's a better way. – SmallClanger Apr 25 '13 at 13:08
  • I have a logfile at a URL, which is my only way to access it. I want a nice way to view it in real time. – Rythmic Apr 25 '13 at 13:21
  • Go for log management tools like Logstash, Awstats, Webalyzer or Splunk. – Abhishek Anand Amralkar Apr 25 '13 at 13:35
  • I hope somebody already tested [url-tail](https://github.com/maksim07/url-tail) – Jin Kwon Jan 13 '15 at 06:47
  • Related: [Is there a way to perform a "tail -f" from an url?](https://stackoverflow.com/q/31293629/95735), [Tail a text file on a web server via HTTP](https://stackoverflow.com/q/6189549/95735), [tail -f equivalent for an URL](https://superuser.com/q/514066/664), [How to tail file from url without downloading the entire file?](https://superuser.com/q/1267867/664) – Piotr Dobrogost Mar 09 '22 at 17:44

5 Answers

3

I wrote a simple bash one-liner that fetches the URL content every 2 seconds, compares it with the local file output.txt, and appends the difference to that same file.

I wanted to stream AWS Amplify logs in my Jenkins pipeline.

while true; do comm -13 --output-delimiter="" <(cat output.txt) <(curl -s "$URL") >> output.txt; sleep 2; done

Don't forget to create the empty output.txt file first:

: > output.txt

View the stream:

tail -f output.txt

Better solution:

I found a better solution using wget here:

while true; do wget -ca -o /dev/null -O output.txt "$URL"; sleep 2; done

https://superuser.com/a/514078/603774
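
If you want the live view in one command, the polling loop and the tail can be combined into a small wrapper. This is only a sketch: the httptail.sh name and the cleanup trap are my additions, and it assumes the server supports HTTP range requests so wget -c can resume.

#!/bin/bash
# Sketch: poll the URL in the background with wget -c (resume) and tail the growing file.
# Usage (hypothetical script name): ./httptail.sh "http://someserver.com/logs/server.log"

URL="$1"
out=$(mktemp)

(while true; do
    wget -c -q -O "$out" "$URL"   # -c asks only for the bytes we don't have yet
    sleep 2
done) &
pid=$!
trap 'kill $pid; rm -f "$out"' EXIT

tail -f "$out"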

2

This will do it:

#!/bin/bash

# Buffer the download in a temporary file and remove it on exit.
file=$(mktemp)
trap 'rm $file' EXIT

# In the background, repeatedly ask the server for only the bytes we don't
# have yet, via an HTTP range request starting at the local file's size.
(while true; do
    # shellcheck disable=SC2094
    curl --fail -r "$(stat -c %s "$file")"- "$1" >> "$file"
done) &
pid=$!
trap 'kill $pid; rm $file' EXIT

# Follow the growing local copy.
tail -f "$file"

It's not very friendly to the web server. You could replace the true with sleep 1 to make it less resource-intensive.

Like tail -f, you need to ^C when you are done watching the output, even when the output is done.

Brian
  • You don't need to stat the file. From `man curl` – *-C Continue/Resume a previous file transfer at the given offset. Use "-C -" to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.* – Piotr Dobrogost Mar 09 '22 at 20:34
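
Following that comment, here is a variant of the script above that lets curl work out the resume offset itself. It is only a sketch: -C - needs curl to write the file via -o (not a shell redirection) so it can see the existing size, and the sleep 1 is my addition to be gentler on the server.

#!/bin/bash

file=$(mktemp)

(while true; do
    # -C - makes curl inspect "$file" and request only the remaining bytes
    curl --fail -C - -o "$file" "$1"
    sleep 1
done) &
pid=$!
trap 'kill $pid; rm -f "$file"' EXIT

tail -f "$file"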
2

I'm trying htail (https://github.com/vpelletier/htail) and it seems to do the job pretty well.
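
Assuming it is published on PyPI under the same name (I haven't checked), installing it and pointing it at the log URL should look roughly like this:

pip install htail    # assumption: the package name matches the repository
htail http://someserver.com/logs/server.log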

0

You might use the script from https://stackoverflow.com/a/1102346/401005 if you are able to run PHP scripts. You should add a flush(); after the echo.

When using curl --no-buffer http://the/url you should get suitable output.

krissi
  • -1 This just downloads the resource and quits. Tailing means keeping an eye on the growing resource and retrieving new content when it appears. – Piotr Dobrogost Mar 09 '22 at 17:51
-1

If you want a sentinel-type program like I needed (it only updates when you respond):

#!/bin/bash

while true; do
    echo "y to curl, n to quit"
    read -rsn1 input

    if [ "$input" = "y" ]; then
        curl "$1"
    elif [ "$input" = "n" ]; then
        break
    fi
done

Then run curlTail.sh http://wherever

You'll get

[18:02:44] $: ~ \> curlTail.sh http://wherever
y to curl, n to quit

When you hit y, it'll curl it, then prompt you again.

  • -1 This just downloads whole resource each time. Tailing means keeping an eye on the growing resource and retrieving **new** content when it appears. – Piotr Dobrogost Mar 09 '22 at 17:53