
My log files have one json object per line. I use [json][1] to get a human readable output via

cat mylog.log | json -a field1 field2

Now I would like to have

tail -F mylog.log | json -a field1 field2

for continuous output. But this doesn't seem to work; the shell simply hangs. If I use &| to avoid buffering issues, the output is the same as if I had only run cat.
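
The pipe itself doesn't seem to be the issue; something like

tail -F mylog.log | cat

shows new lines immediately, so the buffering appears to happen inside json rather than in the shell.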

mylog.log looks like this:

{"field1": entry1a, "field2": entry2a, "field3": entry3a}
{"field1": entry1b, "field2": entry2b, "field3": entry3b}

Any suggestions?

[1] https://github.com/trentm/json

osdf

1 Answer


It looks like json first loads the whole of stdin into a buffer and only then processes the data, but you should still be able to achieve stream processing by calling it on each line as it is appended to the log file, something like this:

tail -F mylog.log | while read line; do echo "$line" | json -a field1 field2; done
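
For what it's worth, a slightly more defensive variant of the same idea (a sketch assuming bash or another POSIX-ish shell, with the same hypothetical field names) uses IFS= and read -r so backslashes and leading whitespace in the JSON survive the shell, and printf instead of echo to avoid escape interpretation in some shells:

# follow the log and run json on each appended line individually
tail -F mylog.log | while IFS= read -r line; do
    # printf '%s\n' passes the line through unmodified
    printf '%s\n' "$line" | json -a field1 field2
done

Note that this spawns one json process per line, which is fine for watching a log by eye but adds overhead on a very busy log.
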
Jakub Roztocil
    I had some JSON with escaped backslashes - class:"path\\to\\class". To preserve these, I had to change the read statement to read -r line. – Dan Straw Jan 28 '15 at 14:02