5

I have Grafana Loki running in my cluster. I can see my logs, but the cluster is no longer in use and I would like to delete it. Before I do, I still have some logs I would like to extract from Loki and store locally on my system, or in an Azure bucket.

Is there a way to extract these logs and save them locally or to an Azure bucket? I set up Loki and Prometheus with the Loki Helm chart. Any help is appreciated.

Dominik
King

2 Answers

11

You could use logcli to connect to Loki. See this documentation.

Example command:

kubectl port-forward <my-loki-pod> 3100
logcli query '{job="foo"}' --limit=5000 --since=48h -o raw > mylog.txt
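If you need more than a few thousand lines, logcli can also page through results (when --limit is larger than --batch) or export fixed time windows with --from/--to. A rough sketch, assuming the port-forward above is still running, a hypothetical {job="foo"} selector, and example dates; check the flags against your logcli version:

# Fetch up to 1,000,000 lines in batches of 5000 (each batch is still subject to Loki's per-query server limit)
logcli query '{job="foo"}' --limit=1000000 --batch=5000 --since=168h -o raw > mylog.txt

# Or export one day at a time so each query stays small
logcli query '{job="foo"}' --from="2022-05-01T00:00:00Z" --to="2022-05-02T00:00:00Z" --forward -o raw > mylog-2022-05-01.txt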
User12547645
0

Grafana Loki limits the number of log lines it can return in a single query response. By default this limit is set to 5000. It applies both to the HTTP query API and to logcli, and it may prevent you from exporting all the logs from Grafana Loki if it contains billions of log lines :(
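For reference, the same limit shows up when calling Loki's HTTP range-query endpoint directly. A minimal sketch, assuming Loki is reachable on localhost:3100 and a hypothetical {job="foo"} selector with example timestamps:

# /loki/api/v1/query_range returns at most `limit` entries per call (capped server-side, 5000 by default)
curl -G -s 'http://localhost:3100/loki/api/v1/query_range' \
  --data-urlencode 'query={job="foo"}' \
  --data-urlencode 'limit=5000' \
  --data-urlencode 'start=2022-05-01T00:00:00Z' \
  --data-urlencode 'end=2022-05-02T00:00:00Z' > chunk.json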

As an alternative, you can use a database for logs that I work on, which allows querying all the logs in a streaming manner with a simple * query, according to these docs.

valyala