
The logs are written with the Stackdriver Logging API via the Python client library. They show up under the "Global" resource section in the logs viewer. A sink was then created with a Google Cloud Storage bucket [e.g. my-bucket] as the destination, and the logs are exported.
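
For reference, this is roughly how the logs are written (a minimal sketch; "my-log" is a placeholder log name):

    from google.cloud import logging

    client = logging.Client()
    logger = client.logger("my-log")

    # Writing without an explicit resource attaches the "global" resource,
    # which is why the entries show up under "Global" in the logs viewer.
    logger.log_text("Hello from the app", severity="INFO")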

Questions:

1) Where are these logs actually stored?

2) How do I make the logs resource-specific in code, rather than having them all go to the "Global" resource?

3) Why can't I see the exported logs in "my-bucket", even though the sink appears under the 'Exports' tab?

4) How do I archive/purge logs in GCS using Stackdriver?

1 Answer


1) New log entries that match your sink's filter will start being exported once the sink is created. Entries going to BigQuery or Cloud Pub/Sub are streamed to those destinations immediately; entries going to Cloud Storage are batched and written out approximately every hour.
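
If it helps, here is a sketch of creating such a sink from Python with the same client library; "my-sink" and the filter are assumptions, and a Cloud Storage destination must use the storage.googleapis.com/BUCKET form:

    from google.cloud import logging

    client = logging.Client()
    sink = client.sink(
        "my-sink",                                  # assumed sink name
        filter_='resource.type="global"',           # match the entries to export
        destination="storage.googleapis.com/my-bucket",
    )

    if not sink.exists():
        sink.create()

Note that the sink's writer identity also needs write access to the bucket; otherwise the sink exists but nothing gets exported.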

2) I'm not sure I understand your question; can you please rephrase it? Thanks.

3) Log entries going to Cloud Storage are batched and written out approximately every hour, so expect a delay of up to an hour before the first files appear in the bucket.
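
After that first hour you can check what has landed by listing the bucket; a quick sketch with the google-cloud-storage client:

    from google.cloud import storage

    client = storage.Client()

    # Exported entries land as JSON objects grouped by log name and date,
    # e.g. <log-name>/YYYY/MM/DD/..., so listing the bucket shows whether
    # the hourly batches have arrived.
    for blob in client.list_blobs("my-bucket"):
        print(blob.name)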

4) You can use log exclusions; there are two kinds:

  • Resource-type exclusions, which block all logs from specific resource types.
  • Exclusion filters, which are more flexible.

You may also write a script to purge old logs.
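
As one example of such a script, a bucket lifecycle rule can do the purging server-side; a sketch assuming the google-cloud-storage client and a 30-day retention:

    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("my-bucket")

    # Delete any object older than 30 days; GCS enforces the rule itself,
    # so this only has to run once to set the policy.
    bucket.add_lifecycle_delete_rule(age=30)
    bucket.patch()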

Pauloba