
I am writing a new cloud function and am using the new Google Cloud Logging library as announced at https://cloud.google.com/blog/products/devops-sre/google-cloud-logging-python-client-library-v3-0-0-release.

I am also using functions-framework to debug my code locally before pushing it to GCP. The article "Setup and Invoke Cloud Functions using Python" has been particularly useful here.

The problem I have is that when using these two things together I cannot see logging output in my IDE; I can only see print statements. Here's some sample code that does NOT use google.cloud.logging; it DOES successfully cause logging output to appear in my terminal:

from flask import Request
from google.cloud import bigquery
from datetime import datetime
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def main(request) -> str:
    #
    # do stuff to set up a BigQuery job (defines export_script)
    #
    bq_client = bigquery.Client()

    job_config = bigquery.QueryJobConfig(labels={"key": "value"})
    nowstr = datetime.now().strftime("%Y%m%d%H%M%S%f")
    job_id = f"qwerty-{nowstr}"

    query_job = bq_client.query(
        query=export_script, job_config=job_config, job_id=job_id
    )
    print("Started job: {}".format(query_job.job_id))
    query_job.result()  # Waits for job to complete.
    logging.info(f"job_id={query_job.job_id}")
    logging.info(f"total_bytes_billed={query_job.total_bytes_billed}")

    return f"{query_job.job_id} {query_job.state} {query_job.error_result}"

Here is the output:

Started job: qwerty-20220306211233889260
INFO:root:job_id=qwerty-20220306211233889260
INFO:root:total_bytes_billed=31457280

As you can see, the call to print(...) has written to my terminal, and so has the call to logging.info(...).

If I change the code to also use google.cloud.logging:

from flask import Request
from google.cloud import bigquery
from datetime import datetime
import google.cloud.logging
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)
log_client = google.cloud.logging.Client()
log_client.setup_logging()

def main(request) -> str:
    #
    # do stuff to set up a BigQuery job (defines export_script)
    #
    bq_client = bigquery.Client()

    job_config = bigquery.QueryJobConfig(labels={"key": "value"})
    nowstr = datetime.now().strftime("%Y%m%d%H%M%S%f")
    job_id = f"qwerty-{nowstr}"

    query_job = bq_client.query(
        query=export_script, job_config=job_config, job_id=job_id
    )
    print("Started job: {}".format(query_job.job_id))
    query_job.result()  # Waits for job to complete.
    logging.info(f"job_id={query_job.job_id}")
    logging.info(f"total_bytes_billed={query_job.total_bytes_billed}")

    return f"{query_job.job_id} {query_job.state} {query_job.error_result}"

Then I don't see the logging output in the terminal:

Started job: provider-egress-hastings-20220306211718088936

Is there a way to redirect logging output to my terminal when running locally using functions-framework and when using google.cloud.logging but not affect logging when the function is running as an actual cloud function in GCP?

JaysonM
jamiet
    There may be a way but I'm unaware of one. Are you committed to using the Cloud Logging libraries? For writing logs, I'm unsure whether there's any value to them. Cloud Functions (and Google's other compute services) will capture stdout and stderr to Cloud Logging logs too. See [Writing runtime logs](https://cloud.google.com/functions/docs/monitoring/logging#functions-log-helloworld-python). One approach that you may want to consider is using [structured logs](https://cloud.google.com/functions/docs/monitoring/logging#functions-log-structured-python) – DazWilkin Mar 07 '22 at 02:04
  • I'm unsure too. We currently run a lot of Cloud Functions and we do exactly what you say: write to stdout and simply have that appear in the logs. And we do write structured logs as per the link you provided (and build logs-based metrics from them). Previously we discounted Google's logging library, mainly for performance reasons; however, we're evaluating this latest release to see if there are any compelling reasons to use it. – jamiet Mar 07 '22 at 08:52
  • I think (!?) unless you need them, you likely shouldn't use them. They add additional complexity (e.g. your question), more maintenance etc. I think (!?) the benefit is in being able to do something that you couldn't otherwise do (e.g. more exquisite output control, reading logs etc.). – DazWilkin Mar 07 '22 at 18:09
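
For what it's worth, the structured-logging route mentioned in the comments needs no client library at all. A minimal sketch (log_structured is a made-up helper name; the "severity" and "message" keys are the fields Cloud Logging recognizes in JSON lines written to stdout):

```python
import json
import sys


def log_structured(message: str, severity: str = "INFO", **fields) -> None:
    """Print a one-line JSON log entry to stdout.

    On Cloud Functions, Cloud Logging parses such a line as a structured
    entry: "severity" sets the log level and any extra fields end up in
    jsonPayload. Locally it is just readable JSON on the terminal.
    """
    entry = {"severity": severity, "message": message, **fields}
    print(json.dumps(entry), file=sys.stdout)


log_structured("query finished",
               job_id="qwerty-20220306211233889260",
               total_bytes_billed=31457280)
```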

1 Answer


I'm using an environment variable during local testing to decide whether to attach a console handler:

import logging
import os

import google.cloud.logging

log_client = google.cloud.logging.Client()
log_client.setup_logging(log_level=logging.DEBUG)
logger = logging.getLogger()
if os.getenv("LOCAL_LOGGING", "False") == "True":
    # Output logs to the console - otherwise logs are only visible when running in GCP
    logger.addHandler(logging.StreamHandler())
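
A fuller sketch of this idea (LOCAL_LOGGING and configure_logging are names made up for illustration): defer creating the Cloud Logging client until we know we're deployed, so a local run under functions-framework needs no GCP credentials at all:

```python
import logging
import os


def configure_logging() -> logging.Logger:
    """Console handler locally, Cloud Logging when deployed.

    LOCAL_LOGGING is an assumed environment variable: set it to "True"
    when running under functions-framework on your own machine.
    """
    logger = logging.getLogger()
    logger.setLevel(logging.INFO)
    if os.getenv("LOCAL_LOGGING", "False") == "True":
        # Local run: send log records straight to the terminal.
        logger.addHandler(logging.StreamHandler())
    else:
        # Deployed run: route stdlib logging to Cloud Logging.
        # Imported lazily so local runs need no GCP credentials.
        import google.cloud.logging

        log_client = google.cloud.logging.Client()
        log_client.setup_logging(log_level=logging.INFO)
    return logger
```

Call configure_logging() once at module import time, before main(request) runs; deployed behavior is unchanged because the environment variable is simply absent in GCP.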
hincha
  • As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Blue Robin Mar 08 '23 at 19:26