
I have a block of code within a try/except block that catches exceptions (psycopg2 errors in one case, a CRC-check OSError in another). The exception is logged using the standard library logging module (sample code below).

GCP error reporting is logging the caught exception (lacking context).

How can we accomplish the following?

  • Make sure that my exception, with context, explicitly shows up in the Error Reporting console. - I actually know the answer here is to [explicitly use the error reporting client][1], as I've done this elsewhere, or to allow the exception to go uncaught (undesirable in this case, as it would change the execution flow).
  • Prevent GCP from separately logging these "Errors" - This is the part that has me stumped. I don't want to swallow the caught exception's stack trace. Does this have something to do with the way I've configured logging for the entire application? Note how GCP Logs Explorer shows 2 separate entries below.

Sample code that handles/logs an OSError:

import gzip
import logging
import shutil

LOGGER = logging.getLogger(__name__)

try:
    with open(out_path, 'wb') as fout:
        with gzip.open(in_path, 'rb') as fin:
            shutil.copyfileobj(fin, fout)
except Exception:
    LOGGER.exception("Fatal error decompressing %s to %s, skipping", in_path, out_path)

Corresponding error reporting entry:

OSError: CRC check failed 0xa3d2ba37 != 0xb3b0d715
at _read_eof (/usr/local/lib/python3.7/gzip.py:512)
at read (/usr/local/lib/python3.7/gzip.py:465)
at readinto (/usr/local/lib/python3.7/_compression.py:68)
at read (/usr/local/lib/python3.7/gzip.py:287)
at copyfileobj (/usr/local/lib/python3.7/shutil.py:79)
at gunzip (xxx:NN)

Logs Explorer showing two separate entries:

{
    "textPayload": "2021-12-09 20:49:08,918 - __main__ - ERROR - Fatal error decompressing 0001.bin.gz to 0001.bin, skipping\n",
    "insertId": "insertId1",
    "resource": {
        "type": "k8s_container",
        ...
    },
    "timestamp": "2021-12-09T20:49:08.918859648Z",
    "severity": "ERROR",
    "logName": "projects/myproject/logs/stderr",
    "receiveTimestamp": "2021-12-09T20:49:10.383920475Z"
}
{
    "textPayload": "Traceback (most recent call last):\n  File \"xxx.py\", line NN, in gunzip\n    shutil.copyfileobj(fin, fout)\n  File \"/usr/local/lib/python3.7/shutil.py\", line 79, in copyfileobj\n    buf = fsrc.read(length)\n  File \"/usr/local/lib/python3.7/gzip.py\", line 287, in read\n    return self._buffer.read(size)\n  File \"/usr/local/lib/python3.7/_compression.py\", line 68, in readinto\n    data = self.read(len(byte_view))\n  File \"/usr/local/lib/python3.7/gzip.py\", line 465, in read\n    self._read_eof()\n  File \"/usr/local/lib/python3.7/gzip.py\", line 512, in _read_eof\n    hex(self._crc)))\nOSError: CRC check failed 0xa3d2ba37 != 0xb3b0d715\n",
    "insertId": "insertId2",
    "resource": {
        "type": "k8s_container",
        ...
    },
    "timestamp": "2021-12-09T20:49:08.918931417Z",
    "severity": "ERROR",
    "logName": "projects/myproject/logs/stderr",
    "receiveTimestamp": "2021-12-09T20:49:10.383920475Z"
}

1 Answer

You can try using a sink with exclusion filters. Exclusion filters let you prevent matching log entries from being routed to the sink's destination, or from being ingested by Cloud Logging at all.

Click this link for the full guide on creating a sink.

  1. In the Cloud Console, go to the Logging > Log Router page.
  2. Select an existing Cloud project.
  3. Select Create sink.
  4. Enter a sink name and description.
  5. Select the sink destination.
  6. Choose the logs to include in the sink.
  7. Choose the logs to exclude from the sink:
    a. In the Exclusion filter name field, enter a name.
    b. In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.

Here's the link for filter examples.
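For example, an exclusion filter matching the entries shown in the question might look like the following (this is a sketch built from the question's log fields, not a filter from the guide; adjust the fields and match strings to your own entries):

```
resource.type="k8s_container"
logName="projects/myproject/logs/stderr"
severity="ERROR"
textPayload:"Fatal error decompressing"
```

Note that the `:` operator performs a substring match, while `=` requires an exact match.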
