
I want to read BigQuery log entries to do some analysis. But I can't seem to get the protoPayload.value decoded. I've tried messing around with the google-proto-files and protocol-buffers packages, but I think I'm missing something really obvious here...

const Logging = require('@google-cloud/logging');
const protobuf = require('protocol-buffers');
const protoFiles = require('google-proto-files');


const protoPath = './node_modules/google-proto-files/google/cloud/audit/audit_log.proto';
const root = protoFiles.loadSync(protoPath);
const AuditLog = root.lookup('google.cloud.audit.AuditLog');

const client = new Logging.v2.LoggingServiceV2Client({ projectId });
client.listLogEntriesStream({resourceNames, filter, pageSize})
    .on('data', entry => {
        console.log(entry); // entry is a LogEntry; its protoPayload is an Any wrapping an AuditLog
        console.log(AuditLog.decode(entry.protoPayload.value.buffer));
        process.exit(1)
    })
    .on('error', e => console.error(e))
    .on('end', () => console.info('END RECEIVED'));

I do receive messages with protoPayloads, but the error I receive when attempting to decode the message is this:

Error: no such Type or Enum 'google.rpc.Status' in Type .google.cloud.audit.AuditLog

The actual question: What is the proper way to decode the protoPayload field in a LogEntry?

Thanks!

Melle

1 Answer


Since entry.protoPayload.value is a serialized proto (an AuditLog message), you should be able to decode it with the deserializeBinary() method documented at https://developers.google.com/protocol-buffers/docs/reference/javascript-generated#message. Note that the 'protocol-buffers' npm package doesn't appear to be from Google; the official proto compiler (protoc) already generates the deserialization code for you.
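
For example (a minimal, untested sketch): assuming you've generated CommonJS message classes from audit_log.proto and its imports with protoc --js_out=import_style=commonjs,binary, and reusing the client from your snippet, decoding could look roughly like this (the ./gen output path and module name are just placeholders):

// Hypothetical output of:
//   protoc --js_out=import_style=commonjs,binary:gen google/cloud/audit/audit_log.proto ...
const auditLogPb = require('./gen/google/cloud/audit/audit_log_pb.js');

client.listLogEntriesStream({resourceNames, filter, pageSize})
    .on('data', entry => {
        // entry.protoPayload is a google.protobuf.Any; .value holds the serialized AuditLog bytes
        const auditLog = auditLogPb.AuditLog.deserializeBinary(entry.protoPayload.value);
        console.log(auditLog.toObject());
    });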

I wouldn't expect you'd need to, but you could also try loading the "google/rpc/status.proto" definition explicitly, since that's the type the error says it can't resolve.
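
If you'd rather stick with runtime loading, here's one way to do that (sketch only, using protobufjs directly, which google-proto-files' loaders build on). It assumes the protos are available under node_modules/google-proto-files, as in your snippet, and that entry.protoPayload.value is a Buffer; the idea is to point import resolution at that directory so google/rpc/status.proto and the other dependencies get picked up:

const path = require('path');
const protobuf = require('protobufjs');

// Directory containing the "google/" proto tree shipped with google-proto-files
const protoDir = path.join(__dirname, 'node_modules', 'google-proto-files');

const root = new protobuf.Root();
// Resolve imports such as "google/rpc/status.proto" against that tree,
// so AuditLog's dependencies (Status, Struct, Any, ...) can be found
root.resolvePath = (origin, target) => path.join(protoDir, target);
root.loadSync('google/cloud/audit/audit_log.proto');
const AuditLog = root.lookupType('google.cloud.audit.AuditLog');

// inside the 'data' handler from the question:
// entry.protoPayload is a google.protobuf.Any; decode its raw bytes
const auditLog = AuditLog.decode(entry.protoPayload.value);
console.log(auditLog);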

Kirk Kelsey