
We have a streaming solution that takes messages from a Pub/Sub topic and uses Dataflow to stream each message into a BigQuery table. This is a very appropriate use case for BigQuery.

We would also like to take a subset of those messages and make them available in BigTable. My thinking is that we could read each desired message from the Pub/Sub topic using a Cloud Function and then call the BigTable API to insert a row into BigTable. We have hundreds of such messages per hour.

My question is, is it appropriate to make many calls to the BigTable API, each inserting a single row? Am I thinking about this all wrong?

jamiet
    OK, think I can probably answer my own question here, this code sample: https://cloud.google.com/bigtable/docs/writing-data#simple suggests writing a single row is appropriate. – jamiet Jun 22 '21 at 20:05

1 Answer


Based on the comment below the question, OP found their answer in the public GCP docs.

BigTable does indeed support single-row writes. The linked docs include samples for multiple programming languages: Go, PHP, Python, Java, Node.js, HBase, C#, C++, and Ruby.
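For reference, here is a minimal Python sketch of the pattern described in the question: a Cloud Function triggered by a Pub/Sub message that writes a single row to BigTable with the `google-cloud-bigtable` client. The project, instance, table, column-family, and message-field names (`my-project`, `my-instance`, `my-table`, `events`, `device_id`, `event_time`) are placeholders, not taken from the question:

```python
import base64
import json

# Assumed column family; adjust to your table's schema.
COLUMN_FAMILY = "events"


def row_key_for(message: dict) -> bytes:
    """Build a Bigtable row key from a message.

    Bigtable performs best when keys avoid hotspotting, e.g. an
    entity id first, then a timestamp (field names here are assumed).
    """
    return f"{message['device_id']}#{message['event_time']}".encode("utf-8")


def write_row(table, message: dict) -> None:
    """Write one message as a single Bigtable row.

    `table` is a google.cloud.bigtable Table; a single-row write via
    DirectRow.commit() is the pattern shown in the docs linked above.
    """
    row = table.direct_row(row_key_for(message))
    for field, value in message.items():
        row.set_cell(COLUMN_FAMILY, field, str(value).encode("utf-8"))
    row.commit()  # one API call per row


def handle_pubsub(event, context):
    """Cloud Functions background-function entry point for Pub/Sub."""
    # Imported lazily so the pure helpers above have no GCP dependency;
    # requires the google-cloud-bigtable package at deploy time.
    from google.cloud import bigtable

    message = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    client = bigtable.Client(project="my-project")
    table = client.instance("my-instance").table("my-table")
    write_row(table, message)
```

At hundreds of messages per hour, one `commit()` per message is well within Bigtable's limits; batching (e.g. `mutate_rows`) only becomes worth considering at much higher write rates.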

Kevin Quinzel