We have a streaming solution that takes messages from a Pub/Sub topic and uses Dataflow to stream each message into a BigQuery table. This is a good fit for BigQuery.
We would also like to take a subset of those messages and make them available in Bigtable. My thinking is that we could read each desired message from the Pub/Sub topic with a Cloud Function and then call the Bigtable API to insert a single row into Bigtable. We have hundreds of such messages per hour.
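Concretely, something like this minimal sketch is what I have in mind (Python, a Pub/Sub-triggered Cloud Function using the google-cloud-bigtable client; the project, instance, table, column family, and row key below are just placeholders):

```python
import base64
from google.cloud import bigtable

# Placeholder IDs -- substitute real project/instance/table names.
client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("my-table")

def pubsub_to_bigtable(event, context):
    """Pub/Sub-triggered Cloud Function: write one Bigtable row per message."""
    payload = base64.b64decode(event["data"])          # message body, base64-encoded by Pub/Sub
    row = table.direct_row(context.event_id.encode())  # row key choice here is just an example
    row.set_cell("cf1", b"message", payload)           # assumes a column family named "cf1"
    row.commit()                                       # one Bigtable write per message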
My question is: is it appropriate to make many calls to the Bigtable API, each inserting a single row? Or am I thinking about this all wrong?