We have an application running on Google App Engine that uses Datastore as its persistence back-end. Currently the application has mostly 'OLTP' features and some rudimentary reporting. While implementing reports we found that processing large amounts of data (millions of objects) is very difficult using Datastore and GQL. To enhance our application with proper reports and Business Intelligence features, we think it is better to set up an ETL process to move data from Datastore to BigQuery.
Initially we thought of implementing the ETL process as an App Engine cron job, but it looks like Cloud Dataflow can also be used for this. We have the following requirements for setting up the process:
- Be able to push all the existing data to BigQuery using the non-streaming (batch load) API of BigQuery (a rough sketch of what we imagine for this follows this list).
- Once the above is done, push any new or updated data to BigQuery via the streaming API whenever it is created/updated in Datastore.
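For the first requirement, this is roughly the batch pipeline we have in mind. It is only a sketch using the Apache Beam / Dataflow Java SDK; the project id, the Kind name `Order`, the table `analytics.orders` and the key-only row mapping are made-up placeholders, not our real schema:

```java
import com.google.api.services.bigquery.model.TableRow;
import com.google.datastore.v1.Entity;
import com.google.datastore.v1.Query;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class DatastoreToBigQueryBatch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read every entity of one Kind ("Order" is a placeholder for our real Kinds).
    Query.Builder query = Query.newBuilder();
    query.addKindBuilder().setName("Order");

    p.apply("ReadFromDatastore",
            DatastoreIO.v1().read()
                .withProjectId("my-gcp-project")      // placeholder project id
                .withQuery(query.build()))
        // Convert each Datastore Entity into a BigQuery TableRow.
        .apply("EntityToTableRow", ParDo.of(new DoFn<Entity, TableRow>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            // Very simplified mapping; real code would walk getPropertiesMap().
            Entity entity = c.element();
            c.output(new TableRow().set("key", entity.getKey().getPath(0).getName()));
          }
        }))
        // Non-streaming path: BigQuery load jobs rather than streaming inserts.
        .apply("WriteToBigQuery",
            BigQueryIO.writeTableRows()
                .to("my-gcp-project:analytics.orders") // placeholder table
                .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE));

    p.run();
  }
}
```

We hope something like this would parallelize the Datastore read enough to cope with our larger Kinds, but we are not sure.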
My questions are:
- Is Cloud Dataflow the right candidate for implementing this pipeline?
- Will we be able to push existing data? Some of the Kinds have millions of objects.
- What is the right approach to implement it? We are considering two approaches. The first approach is to go through Pub/Sub: for the existing data, create a cron job that pushes all data to Pub/Sub, and for any new updates, push the data to Pub/Sub at the same time it is written to Datastore; a Dataflow pipeline then picks it up from Pub/Sub and pushes it to BigQuery (a rough streaming sketch follows the questions below). The second approach is to create a batch pipeline in Dataflow that queries Datastore and pushes any new data to BigQuery.
The question is: are these two approaches doable? Which one is better cost-wise? Is there any other way that is better than the above two?
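For the Pub/Sub approach, the streaming side we have in mind would look roughly like this. Again only a sketch: it assumes our App Engine code publishes each changed entity as a JSON string to a topic named `datastore-changes`, and the project, topic and table names are placeholders:

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.StreamingOptions;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class PubSubToBigQueryStreaming {
  public static void main(String[] args) {
    StreamingOptions options =
        PipelineOptionsFactory.fromArgs(args).as(StreamingOptions.class);
    options.setStreaming(true);
    Pipeline p = Pipeline.create(options);

    p.apply("ReadFromPubSub",
            PubsubIO.readStrings()
                .fromTopic("projects/my-gcp-project/topics/datastore-changes"))
        // Turn each published message into a BigQuery row.
        .apply("MessageToTableRow", ParDo.of(new DoFn<String, TableRow>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            // Placeholder: real code would parse the JSON payload into columns.
            c.output(new TableRow().set("payload", c.element()));
          }
        }))
        // Streaming inserts into BigQuery for new/updated entities.
        .apply("StreamToBigQuery",
            BigQueryIO.writeTableRows()
                .to("my-gcp-project:analytics.orders")
                .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    p.run();
  }
}
```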
Thank you,
rizTaak