I have about 45 GB of data in a BigQuery table that I want to transfer to Elasticsearch. Currently, I am fetching rows from my BigQuery table as JSON and then indexing them into Elasticsearch. The whole process took about two weeks to complete. Is there a better, more efficient way to do this?
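For reference, a minimal sketch of a faster pattern than indexing documents one request at a time, assuming the google-cloud-bigquery and elasticsearch Python clients; the project, dataset, table, and index names are hypothetical placeholders, not the asker's actual setup. A generator streams rows out of BigQuery and feeds them to Elasticsearch's bulk helper so documents are sent in batches:

```python
# A sketch under assumptions, not the asker's exact code.
# Placeholders: my_project.my_dataset.my_table, my_index, localhost URL.
from google.cloud import bigquery
from elasticsearch import Elasticsearch, helpers

bq = bigquery.Client()
es = Elasticsearch("http://localhost:9200")

def generate_actions():
    # list_rows pages through the table lazily, so the full 45 GB is
    # never held in memory at once.
    for row in bq.list_rows("my_project.my_dataset.my_table"):
        yield {"_index": "my_index", "_source": dict(row.items())}

# streaming_bulk groups actions into bulk requests (1,000 documents
# each here) instead of one HTTP round trip per document.
for ok, item in helpers.streaming_bulk(es, generate_actions(), chunk_size=1000):
    if not ok:
        print("Failed to index:", item)
```

The biggest win is usually replacing per-document index calls with bulk requests; tuning chunk_size, or indexing from several workers in parallel (e.g. with helpers.parallel_bulk), can speed this up further.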
- Is the 45 GB a single file or a chunk of files? – Thangarajan Pannerselvam Jun 22 '20 at 09:46
- It is a chunk of files. – Achal Gambhir Jun 22 '20 at 11:27
- Try to use a generator. – Thangarajan Pannerselvam Jun 22 '20 at 11:54
- @ThangarajanPannerselvam Could you please elaborate on that? I am not very familiar with generators. – Achal Gambhir Jun 22 '20 at 13:36
- Hi! I would like to ask you to take a look at the following Stack Overflow thread: https://stackoverflow.com/questions/39252484/elastic-search-with-google-big-query where you can find different scenarios. Let me know if it's sufficient for your needs. – aga Jun 26 '20 at 07:16
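Beyond the generator suggestion above, another common scenario for a table of this size is to export it from BigQuery to Cloud Storage as newline-delimited JSON and then bulk-load the resulting files, which avoids paging rows through a client entirely. A hedged sketch, with the bucket and table names as placeholders:

```python
# A sketch under assumptions: export the table to GCS as
# newline-delimited JSON. Bucket/table names are hypothetical.
from google.cloud import bigquery

bq = bigquery.Client()
job_config = bigquery.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON
)
# The wildcard URI shards the export into multiple files, which
# BigQuery requires for exports larger than 1 GB.
extract_job = bq.extract_table(
    "my_project.my_dataset.my_table",
    "gs://my-bucket/export/rows-*.json",
    job_config=job_config,
)
extract_job.result()  # block until the export job finishes
```

The exported files can then be indexed into Elasticsearch with Logstash or any bulk-loading script, as discussed in the linked thread.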