
Context:

The data warehouse is built on SAP BW on HANA. It has been operational for quite some time and holds over 50 TB of data. As a strategic decision, the data has to be migrated to BigQuery on GCP (a one-time activity).

Question:

What is the most efficient, non-intrusive way to export such a huge volume of data and adapt it to the BigQuery format? Uploading it is not an issue, as it can be uploaded over a dedicated Cloud Interconnect.

Enrique Zetina
    Your question lacks details and might be too broad. For example, you state "adopt it to BigQuery format". What format is it now? What does "most optimum, non-intrusive way" mean to your company? Show the research that you have already completed and what problem you are having. Today, 100 Mbyte speeds are easy to achieve. Moving/translating/importing 50 TB of data still takes hundreds of machine-hours to perform. That means "non-intrusive" could be an issue. – John Hanley Mar 23 '21 at 19:07

1 Answer


As per Google best practices, you should use a Transfer Appliance to move data of this size.

But if you want to use the dedicated Cloud Interconnect instead, you can try the following approach to transfer the data:

  1. Export the data from the HANA DB to files.
  2. Transfer the files to Cloud Storage through your preferred channel.
  3. Once the data is in GCS, load it into BigQuery using one of several options (a load job from GCS, a Dataflow job, or a script that uploads the data to BQ); a minimal sketch follows this list.
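Here is a minimal sketch of steps 2 and 3 using the google-cloud-storage and google-cloud-bigquery Python client libraries. It assumes the HANA export was written to CSV files with a header row; the bucket, file, project, dataset, and table names are placeholders, not anything from your environment:

```python
from google.cloud import bigquery, storage

# Placeholder names -- replace with your own bucket, export file, and table.
BUCKET = "my-migration-bucket"
EXPORT_FILE = "sales_export.csv"        # one file produced by the HANA export
TABLE_ID = "my-project.sap_dw.sales"    # target BigQuery table

# Step 2: copy the exported file to Cloud Storage.
storage_client = storage.Client()
bucket = storage_client.bucket(BUCKET)
bucket.blob(EXPORT_FILE).upload_from_filename(EXPORT_FILE)

# Step 3: load the file from GCS into BigQuery with a load job.
bq_client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row written by the export
    autodetect=True,       # let BigQuery infer the schema for this sketch
)
load_job = bq_client.load_table_from_uri(
    f"gs://{BUCKET}/{EXPORT_FILE}", TABLE_ID, job_config=job_config
)
load_job.result()  # wait for the load job to finish
print(f"Loaded {bq_client.get_table(TABLE_ID).num_rows} rows into {TABLE_ID}")
```

For 50 TB you would presumably run many such exports and loads in parallel (and define the schema explicitly rather than relying on autodetect), but the per-file flow is the same.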
Vibhor Gupta