I am just getting started with both GCP and Google Cloud Data Fusion. I have just viewed the intro video, and I see that pipelines can be exported. I was wondering how we might promote a pipeline from, say, a Dev to a Prod environment. My guess is that after some testing, the exported file is copied to the Prod branch on Git, from where we need to invoke the APIs to deploy it? Also, what about connection details: how do we avoid hard-coding the source/destination configurations and credentials?
Here's an interesting thread about how to industrialize the deployment > https://stackoverflow.com/questions/58839608/import-export-datafusion-pipelines/58922941#58922941 – AGI_rev Jun 29 '21 at 09:47
2 Answers
Regarding the first question: if you have separate environments for development and production, you can export your pipeline from one and import it into the other.
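As a minimal sketch of automating that import step, the deployment could call the CDAP REST API that Data Fusion exposes (the `PUT /v3/namespaces/{ns}/apps/{name}` path is the standard CDAP application-deploy endpoint; the instance endpoint and access token here are assumptions — in Data Fusion you would obtain them from the instance's `apiEndpoint` field and `gcloud auth print-access-token` respectively):

```python
import json
import urllib.request


def deploy_url(endpoint: str, namespace: str, pipeline_name: str) -> str:
    # CDAP REST path for creating/updating an application (pipeline).
    return f"{endpoint}/v3/namespaces/{namespace}/apps/{pipeline_name}"


def deploy_pipeline(endpoint: str, token: str, pipeline_name: str,
                    pipeline_json: dict, namespace: str = "default") -> int:
    """Push an exported pipeline JSON to a Data Fusion (CDAP) instance.

    `endpoint` and `token` are placeholders for the instance API endpoint
    and an OAuth2 access token.
    """
    req = urllib.request.Request(
        deploy_url(endpoint, namespace, pipeline_name),
        data=json.dumps(pipeline_json).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In a CI/CD setup, a job on the Prod branch could read the exported JSON from Git and call `deploy_pipeline` against the production instance.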
I didn't understand the second question very well. The official Data Fusion plugins provide a standard way to supply your credentials. If you need a better answer, please explain your doubt in a little more detail.
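One way to avoid hard-coding credentials is to use macros in plugin properties, so values are resolved at runtime rather than baked into the exported JSON; CDAP's `${secure(key)}` macro reads the value from the instance's Secure Store. A hypothetical plugin-properties fragment (the `db_user` and `db_password` names are placeholders):

```json
{
  "name": "Database",
  "properties": {
    "user": "${db_user}",
    "password": "${secure(db_password)}"
  }
}
```

With this approach, the same exported pipeline can be promoted unchanged, with each environment providing its own runtime-argument values and secure keys.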

rmesteves