
I am trying to create an MLOps pipeline using Azure DevOps and Azure Databricks. From Azure DevOps, I submit a Databricks job to a cluster, which trains a machine learning model and saves it to the MLflow Model Registry with a custom flavour (using a PyFunc custom model).

Now, after the job finishes, I want to export this MLflow model with all its dependencies: the conda environment, the two model files (one .pkl and one .h5), and the Python class with load_context() and predict() defined, so that after exporting I can import it and call predict() as with any MLflow model.
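
For context, a custom PyFunc model of that shape might be logged like this (a minimal sketch; the class name, artifact keys, file names, and registered model name are illustrative assumptions, not taken from the question):

    import pickle

    import mlflow.pyfunc


    class CombinedModel(mlflow.pyfunc.PythonModel):
        """Hypothetical custom flavour wrapping one .pkl and one .h5 model."""

        def load_context(self, context):
            # context.artifacts maps the keys given at log time to local paths.
            with open(context.artifacts["sk_model"], "rb") as f:
                self.sk_model = pickle.load(f)
            from tensorflow import keras  # deferred so the import cost is paid at load time
            self.keras_model = keras.models.load_model(context.artifacts["keras_model"])

        def predict(self, context, model_input):
            # Illustrative only: chain the two models however the real pipeline does.
            features = self.sk_model.transform(model_input)
            return self.keras_model.predict(features)


    # Log both files, the wrapper class, and the conda environment as one MLflow model.
    mlflow.pyfunc.log_model(
        artifact_path="model",
        python_model=CombinedModel(),
        artifacts={"sk_model": "model.pkl", "keras_model": "model.h5"},
        conda_env="conda.yaml",
        registered_model_name="my-custom-model",
    )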

How do I export this entire MLflow model and save it as an Azure DevOps artifact to be used in the CD phase (where I will deploy it to an AKS cluster with a custom base image)?
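
One way to do this (a sketch, assuming the build agent can reach the Databricks tracking server and that the model is registered as "my-custom-model"): have a pipeline step download the registered model's folder with the MLflow client, then publish that folder as the artifact.

    import os

    from mlflow.tracking import MlflowClient

    # Assumes MLFLOW_TRACKING_URI / Databricks credentials point at the workspace.
    client = MlflowClient()

    # Latest version of the registered model; name and stage are assumptions.
    version = client.get_latest_versions("my-custom-model", stages=["None"])[0]

    # Download the full MLflow model folder (MLmodel, conda.yaml, .pkl, .h5, code).
    local_dir = "model_out"
    os.makedirs(local_dir, exist_ok=True)
    client.download_artifacts(version.run_id, "model", local_dir)

A PublishPipelineArtifact (or PublishBuildArtifacts) task pointing at model_out then exposes the folder to the CD stage, where mlflow.pyfunc.load_model() can restore it and predict() works as usual.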

Anirban Saha

2 Answers


There is no official way to export a Databricks MLflow run from one workspace to another. However, there is an "unofficial" tool that does most of the job; its main limitation is that notebook revisions linked to a run cannot be exported, because there is no REST API endpoint for them.

https://github.com/amesar/mlflow-export-import
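
For illustration, the repo's README (at the time of writing) documents console scripts along these lines; treat the exact commands and flags as assumptions that may have changed:

    pip install git+https://github.com/amesar/mlflow-export-import

    # Export a registered model (runs included) from the source workspace
    export-model --model my-custom-model --output-dir out

    # Import it into the target tracking server / workspace
    import-model --model my-custom-model --experiment-name imported-models --input-dir out

The out folder it writes is also a reasonable candidate for publishing as an Azure DevOps artifact between the CI and CD stages.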

Andre

You probably don't need to go through DevOps artifacts at all: there is an Azure DevOps extension (Machine Learning) that can access models registered in the Azure ML workspace and trigger the release pipeline. You can refer to the link below for the steps: https://github.com/Azure-Samples/MLOpsDatabricks/blob/master/docs/release-pipeline.md
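
For that route, the Databricks job would register the model in the Azure ML workspace (rather than, or in addition to, the Databricks registry) so that the extension's release trigger can see it. A minimal sketch, assuming the azureml-mlflow package and placeholder workspace parameters:

    import mlflow
    from azureml.core import Workspace

    # Placeholders: fill in your own workspace coordinates.
    ws = Workspace.get(
        name="my-aml-workspace",
        subscription_id="<subscription-id>",
        resource_group="<resource-group>",
    )

    # Point MLflow's model registry at the Azure ML workspace.
    mlflow.set_registry_uri(ws.get_mlflow_tracking_uri())

    # Registering here makes the model visible to the Machine Learning
    # extension, which can then trigger the release pipeline.
    mlflow.register_model("runs:/<run-id>/model", "my-custom-model")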

wade zhou - MSFT