
I built an ML training pipeline in which I fit `ohe = OneHotEncoder()` on the training data. Now I want to transform the inference data using the same fitted `ohe` object. How can I make this `ohe` object available in my inference job? I am using Azure ML SDK v2.

I am trying to find a way to make the `ohe` object available in the inference deployment, but I can't find anything.

amit.s
  • Can you please add more details about the test data that you are passing to the inference job? – Ram Mar 30 '23 at 04:45

1 Answer


The model artifacts can be downloaded using `az ml model download` - that retrieves all of the files registered with the model. So if you save the fitted encoder alongside the model file before registering, it will be available in the inference deployment.
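A minimal sketch of that idea, using plain scikit-learn and joblib (the file name `ohe.joblib` and the temporary artifact directory are illustrative, not anything Azure ML mandates): dump the fitted encoder into the folder you register as the model, then reload it at inference time and transform new data with the exact training-time categories.

```python
import os
import tempfile

import joblib
import numpy as np
from sklearn.preprocessing import OneHotEncoder

# --- Training job: fit the encoder and persist it into the artifact folder.
# Register this whole folder as the model so the encoder travels with it.
X_train = np.array([["red"], ["green"], ["blue"]])
ohe = OneHotEncoder(handle_unknown="ignore")
ohe.fit(X_train)

artifact_dir = tempfile.mkdtemp()
joblib.dump(ohe, os.path.join(artifact_dir, "ohe.joblib"))

# --- Inference job: reload the same fitted encoder and transform new data.
ohe_loaded = joblib.load(os.path.join(artifact_dir, "ohe.joblib"))
X_new = np.array([["green"], ["purple"]])  # "purple" was never seen in training
encoded = ohe_loaded.transform(X_new).toarray()
print(encoded.shape)  # one row per sample, one column per training category
```

With `handle_unknown="ignore"`, a category unseen during training encodes as an all-zero row instead of raising, which is usually what you want at inference time.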

Create production ML pipelines with Python SDK v2 - Tutorial
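On the deployment side, a managed online endpoint exposes the downloaded model files through the `AZUREML_MODEL_DIR` environment variable, so the scoring script can reload the encoder in `init()`. A sketch, assuming the encoder was saved as `ohe.joblib` in the registered model folder (the file name and the `run()` payload shape are assumptions); the lines below `# Local simulation` only emulate what the deployment does so the script can be exercised without Azure:

```python
import os
import tempfile

import joblib
import numpy as np
from sklearn.preprocessing import OneHotEncoder


def init():
    """Load the fitted encoder from the deployed model directory."""
    global ohe
    model_dir = os.environ.get("AZUREML_MODEL_DIR", ".")
    ohe = joblib.load(os.path.join(model_dir, "ohe.joblib"))


def run(rows):
    """Transform incoming categorical values with the training-time encoder."""
    return ohe.transform(np.array(rows).reshape(-1, 1)).toarray().tolist()


# Local simulation: point AZUREML_MODEL_DIR at a folder containing the
# saved encoder, then call init()/run() just as the endpoint would.
model_dir = tempfile.mkdtemp()
enc = OneHotEncoder(handle_unknown="ignore").fit([["a"], ["b"]])
joblib.dump(enc, os.path.join(model_dir, "ohe.joblib"))
os.environ["AZUREML_MODEL_DIR"] = model_dir

init()
result = run(["a", "b"])
```

Because the encoder is loaded once in `init()` rather than per request, every call to `run()` reuses the same fitted object.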

Ram