
I want to build a pipeline in Azure ML. The training pipeline runs fine. Training:

training_pipeline_steps = AutoMLPipelineBuilder.get_many_models_train_steps(
    experiment=experiment,
    train_data=full_dataset,
    compute_target=compute_target,
    node_count=1,
    process_count_per_node=4,
    train_pipeline_parameters=hts_parameters,
    run_invocation_timeout=3900,
)

Forecasting:

from azureml.train.automl.runtime._hts.hts_parameters import HTSInferenceParameters

inference_parameters = HTSInferenceParameters(
    hierarchy_forecast_level="Material",  # The setting is specific to this dataset and should be changed based on your dataset.
    allocation_method="proportions_of_historical_average",
)

steps = AutoMLPipelineBuilder.get_many_models_batch_inference_steps(
    experiment=experiment,
    # inference_data=registered_inference,
    inference_data=full_dataset,
    compute_target=compute_target,
    inference_pipeline_parameters=inference_parameters,
    node_count=1,
    process_count_per_node=4,
    arguments=["--forecast_quantiles", 0.1, 0.9],
)
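For context, this is roughly how the returned steps would be assembled and submitted, a minimal sketch assuming an existing `Workspace` object `ws` and the `experiment` from above (names are illustrative):

```python
from azureml.pipeline.core import Pipeline

# Wrap the batch-inference steps returned by AutoMLPipelineBuilder
# into a pipeline and submit it against the same experiment.
inference_pipeline = Pipeline(workspace=ws, steps=steps)
run = experiment.submit(inference_pipeline)
run.wait_for_completion(show_output=True)
```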

But forecasting always fails with an error (screenshot originally attached here).

Can anybody help with that error? Thank you!

JMTaverne
  • Can you check the log file `user_logs/std_log_0.txt`? What error is it giving? – JayashankarGS Jul 05 '23 at 12:14
  • @JayashankarGS: the .txt file shows the following:

        Azure Machine Learning Batch Inference Start
        [2023-07-05 06:56:43.233779] No started flag set. Skip creating started flag.
        Azure Machine Learning Batch Inference End
        Cleaning up all outstanding Run operations, waiting 300.0 seconds
        2 items cleaning up...
        Cleanup took 0.0012843608856201172 seconds
        Traceback (most recent call last):
          File "driver/amlbi_main.py", line 275, in main()
          File "driver/amlbi_main.py", line 226, in main
            sys.exit(exitcode_candidate)
        SystemExit: 42

    – JMTaverne Jul 06 '23 at 05:11

1 Answer


I have tried your code in my environment with a different dataset, and it worked successfully.

(Screenshots of the successful training run and the prediction output omitted.)

So the batch-inference configuration is likely wrong or not appropriate for your data; check that first. Also check the detailed logs under `outputs+logs -> user -> stdout -> 0 -> process000.std.txt`.
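Once the run's output folder has been downloaded locally, the tail of that log can be inspected with a small helper (the path below is illustrative; substitute your run's actual download location):

```python
from pathlib import Path

def tail_log(path, n=50):
    """Return the last n lines of a log file as a list of strings."""
    text = Path(path).read_text(encoding="utf-8", errors="replace")
    return text.splitlines()[-n:]

# Illustrative path; adjust to where you downloaded the run outputs.
# for line in tail_log("outputs_logs/user/stdout/0/process000.std.txt"):
#     print(line)
```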

JayashankarGS