Questions tagged [azureml-python-sdk]
188 questions
0 votes · 2 answers
Azureml TabularDataset to_pandas_dataframe() returns InvalidEncoding error
When I run:
datasetTabular = Dataset.get_by_name(ws, "")
datasetTabular.to_pandas_dataframe()
The following error is returned. What can I do to get past this?
ExecutionError Traceback (most recent call last)…

Susan
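A minimal sketch of one common workaround, assuming the dataset was created from delimited text files whose encoding does not match the default; the datastore name and path below are placeholders:

# Recreate the tabular dataset with an explicit encoding before converting to pandas.
from azureml.core import Workspace, Dataset, Datastore

ws = Workspace.from_config()
datastore = Datastore.get(ws, "my_datastore")  # hypothetical datastore name

# from_delimited_files accepts an 'encoding' argument (e.g. 'utf8', 'iso88591', 'latin1')
dataset = Dataset.Tabular.from_delimited_files(
    path=(datastore, "data/*.csv"),  # hypothetical path
    encoding="iso88591",
)
df = dataset.to_pandas_dataframe()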
0 votes · 1 answer
Run script inside Docker container using Azure Machine Learning
Azure Machine Learning provides encapsulation of the environment your code runs in. As far as I know, you can specify custom Docker images and Dockerfiles to create an environment.
But in my specific use case, I want to run the script…

raja
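A minimal sketch of pointing an AzureML (SDK v1) Environment at a custom Docker image or Dockerfile; the image name, script, and compute target are placeholders:

from azureml.core import Environment, Experiment, ScriptRunConfig, Workspace

ws = Workspace.from_config()

env = Environment(name="custom-docker-env")
env.docker.base_image = "myregistry.azurecr.io/myimage:latest"  # hypothetical image
# or build from a Dockerfile instead of a prebuilt image:
# env.docker.base_dockerfile = "./Dockerfile"
env.python.user_managed_dependencies = True  # use the Python interpreter already in the image

src = ScriptRunConfig(source_directory=".", script="train.py",
                      compute_target="cpu-cluster", environment=env)
run = Experiment(ws, "docker-run").submit(src)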
0 votes · 1 answer
What are valid Azure ML Workspace connection argument options?
I want to build an Azure ML environment with two Python packages that I have in Azure DevOps.
For this I need a workspace connection to Azure DevOps. One package is published to an artifact feed and I can access it using the Python SDK with a…

Kyllian Broers
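A sketch of what a workspace connection to an Azure DevOps artifact feed can look like, assuming the SDK v1 Workspace.set_connection method; the organization, feed, and token below are placeholders:

from azureml.core import Workspace

ws = Workspace.from_config()
ws.set_connection(
    name="devops_feed",        # hypothetical connection name
    category="PythonFeed",     # category used for pip/artifact feeds
    target="https://pkgs.dev.azure.com/<ORG>/_packaging/<FEED>/pypi/simple/",
    authType="PAT",
    value="<personal-access-token>",  # placeholder secret
)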
0 votes · 1 answer
Why is AzureML SDK corrupting the default datastore?
I have tried following the documentation instructions here (see my code below), and the pipeline seems to run okay. However, when I view it in ML Studio, it says the pipeline has failed because the container does not exist.
Worse, if I log into…
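A small sketch for checking which blob container the workspace's default datastore actually points at, since that is where such a pipeline writes by default:

from azureml.core import Workspace

ws = Workspace.from_config()
ds = ws.get_default_datastore()
print(ds.name, ds.account_name, ds.container_name)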
0 votes · 1 answer
azureml.contrib.dataset vs azureml.data
Looks like AzureML Python SDK has two Dataset packages exposed over API:
azureml.contrib.dataset
azureml.data
The documentation doesn't clearly mention the difference or when we should use which one, which certainly creates confusion. For…

Arnab Biswas
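A short sketch using the non-contrib import path; the Dataset class exposed through azureml.core is backed by azureml.data, while azureml.contrib.dataset is the experimental preview namespace (the dataset name is a placeholder):

from azureml.core import Workspace, Dataset

ws = Workspace.from_config()
dataset = Dataset.get_by_name(ws, name="my-dataset")  # hypothetical dataset name
print(type(dataset))  # e.g. azureml.data.tabular_dataset.TabularDataset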
0 votes · 2 answers
AzureML: Dataset Profile fails when parquet file is empty
I have created a Tabular Dataset using the Azure ML Python API. The data in question is a bunch of parquet files (~10K parquet files, each about 330 KB) residing in Azure Data Lake Gen 2, spread across multiple partitions. When I trigger "Generate…

Arnab Biswas
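A sketch of how such a dataset is typically created from parquet files; validate=False skips the up-front load check that can trip over empty files (the datastore and path are placeholders):

from azureml.core import Workspace, Dataset, Datastore

ws = Workspace.from_config()
adls = Datastore.get(ws, "my_adls_gen2")  # hypothetical ADLS Gen 2 datastore
dataset = Dataset.Tabular.from_parquet_files(
    path=(adls, "partitioned/**/*.parquet"),  # hypothetical partitioned path
    validate=False,
)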
0 votes · 1 answer
Azure ML Tabular Dataset : missing 1 required positional argument: 'stream_column'
In the Python API for AzureML tabular datasets (azureml.data.TabularDataset), two experimental methods have been introduced:
download(stream_column, target_path=None, overwrite=False, ignore_not_found=True)
mount(stream_column,…

Arnab Biswas
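A sketch based only on the signatures quoted above: stream_column is the first positional argument and names the column that holds the stream (file) references; the dataset and column names are placeholders:

from azureml.core import Workspace, Dataset

ws = Workspace.from_config()
dataset = Dataset.get_by_name(ws, "my-streaming-dataset")  # hypothetical name
dataset.download(stream_column="image_url", target_path="./data", overwrite=True)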
0 votes · 1 answer
AzureMLCompute job failed with `FailedLoginToImageRegistry`
I've been trying to submit a training job through the Azure ML Python SDK with:
from azureml.core import Workspace, Experiment, ScriptRunConfig
if __name__ == "__main__":
    ws = Workspace.from_config()
    experiment = Experiment(workspace=ws,…

nferreira78
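A sketch of one possible cause and mitigation, assuming the compute cluster cannot authenticate against a private container registry; in SDK v1 the registry credentials can be attached to the environment (the address and credentials below are placeholders):

from azureml.core import Environment

env = Environment(name="train-env")
env.docker.base_image = "myregistry.azurecr.io/train:latest"      # hypothetical image
env.docker.base_image_registry.address = "myregistry.azurecr.io"  # hypothetical registry
env.docker.base_image_registry.username = "<registry-username>"
env.docker.base_image_registry.password = "<registry-password>"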
0 votes · 1 answer
How to get the model in scoring file from the one created in Azure AutoML pipeline (Python SDK)?
I've developed a pipeline with an AutoML step and used the produced artifact to register the model. The artifact is a serialized model and is one big single file: model_data. I used the pickle.load function to deserialize the model in the init function in the…

B K
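A sketch of a scoring-script init() that locates the registered artifact through the AZUREML_MODEL_DIR environment variable rather than a hard-coded path; the artifact file name follows the question above:

import os
import pickle

def init():
    global model
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    with open(os.path.join(model_dir, "model_data"), "rb") as f:
        model = pickle.load(f)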
0 votes · 1 answer
Azure ML Studio Local Environment — Numpy package import failure using the Azure ML Python SDK
I am trying to create a local environment for the ML Studio using the Python SDK, following
this official cheatsheet. The result should be a conda-like environment that can be used for local testing. However, I am running into an error when…

Isak Engström
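A sketch of the cheatsheet-style local setup, assuming the environment is defined in a conda YAML file; the file, script, and experiment names are placeholders:

from azureml.core import Environment, Experiment, ScriptRunConfig, Workspace

ws = Workspace.from_config()
env = Environment.from_conda_specification(name="local-env", file_path="environment.yml")

# compute_target="local" builds and uses the environment on this machine
src = ScriptRunConfig(source_directory=".", script="train.py",
                      compute_target="local", environment=env)
run = Experiment(ws, "local-test").submit(src)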
0 votes · 1 answer
Azure ML Pipeline fails while running a grid search CV on a cluster
I implemented a GridSearchCV on Azure ML as a pipeline, but I keep getting an error that says "User program failed with TerminatedWorkerError: A worker process managed by the executor was unexpectedly terminated. This could be caused by a…

mark_r
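A sketch of a common first mitigation: TerminatedWorkerError usually means a joblib worker process was killed, often for memory reasons, so reducing parallelism on the cluster node is worth trying (the estimator and grid are illustrative only):

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {"n_estimators": [100, 200], "max_depth": [5, 10]}
search = GridSearchCV(
    RandomForestClassifier(),
    param_grid,
    cv=3,
    n_jobs=1,  # serial execution avoids spawning workers that the OS may kill
)
# search.fit(X, y)  # X and y come from the training script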
0 votes · 1 answer
How to dump and utilize multiple ML algorithm objects in one single pickle file in Azure ML workspace?
I am trying to create an ML model in an Azure ML Workspace using a Jupyter notebook. I am not using the AutoML feature or the Designer provided by Azure, and want to run the complete code prepared locally.
There are 3 different algorithms used in my ML model. I…

Bhawna Pareta
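A minimal sketch of keeping several fitted estimators in one pickle file by serializing a dict and pulling each model back out by key (the toy data and estimators are illustrative only):

import pickle
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression

X, y = [[0], [1], [2], [3]], [0, 0, 1, 1]  # toy data
models = {
    "logreg": LogisticRegression().fit(X, y),
    "dummy": DummyClassifier().fit(X, y),
}

with open("models.pkl", "wb") as f:
    pickle.dump(models, f)  # one file, several objects

with open("models.pkl", "rb") as f:
    loaded = pickle.load(f)
print(loaded["logreg"].predict([[2]]))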
0 votes · 1 answer
Can we use an Azure Data Explorer function (for example series_decompose()) locally or anywhere in a Python program?
There is a function in Azure Data Explorer, series_decompose(), and I need to use this function in my Python program locally with data from SQL.
Can I do it, and if yes, how?

vishal
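series_decompose() itself is a Kusto (KQL) function, but a comparable local decomposition can be sketched with statsmodels; the toy frame below stands in for data read from SQL (e.g. via pandas.read_sql):

import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# df would normally come from SQL, e.g. pd.read_sql(query, connection)
df = pd.DataFrame(
    {"value": [10, 12, 14, 11, 13, 15, 12, 14, 16, 13, 15, 17]},
    index=pd.date_range("2021-01-01", periods=12, freq="D"),
)
result = seasonal_decompose(df["value"], model="additive", period=3)
print(result.trend, result.seasonal, result.resid, sep="\n")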
0 votes · 1 answer
Can't get a reference to the compute instance in an AzureML notebook
I'm working in an AzureML notebook, and whenever I try to get a reference to the compute instance using the code below:
from azureml.core import Workspace
ws = Workspace.from_config()
from azureml.core import ComputeTarget
compute =…

Mountassir EL MOUSTAAID
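A minimal sketch of referencing an existing compute instance by name (the instance name is a placeholder):

from azureml.core import Workspace
from azureml.core.compute import ComputeTarget

ws = Workspace.from_config()
compute = ComputeTarget(workspace=ws, name="my-compute-instance")
# equivalently: compute = ws.compute_targets["my-compute-instance"]
print(compute.provisioning_state)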
0 votes · 0 answers
Not able to read model.pkl from output folder in Azure ML
I'm trying to read the model.pkl file from the artifacts output folder like this:
def init():
    global model
    # infile = open('model.pkl', 'rb')
    # model = pickle.load(infile)
    # model = joblib.load('model.pkl')
    model_path =…

Sai Varun Kumar
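A sketch of an init() that resolves model.pkl under the AZUREML_MODEL_DIR environment variable, which is where AzureML places registered model files inside the scoring container (the file name assumes the model was registered as model.pkl):

import os
import joblib

def init():
    global model
    model_path = os.path.join(os.getenv("AZUREML_MODEL_DIR", "."), "model.pkl")
    model = joblib.load(model_path)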