After solving Why does my ML model deployment in Azure Container Instance still fail? and successfully deploying on ACI, I am now using the Azure Machine Learning service to deploy an ML model as a web service on AKS.
My current (working) ACI deployment code is:
from azureml.core.webservice import Webservice, AciWebservice
from azureml.core.image import ContainerImage

aciconfig = AciWebservice.deploy_configuration(cpu_cores=1,
                                               memory_gb=8,
                                               tags={"data": "text", "method": "NB"},
                                               description='Predict something')

image_config = ContainerImage.image_configuration(execution_script="score.py",
                                                  docker_file="Dockerfile",
                                                  runtime="python",
                                                  conda_file="myenv.yml")

image = ContainerImage.create(name="scorer-image",
                              models=[model],
                              image_config=image_config,
                              workspace=ws)

service_name = 'scorer-svc'
service = Webservice.deploy_from_image(deployment_config=aciconfig,
                                       image=image,
                                       name=service_name,
                                       workspace=ws)
I would like to modify it to deploy on AKS, but this looks more convoluted than I expected. I imagined moving from ACI to AKS (i.e. from test to production) to be a routine operation, yet it seems to require more changes to the code than I thought:
- AKS seems to require an InferenceConfig object (?) - from what I gather in the docs, the workflow would look roughly like the sketch after this list
- with AKS there seems to be no method like deploy_from_image for deploying from my existing Docker image (?)
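From what I can tell from the documentation, the InferenceConfig route would look roughly like the following (untested) sketch, where aks_target / "my-aks-cluster" are placeholders for my existing cluster; it appears to rebuild the image from score.py and myenv.yml instead of reusing the image I already have, and it is not obvious to me where my Dockerfile fits in:

from azureml.core.compute import AksCompute
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AksWebservice

# same entry script and conda environment as in the ACI case
inference_config = InferenceConfig(entry_script="score.py",
                                   runtime="python",
                                   conda_file="myenv.yml")

aks_config = AksWebservice.deploy_configuration(cpu_cores=1,
                                                memory_gb=8,
                                                tags={"data": "text", "method": "NB"},
                                                description='Predict something')

# attach to an already provisioned AKS cluster ("my-aks-cluster" is a placeholder)
aks_target = AksCompute(ws, "my-aks-cluster")

service = Model.deploy(workspace=ws,
                       name='scorer-svc-aks',
                       models=[model],
                       inference_config=inference_config,
                       deployment_config=aks_config,
                       deployment_target=aks_target)
service.wait_for_deployment(show_output=True)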
Can I instead deploy to AKS with only minimal changes to the ACI code?
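Ideally I would keep the image-based code above and only swap the deployment configuration and add a deployment target, something along the lines of this (untested) snippet - assuming deploy_from_image accepts an AKS target at all:

from azureml.core.compute import AksCompute
from azureml.core.webservice import AksWebservice, Webservice

aksconfig = AksWebservice.deploy_configuration(cpu_cores=1,
                                               memory_gb=8,
                                               tags={"data": "text", "method": "NB"},
                                               description='Predict something')

# existing AKS cluster ("my-aks-cluster" is a placeholder)
aks_target = AksCompute(ws, "my-aks-cluster")

# reuse the image already built for the ACI deployment
service = Webservice.deploy_from_image(workspace=ws,
                                       name='scorer-svc-aks',
                                       image=image,
                                       deployment_config=aksconfig,
                                       deployment_target=aks_target)
service.wait_for_deployment(show_output=True)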