
I am new to Azure ML Studio and am trying to figure out how to use Azure SQL Server as a training input. So far I have been able to sync local data into an Azure SQL database, train on that data, and get output in Azure ML Studio. I've deployed the prediction web service as well. I need the training service to read its data from Azure SQL Server, but all the examples I can find use Azure Blob storage as the training service input.

  1. Is there a way to use Azure SQL Server as the training input, so that all the training data is read from that database and the trained model gets updated?
  2. Do I need to save the trained model to Azure Blob storage for the predictive service to use it, or will it be updated after the training batch job completes, so that the predictive service starts using the retrained model automatically?
Erhan A

2 Answers


Regarding training, you will want to use the Import Data module. There you can specify how to connect to your Azure SQL database.

This article might give you a good pointer; it shows how to use Azure Data Factory to synchronize retraining of an experiment.

Ahmet

This was difficult to find in the documentation, but using the link below I was able to successfully run a batch execution without uploading a blob: https://learn.microsoft.com/en-us/azure/machine-learning/studio/web-services-that-use-import-export-modules

The key part is:

  • Copy and paste the C# sample code into your Program.cs file, and remove all references to the blob storage

  • Locate the request declaration and update the values of the Web Service Parameters that are passed to the Import Data and Export Data modules. In this case, you use the original query, but define a new table name.

To configure the Web Service Parameters for the import query and the destination table:

  • In the properties pane for the Import Data module, click the icon at the top right of the Database query field and select Set as web service parameter.

  • In the properties pane for the Export Data module, click the icon at the top right of the Data table name field and select Set as web service parameter.

  • At the bottom of the Export Data module properties pane, in the Web Service Parameters section, click Database query and rename it Query, then click Data table name and rename it Table.

    var request = new BatchExecutionRequest()
    {
        GlobalParameters = new Dictionary<string, string>()
        {
            // "Query" feeds the Import Data module's Database query parameter;
            // "Table" feeds the Export Data module's Data table name parameter.
            { "Query", @"select [age], [workclass], [fnlwgt], [education], [education-num], [marital-status], [occupation], [relationship], [race], [sex], [capital-gain], [capital-loss], [hours-per-week], [native-country], [income] from dbo.censusdata" },
            { "Table", "dbo.ScoredTable2" },
        }
    };
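For context, here is a minimal sketch of how such a request might be submitted to the batch execution (BES) endpoint of the retraining web service. The endpoint URL and API key are placeholders you take from your own web service's API help page, and BatchExecutionRequest is the type defined in the Microsoft sample code referenced above; treat this as an illustration rather than the exact sample.

    // Sketch: create a BES job from the request above, then start it.
    // BaseUrl, ApiKey and BatchExecutionRequest are assumed to come from
    // your own service and the pasted sample code.
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;
    using Newtonsoft.Json;

    class RetrainJob
    {
        const string BaseUrl = "https://<region>.services.azureml.net/workspaces/<workspace>/services/<service>/jobs";
        const string ApiKey  = "<your-api-key>";

        static async Task SubmitAsync(BatchExecutionRequest request)
        {
            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", ApiKey);

                // Create the job; the response body is the job id as a JSON string.
                var body = new StringContent(
                    JsonConvert.SerializeObject(request), Encoding.UTF8, "application/json");
                var createResponse = await client.PostAsync(BaseUrl + "?api-version=2.0", body);
                createResponse.EnsureSuccessStatusCode();
                string jobId = JsonConvert.DeserializeObject<string>(
                    await createResponse.Content.ReadAsStringAsync());

                // Start the job; progress can then be polled on the same endpoint.
                var startResponse = await client.PostAsync(
                    BaseUrl + "/" + jobId + "/start?api-version=2.0", new StringContent(""));
                startResponse.EnsureSuccessStatusCode();
            }
        }
    }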

Once you have the database as the source for your Import Data module, you do not need a web service input on the training experiment. You can also set the database query as a web service parameter. Once the batch execution job has retrained the models, you can store them in Azure Blob storage and have your predictive experiment load them at runtime using Load Trained Model modules rather than Trained Model modules. See this link for that procedure:

https://blogs.technet.microsoft.com/machinelearning/2017/06/19/loading-a-trained-model-dynamically-in-an-azure-ml-web-service/
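To illustrate the scoring side, below is a minimal sketch of a request-response call that passes the .ilearner blob URI to the predictive service, assuming you have exposed the Load Trained Model module's URL field as a web service parameter. The parameter name "Trained Model URI", the input name "input1", the URLs and the sample columns are all placeholders; use whatever names your own experiment defines.

    // Sketch: call the predictive (RRS) endpoint and point the Load Trained Model
    // module at the latest retrained .ilearner via GlobalParameters.
    // "Trained Model URI" is a hypothetical parameter name set in Studio.
    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;
    using Newtonsoft.Json;

    class ScoreWithDynamicModel
    {
        const string ScoringUrl = "https://<region>.services.azureml.net/workspaces/<workspace>/services/<service>/execute?api-version=2.0&details=true";
        const string ApiKey     = "<your-api-key>";

        static async Task ScoreAsync()
        {
            var payload = new
            {
                Inputs = new Dictionary<string, object>
                {
                    // The input schema must match your predictive experiment's web service input.
                    { "input1", new { ColumnNames = new[] { "age", "workclass" },
                                      Values = new[] { new[] { "39", "State-gov" } } } }
                },
                GlobalParameters = new Dictionary<string, string>
                {
                    // URI of the retrained model saved by the batch execution job.
                    { "Trained Model URI", "https://<storage>.blob.core.windows.net/models/latest.ilearner" }
                }
            };

            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", ApiKey);
                var response = await client.PostAsync(ScoringUrl,
                    new StringContent(JsonConvert.SerializeObject(payload), Encoding.UTF8, "application/json"));
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }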

So in short:

  • Use your SQL database as the source for your Import Data module.

  • Run the batch execution process to retrain the model at whatever interval you want.

  • Save the retrained models (.ilearner files) to Blob storage, or at an HTTP address that is accessible.

  • Use the Load Trained Model module in your predictive experiment rather than Trained Model.

  • Put the path to your blob or URL in the parameters of the Load Trained Model module(s).

  • Run, publish, and test the predictive experiment with the dynamically loaded models.

Note that this approach can also be used if you have multiple models in your experiment.

Claytronicon