This was hard to track down in the documentation, but the following link let me run a batch execution successfully without uploading a blob: https://learn.microsoft.com/en-us/azure/machine-learning/studio/web-services-that-use-import-export-modules
The key part is:
Copy and paste the C# sample code into your Program.cs file, and
remove all references to blob storage
Locate the request declaration and update the values of Web Service
Parameters that are passed to the Import Data and Export Data
modules. In this case, you use the original query, but define a new
table name.
To configure the Web Service Parameters for the import query and the destination table:
In the properties pane for the Import Data module, click the icon at the top right of the Database query field and select Set as web service parameter.
In the properties pane for the Export Data module, click the icon at the top right of the Data table name field and select Set as web service parameter.
At the bottom of the Export Data module properties pane, in the Web Service Parameters section, click Database query and rename it Query.
Click Data table name and rename it Table.
var request = new BatchExecutionRequest()
{
    GlobalParameters = new Dictionary<string, string>() {
        { "Query", @"select [age], [workclass], [fnlwgt], [education], [education-num], [marital-status], [occupation], [relationship], [race], [sex], [capital-gain], [capital-loss], [hours-per-week], [native-country], [income] from dbo.censusdata" },
        { "Table", "dbo.ScoredTable2" },
    }
};
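For reference, the rest of the Microsoft sample submits that request over HTTPS. A rough sketch of that part is below; BaseUrl is the batch endpoint of your web service and apiKey is its API key (both from the web service dashboard), the status-polling loop from the sample is omitted, and PostAsJsonAsync / ReadAsAsync come from the Microsoft.AspNet.WebApi.Client NuGet package. Treat it as illustrative rather than a drop-in replacement for the full sample:

using (HttpClient client = new HttpClient())
{
    // Authenticate with the web service API key.
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);

    // Submit the batch job with the GlobalParameters defined above; the service returns a job id.
    HttpResponseMessage response = await client.PostAsJsonAsync(BaseUrl + "?api-version=2.0", request);
    string jobId = await response.Content.ReadAsAsync<string>();

    // Start the job, then poll BaseUrl + "/" + jobId + "?api-version=2.0" for its status
    // (the polling loop is omitted here; it is included in the full sample).
    await client.PostAsync(BaseUrl + "/" + jobId + "/start?api-version=2.0", null);
}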
Once the database is the source for your Import Data module, you do not need a web service input on the training experiment. You can also set the database query as a web service parameter. After the batch execution job retrains the models, you can store them in Azure blob storage and have your predictive experiment load them from there at runtime using Load Trained Model modules rather than Trained Model modules. See this link for that procedure:
https://blogs.technet.microsoft.com/machinelearning/2017/06/19/loading-a-trained-model-dynamically-in-an-azure-ml-web-service/
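If you follow that post and expose the model location on the Load Trained Model module as a web service parameter, the retrained model's URL just becomes another entry in GlobalParameters when you call the predictive web service. The parameter name and blob URL below are placeholders; use whatever name you gave the parameter and wherever you stored the .ilearner file:

var scoringRequest = new BatchExecutionRequest()
{
    GlobalParameters = new Dictionary<string, string>() {
        // "Trained model URI" is a placeholder -- use the name you gave the
        // Load Trained Model module's web service parameter.
        { "Trained model URI", "https://mystorage.blob.core.windows.net/models/retrained.ilearner" },
    }
};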
So in short:
Use your SQL database as the source for your Import Data module
Run the batch execution process to retrain the model at whatever interval you want
Save the retrained models (.ilearner files) to blob storage, or at an HTTP address that is accessible
Use the Load Trained Model module in your predictive experiment rather than a Trained Model module
Put the path to your blob or URL in the parameters of the Load Trained Model module(s)
Run, publish, and test the predictive experiment with the dynamically loaded models
Note that this approach also works if you have multiple models in your experiment.