I have a simple model:
public class ModelInput
{
    [LoadColumn(0)]
    public float Label { get; set; }

    // numberOfFeatures is a const int defined elsewhere (attribute arguments must be compile-time constants).
    [LoadColumn(1, numberOfFeatures)]
    [VectorType(numberOfFeatures)]
    public float[] Features { get; set; }
}

public class ModelOutput
{
    [VectorType(numberOfFeatures)]
    public float[] Features { get; set; }

    [ColumnName(@"Score")]
    public float Score { get; set; }
}
a normalized dataset (all values are between 0 and 1), and a lookup table that assigns a metric value to each range of predicted output, for each line in the dataset:
predicted output x:     0<=x<0.25   0.25<=x<0.5   0.5<=x<0.75   0.75<=x<=1
1st line in dataset:    0.5         0.6           0.7           0.5
2nd line in dataset:    0.8         0.6           0.5           0.3
3rd line in dataset:    0.5         0.6           0.6           0.5
etc.
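To make that concrete, this is roughly how I hold the table in memory and map a predicted value to a column (metricTable, tablePath and GetBin are just placeholder names; it needs System, System.IO and System.Linq):

// One row per dataset line; each row holds the metric for the four prediction bins,
// e.g. metricTable[0] = { 0.5f, 0.6f, 0.7f, 0.5f } for the 1st line.
float[][] metricTable = File.ReadLines(tablePath)
    .Select(line => line.Split(',').Select(float.Parse).ToArray())
    .ToArray();

// Map a predicted value x in [0, 1] to its bin: [0, 0.25), [0.25, 0.5), [0.5, 0.75), [0.75, 1].
static int GetBin(float x) => Math.Min((int)(x * 4), 3);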
I set up a regression experiment with cross-validation:
MLContext context = new MLContext();

// Load the normalized training data described above.
IDataView data = context.Data.LoadFromTextFile<ModelInput>(dataPath.ToString(), separatorChar: ',', hasHeader: true);

var experimentSettings = new RegressionExperimentSettings();
experimentSettings.MaxExperimentTimeInSeconds = time;
experimentSettings.OptimizingMetric = RegressionMetric.MeanSquaredError;

RegressionExperiment experiment = context.Auto().CreateRegressionExperiment(experimentSettings);

// 10 folds for cross-validation.
CrossValidationExperimentResult<RegressionMetrics> cvExperimentResult = experiment.Execute(data, 10, "Label");
But instead of minimizing the mean squared error, as in the snippet above (or optimizing any of the other built-in regression metrics), I need to maximize the mean of the values looked up in the table. For that I would need to define a custom optimizing metric that consults the table, but I can't find anything about custom metrics in the AutoML documentation.
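In other words, for a single trained model the metric I want to maximize would be computed roughly like this (ScorePrediction is just a minimal prediction class; model and validationData stand for a candidate model and its validation data, and matching predictions back to the right table row needs care once the data gets split or shuffled):

public class ScorePrediction
{
    public float Score { get; set; }
}

// Score the validation data and average the table values for the predicted bins.
IDataView scored = model.Transform(validationData);
float[] scores = context.Data
    .CreateEnumerable<ScorePrediction>(scored, reuseRowObject: false)
    .Select(p => p.Score)
    .ToArray();

double customMetric = scores
    .Select((score, i) => (double)metricTable[i][GetBin(score)])
    .Average();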
If it's impossible to define a custom optimizing metric for AutoML, would it be possible to define a custom loss function in plain ML.NET and recreate the functionality of AutoML myself? I haven't looked into tuning models without AutoML yet, but I assume it comes down to trying different trainers and hyperparameters?
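If that's the only route, my rough idea for replacing AutoML would be a manual sweep: train a few candidate regression trainers, score each one with the custom metric (wrapped below in a hypothetical EvaluateWithTable helper that does the computation shown above), and keep the best. FastTree comes from the separate Microsoft.ML.FastTree package. Something like:

var split = context.Data.TrainTestSplit(data, testFraction: 0.2);

// IEstimator<TTransformer> is covariant, so different trainers can share one list.
var candidates = new (string Name, IEstimator<ITransformer> Trainer)[]
{
    ("Sdca", context.Regression.Trainers.Sdca("Label", "Features")),
    ("LbfgsPoissonRegression", context.Regression.Trainers.LbfgsPoissonRegression("Label", "Features")),
    ("FastTree", context.Regression.Trainers.FastTree("Label", "Features")),
};

string bestName = null;
ITransformer bestModel = null;
double bestMetric = double.NegativeInfinity;

foreach (var (name, trainer) in candidates)
{
    ITransformer model = trainer.Fit(split.TrainSet);
    double metric = EvaluateWithTable(context, model, split.TestSet); // custom metric from above
    if (metric > bestMetric)
    {
        (bestName, bestModel, bestMetric) = (name, model, metric);
    }
}

That obviously doesn't cover AutoML's automatic featurization or per-trainer hyperparameter search, which is why I'd prefer a custom optimizing metric if one is possible.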