I'm trying to implement a boosting model using TensorFlow's "BoostedTreesRegressor".
For that, I need a custom loss function so that, during training, the loss is calculated according to the logic defined in my function rather than the usual mean_squared_error.
I read in some articles that this can be done through the "BoostedTreesEstimator" interface by specifying a head. So I tried to implement my model as follows:
import numpy as np
import tensorflow as tf

# define a custom loss function to calculate SMAPE
def custom_loss_fn(labels, logits):
    return (np.abs(logits - labels) / (np.abs(logits) + np.abs(labels))) * 2
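
# (aside: since the head hands tensors to loss_fn, I assume the same SMAPE
# would eventually have to be written with TensorFlow ops rather than NumPy,
# roughly like this -- custom_loss_fn_tf is just my sketch, not code I have working)
def custom_loss_fn_tf(labels, logits):
    # per-example SMAPE, same formula as above but with tf ops
    return 2.0 * tf.abs(logits - labels) / (tf.abs(logits) + tf.abs(labels))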

# create input functions
def make_input_fn(X, y, n_epochs=None, shuffle=True):
    def input_fn():
        dataset = tf.data.Dataset.from_tensor_slices((dict(X), y))
        if shuffle:
            dataset = dataset.shuffle(NUM_EXAMPLES)
        dataset = dataset.repeat(n_epochs)
        dataset = dataset.batch(NUM_EXAMPLES)
        return dataset
    return input_fn

train_input_fn = make_input_fn(dftrain, y_train)
eval_input_fn = make_input_fn(dfeval, y_eval, n_epochs=1, shuffle=False)
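
# (for context: dftrain/dfeval are pandas DataFrames and y_train/y_eval the
# target columns; the snippet assumes definitions roughly like these, and the
# exact values are not important for the question)
NUM_EXAMPLES = len(y_train)
feature_columns = [tf.feature_column.numeric_column(name) for name in dftrain.columns]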

my_head = tf.estimator.RegressionHead(loss_fn=custom_loss_fn)

# training a boosted trees model
est = tf.estimator.BoostedTreesEstimator(feature_columns,
                                         head=my_head,
                                         n_batches_per_layer=1,
                                         n_trees=90,
                                         max_depth=2)

est.train(train_input_fn, max_steps=100)
predictions = list(est.predict(eval_input_fn))

This code raises the following error:

    NotImplementedError: Subclasses of Head must implement create_estimator_spec()
    or _create_tpu_estimator_spec().
From what I have read, create_estimator_spec() is used when we define a model_fn() for a new custom Estimator. Here, I do not want to create a new model or Estimator; I only want to use a custom loss function (instead of the default mean squared error) during training, while the model itself stays a BoostedTreesRegressor/BoostedTreesEstimator.
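If I understand it correctly, that pattern looks roughly like the sketch below (the tiny Keras linear model is just a placeholder, not anything I actually want to build), which is exactly the kind of thing I am trying to avoid writing:

    # rough sketch of the model_fn pattern that create_estimator_spec() seems
    # to be meant for; the linear model is only a placeholder
    def my_model_fn(features, labels, mode):
        inputs = tf.keras.layers.DenseFeatures(feature_columns)(features)
        logits = tf.keras.layers.Dense(units=1)(inputs)
        return my_head.create_estimator_spec(
            features=features,
            mode=mode,
            labels=labels,
            logits=logits,
            optimizer=tf.keras.optimizers.Adagrad(learning_rate=0.1))

    custom_est = tf.estimator.Estimator(model_fn=my_model_fn)
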
It would be a great help if anybody could give me a hint on how to implement this.