
In TensorFlow.js, I created a sequential neural network with 3 dense layers that works when I set the activation function to 'relu', but when I try 'tanh' or 'sigmoid' it throws the error "Error: Tensor is disposed".

I ran model.summary() to verify that changing the activation function didn't change the structure of the network or the parameter counts. I also tried commenting out the tf.tidy that I was using.

Here is my model:

const myModel = tf.sequential();

myModel.add(tf.layers.dense({ units: 64, inputShape: [1], activation: 'tanh' }));
myModel.add(tf.layers.dense({ units: 64, activation: 'relu' }));
myModel.add(tf.layers.dense({ units: 1 }));

Switching the 'tanh' to 'relu' fixes the problem, but I don't know why.

Here is my training code:

optimizer.minimize(() => {
    let inputs = tf.tensor2d(x_vals);
    let predictions = myModel.predictOnBatch(inputs);
    let totalLoss = tf.losses.meanSquaredError(tf.tensor2d(y_vals), predictions);
    return totalLoss;
});

Full Code Snippet (takes a second to run):

const x_vals = [
    [1],
    [2],
    [3],
    [4],
    [5]
];

const y_vals = [
    [1],
    [2],
    [3],
    [4],
    [5]
];

const optimizer = tf.train.adam(.005);


const myModel = tf.sequential();

myModel.add(tf.layers.dense({ units: 64, inputShape: [1], activation: 'tanh' }));
myModel.add(tf.layers.dense({ units: 64, activation: 'relu' }));
myModel.add(tf.layers.dense({ units: 1 }));

myModel.summary();


optimizer.minimize(() => {
    let inputs = tf.tensor2d(x_vals);
    let predictions = myModel.predictOnBatch(inputs);
    let totalLoss = tf.losses.meanSquaredError(tf.tensor2d(y_vals), predictions);
    return totalLoss;
});


const curveY = [];

for (let i = 0; i < x_vals.length; i++) {
    curveY.push(myModel.predict(tf.tensor([
        x_vals[i]
    ])).dataSync());
}


console.log(curveY);
<!DOCTYPE html>
<html>

<head>
    <meta charset="UTF-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@1.0.0/dist/tf.js">
    </script>

</head>

<body>
</body>

</html>
Brett L

1 Answer


The issue is not related to the activation function. Most probably you are re-using a tensor that was already disposed inside a tf.tidy callback.

Here is a simple sequential model using `tanh` and `sigmoid` activation layers that trains without error:

const model = tf.sequential({
    layers: [
      tf.layers.dense({ units: 64, inputShape: [1], activation: 'tanh' }),
      tf.layers.dense({ units: 64, activation: 'sigmoid' }),
      tf.layers.dense({ units: 1 })
    ]
});
model.compile({optimizer: 'sgd', loss: 'meanSquaredError'});
(async () => {
  for (let i = 1; i < 5; ++i) {
    const h = await model.fit(tf.ones([8, 1]), tf.ones([8, 1]), {
        batchSize: 4,
        epochs: 3
    });
    console.log("Loss after iteration " + i + ": " + h.history.loss[0]);
  }
})();
edkeveked
  • I removed all `tf.tidy` callbacks and `dispose` calls from my code, but I'm still getting the same issue. Interestingly, the issue goes away when I use `model.fit` instead of `optimizer.minimize`. I provided a full code snippet to run, maybe that will help to clarify. – Brett L Oct 04 '19 at 19:11
  • I wonder if I should post this as a bug on the Tensorflow.js Github page. – Brett L Oct 09 '19 at 23:02