I have a problem when training a U-Net (which is architecturally similar to a CNN) in Keras with TensorFlow. When training starts, the accuracy increases and the loss steadily goes down. At around epoch 40 in my example, the validation loss jumps to its maximum and the validation accuracy drops to zero. What can I do to prevent that from happening? I am using an approach similar to this one for my code in Keras.

[Example image of the loss curve]

Edit: I have already tried changing the learning rate, adding dropout, and changing optimizers; none of these improved the curve. Since I have a large training set, it is very unlikely that I am encountering overfitting.
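A sudden divergence like this can also be caused by exploding gradients rather than overfitting. One mitigation not yet tried above is gradient clipping on the optimizer; a minimal sketch (the `clipnorm` and `learning_rate` values are illustrative, and a tiny stand-in model replaces the U-Net):

```python
import tensorflow as tf

# Clip the gradient norm so a single bad batch cannot blow up the weights.
# clipnorm=1.0 and learning_rate=1e-4 are illustrative values, not from the question.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-4, clipnorm=1.0)

# A tiny stand-in model; the real code would compile the U-Net instead.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=optimizer, loss="binary_crossentropy", metrics=["accuracy"])
```

The same `clipnorm` (or `clipvalue`) argument is accepted by the other built-in Keras optimizers, so it can be combined with whichever optimizer is already in use.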

Jacob
  • Welcome to StackOverflow. To be able to help you we need to see what you have tried so far. Could you post the code? Please check [How to ask](https://stackoverflow.com/help/how-to-ask) and [How to create a Minimal, Reproducible Example](https://stackoverflow.com/help/minimal-reproducible-example). – Francisca Concha-Ramírez Mar 10 '20 at 13:24
  • This is a typical example of overfitting; you could try: a) reducing the learning rate, b) using a different descent method, such as gradient descent with momentum or AdaGrad, to reduce the risk, c) using some other technique, like dropout or random variations in the weights. There is lots to find about this online. – mousetail Mar 10 '20 at 13:27
  • You are using a generator to fit. Are you training on the entire dataset in each epoch, or is it possible that you encounter vastly different data around epoch 40 that causes your model's accuracy to decrease? If so, that might be expected, and you should train on batches over several passes of the entire dataset, expecting the accuracy and loss to fluctuate while steadily improving on average. – csteel Mar 17 '20 at 01:51
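Along the lines of the learning-rate suggestion in the comments, Keras callbacks can lower the learning rate automatically when the validation loss plateaus and stop training before divergence; a minimal sketch (the monitor name, `factor`, and `patience` values are illustrative, not from the question):

```python
import tensorflow as tf

# Halve the learning rate when val_loss stalls, and stop training (restoring
# the best weights seen so far) if it keeps failing to improve.
# All values here are illustrative.
callbacks = [
    tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.5, patience=3),
    tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=10,
                                     restore_best_weights=True),
]
# These would be passed to model.fit(..., callbacks=callbacks).
```

With `restore_best_weights=True`, even if the run does diverge around epoch 40, the model that is kept is the one from before the collapse.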

0 Answers