
I am running the same code, training the same CNN model on the same dataset, on both GPU and CPU, and I am using k-fold cross validation in my code. The problem is that k-fold does not seem to work properly on the GPU: on the CPU, about 700 samples are used for training in each fold after the cross-validation split, but on the GPU only 27 samples are used for training in each fold.

I don't know what the problem is. Could someone please help me with this?
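The thread never shows the actual code, but the setup described (k-fold cross validation over a few hundred training samples) can be sketched as below. This is a minimal sketch, assuming scikit-learn's `KFold`; the dataset size, shape, and fold count are placeholder assumptions, not taken from the question. Note that the split sizes `KFold` produces depend only on the dataset size and number of folds, never on whether the model later trains on CPU or GPU.

```python
import numpy as np
from sklearn.model_selection import KFold

# Hypothetical dataset of 700 grayscale images (size is an assumption).
X = np.random.rand(700, 28, 28, 1)
y = np.random.randint(0, 10, size=700)

# 5 folds is an assumed value; the question does not state k.
kf = KFold(n_splits=5, shuffle=True, random_state=1234)

for fold, (train_idx, val_idx) in enumerate(kf.split(X), start=1):
    # With 700 samples and 5 folds, each training split holds 560 samples
    # and each validation split 140, on any hardware.
    print(f"fold {fold}: {len(train_idx)} train / {len(val_idx)} val")
```

If the fold sizes printed this way are correct but the training log still shows a much smaller number, it is worth checking what the training progress output is actually counting on each machine.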

AFHG
  • Can you provide some reproducible code to debug the issue? –  Nov 06 '20 at 12:58
  • Actually, I am using this code: tf.random.set_seed(1234), but I am not sure whether I am using it in the right way. – AFHG Nov 07 '20 at 16:21
  • It would be difficult to tell like this; I would need some minimal code to understand whether the mistake is in the code or there is a bug in one of the libraries. –  Nov 09 '20 at 16:41
  • I don't think there is a mistake in the code because I don't get this issue at all when I run the same code with the same training set on another laptop. – AFHG Nov 12 '20 at 20:25
  • You can follow this link for use of K fold https://machinelearningmastery.com/evaluate-performance-deep-learning-models-keras/ –  Nov 13 '20 at 12:19

0 Answers