
I have to evaluate my CNN model for semantic segmentation (at prediction time) in terms of computational complexity, memory use, and processing time. I work in Python with Keras.

Is there a way to count the number of operations performed in each layer or at least in the whole model?
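One common back-of-envelope approach (a sketch, not something from the question itself) is to count multiply-accumulate operations (MACs) per layer directly from the layer shapes: for a `Conv2D` layer this is roughly `out_h * out_w * filters * k_h * k_w * in_channels`, and for a `Dense` layer it is `in_units * out_units`. The helper names below are hypothetical:

```python
def conv2d_macs(out_h, out_w, filters, k_h, k_w, in_ch):
    """Multiply-accumulates for one Conv2D layer (ignores bias adds and activations)."""
    return out_h * out_w * filters * k_h * k_w * in_ch

def dense_macs(in_units, out_units):
    """Multiply-accumulates for one Dense layer (ignores bias adds)."""
    return in_units * out_units

# Example: a 3x3 conv with 16 filters on a 64x64x3 input, 'same' padding
print(conv2d_macs(64, 64, 16, 3, 3, 3))  # 1769472
print(dense_macs(1024, 10))              # 10240
```

Summing these over the model's layers (shapes are available from `layer.output_shape` in Keras) gives an estimate of the total MACs per forward pass; one MAC is usually counted as two FLOPs.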

I was taking a look at profiling libraries such as cProfile, but honestly, I am not sure what the output I am getting means.
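For the processing-time part, a minimal cProfile pattern looks like the following; `fake_predict` is a stand-in here for the real `model.predict(x)` call, which is assumed rather than taken from the question:

```python
import cProfile
import io
import pstats

def fake_predict():
    # Placeholder for model.predict(x); swap in your own prediction call.
    return sum(i * i for i in range(100000))

profiler = cProfile.Profile()
profiler.enable()
fake_predict()
profiler.disable()

# Sort by cumulative time and show the top 5 entries.
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats('cumulative')
stats.print_stats(5)
print(stream.getvalue())
```

The "cumtime" column is usually the one to look at: it shows how much total time was spent inside each function and everything it called.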

Any help, please?

Thanks in advance!

AlvaroNav
  • What do you mean by _computational complexity_, exactly? _memory use and processing time_ are simply a matter of benchmarking and profiling, there are already plenty of resources on the subject. – AMC Feb 11 '20 at 20:38
  • Hi! Thank you for your comment! I might be using the wrong terminology. What I want is to estimate the number of basic operations (additions, multiplications, variable assignments...) performed by my CNN model in the prediction stage. I think this is usually expressed in big-O notation. – AlvaroNav Feb 12 '20 at 08:34

0 Answers