I have found that printing model.layers[index].output
gives me the information I need. However, I couldn't tell which activation function was used by looking at this output:
Tensor("dense_11_1/clip_by_value:0", shape=(?, 256), dtype=float32)
Usually the output looks like Tensor("block5_conv3_1/Relu:0", shape=(?, ?, ?, 512), dtype=float32),
and from the name I can see that ReLU was used in that layer.
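For reference, this is roughly how I am printing these outputs. It is a minimal runnable stand-in; the tiny Dense model, its input shape, and the relu activation are just placeholders for my real network, which produces the clip_by_value name shown above:

# Stand-in model; my real network is larger, but the printing is the same.
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(256, activation='relu', input_shape=(128,)),
])

# In TF 1.x this prints something like:
#   Tensor("dense_1/Relu:0", shape=(?, 256), dtype=float32)
print(model.layers[0].output)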
How can I determine the activation function from an output that looks like the one above? Thanks.