I read the documentation for Keras, and I found that when we omit the activation function, it falls back to a plain linear (identity) function:
activation: Activation function to use. If you don't specify anything, no activation is applied (ie. "linear" activation: a(x) = x).
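As far as I understand, that means a Dense layer with no activation just computes x @ W + b. A minimal check of my understanding (the shapes and values here are just ones I picked):

```python
import numpy as np
from tensorflow import keras

# A Dense layer with no activation argument, i.e. "linear" activation
layer = keras.layers.Dense(units=2)

x = np.array([[1.0, 2.0, 3.0]], dtype="float32")
y = layer(x)  # calling the layer builds its weights

W, b = layer.get_weights()
# With a(x) = x, the output should be exactly x @ W + b
print(np.allclose(y.numpy(), x @ W + b))  # True
```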
The docs also say that use_bias is True by default. I tried an example, and it solved my problem and gave me the correct weights.
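It was roughly along these lines (a minimal sketch on my part; the target function y = 2x + 1 and the training settings are just placeholders for what I actually ran):

```python
import numpy as np
from tensorflow import keras

# Recover the slope and intercept of y = 2x + 1 with a single Dense unit
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = 2.0 * x + 1.0

model = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(1),  # no activation -> linear, use_bias=True
])
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.1), loss="mse")
model.fit(x, y, epochs=200, verbose=0)

W, b = model.layers[0].get_weights()
print(W, b)  # should be close to [[2.]] and [1.]
```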
So what is the actual default activation function here, and how can I check which activation a layer is using?
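The only way I have found to inspect it is something like this, but I'm not sure it's the intended approach:

```python
from tensorflow import keras

layer = keras.layers.Dense(1)
print(layer.activation)                  # <function linear at ...>
print(layer.get_config()["activation"])  # 'linear'
```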
Thanks