
I read the documentation for Keras, and I found that when we omit the activation function, it is just a simple linear function:

activation: Activation function to use. If you don't specify anything, no activation is applied (ie. "linear" activation: a(x) = x).

But also, by default, the bias is True. I tried this example, and it solved my problem and gave me the correct weights:

[Image missing: the asker's code example and its output.]
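The original image is not available. Based on the comment thread below, the example presumably fit the linear relationship f(x) = 2x + 10 with a single Dense unit and no activation; a minimal sketch of such an example (the exact data, layer size, and training settings here are assumptions):

```python
import numpy as np
from tensorflow import keras

# Training data for the purely linear relationship y = 2x + 10
xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=float)
ys = 2 * xs + 10

# One Dense unit with no activation specified: its output is just w * x + b
# ("linear" activation, a(x) = x)
model = keras.Sequential([keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer="sgd", loss="mse")
model.fit(xs, ys, epochs=500, verbose=0)

# The learned parameters should end up close to w = 2 and b = 10
w, b = model.layers[0].get_weights()
print(w, b)
```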

So what actually is the default activation function here, and how can I detect it?

Thanks

  • It is a linear layer. Also, why do you think it cannot be solved by a linear Dense layer? – Frightera Apr 21 '22 at 17:31
  • I know it is linear, but is it just f(x) = x or f(x) = sum(x * w + b)? Because it looks like this. – Osama Mohammed Apr 21 '22 at 17:37
  • 1
    Documentation says `a(x) = x`, not `f(x) = x`. Because `f(x) = x` is the identity function, your problem here can be modeled as `f(x) = 2x + 10`, it is linear relationship so no non-linear activation is needed to solve it. – Frightera Apr 21 '22 at 17:43

1 Answer


If you don't specify an activation function, the value of a neuron will just be a weighted sum of the inputs plus a bias. Applying an activation function happens after that sum is calculated, so if you don't specify one, the output simply stays as the weighted sum.
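A quick way to check this is to recompute the layer's output by hand: with no activation, a Dense layer's output is exactly the affine transform x @ W + b. A small sketch (the shapes here are arbitrary):

```python
import numpy as np
from tensorflow import keras

layer = keras.layers.Dense(3)                # no activation specified
x = np.random.rand(4, 5).astype("float32")
y = layer(x)                                 # builds the layer and runs it

W, b = layer.get_weights()
# The output matches the plain weighted sum, with no function applied on top
print(np.allclose(y.numpy(), x @ W + b, atol=1e-6))  # True
```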

  • So it means something like this: sum(input * w + b)? – Osama Mohammed Apr 21 '22 at 17:57
  • Basically, yes. If you don't use any activation, it will be like that. If you use, let's say, sigmoid, then it will be sigmoid(sum(input * w + b)). The documentation part `a(x) = x` simply means the result before and after the activation is the same (so basically no activation function is applied). – D3nz13 Apr 21 '22 at 18:01
  • But this sum(x * w + b) also changes the weights? Why is it not considered an activation? – Osama Mohammed Apr 21 '22 at 18:04
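On the last comment above: the weighted sum sum(x * w + b) is the layer's trainable transformation (that is where the weights live and get updated during training), while an activation is a fixed, parameter-free function applied to that sum afterwards. A small sketch contrasting the two (layer sizes are arbitrary):

```python
import numpy as np
from tensorflow import keras

x = np.random.rand(2, 4).astype("float32")

linear = keras.layers.Dense(3)                         # activation: a(x) = x
sigmoid = keras.layers.Dense(3, activation="sigmoid")

y_lin = linear(x)                                      # the raw weighted sum
sigmoid.build(x.shape)
sigmoid.set_weights(linear.get_weights())              # copy the same w and b

# The sigmoid layer just squashes the same weighted sum; the activation
# itself has no weights of its own and is not trained
y_sig = sigmoid(x)
print(np.allclose(y_sig.numpy(), 1 / (1 + np.exp(-y_lin.numpy())), atol=1e-6))  # True
```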