
Why are periodic functions like sin(x) and cos(x) not used as activation functions in neural networks?

relu(x) = max(0, x) is used,

but

f(x) = sin(x) is not.
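
For concreteness, a minimal NumPy sketch of the two activations being compared (the function names are illustrative, not from any library):

```python
import numpy as np

def relu(x):
    # standard ReLU: zero for negative inputs, identity for positive
    return np.maximum(0.0, x)

def sine_activation(x):
    # the periodic alternative being asked about
    return np.sin(x)

x = np.linspace(-5.0, 5.0, 5)   # [-5, -2.5, 0, 2.5, 5]
print(relu(x))                  # [0. 0. 0. 2.5 5.]
print(sine_activation(x))       # periodic values in [-1, 1]
```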
Dhaval Taunk
Does this answer your question? [Can sin be used as activation in deep learning](https://stats.stackexchange.com/q/402618) – Gaurav Dhiman Mar 08 '20 at 16:11

1 Answer


From my point of view, the problem is that these functions produce the same output for many different inputs: if a neuron gets the input 0.1, the output is the same as for

0.1 + 2kπ

so the neuron reacts exactly the same over a wide range of input values. Plain ReLU clips negative values and that's it. Sigmoid or hyperbolic tangent saturate for large and small values, but sin or cos give you one value for 0.1, 0.1 + 2π, 0.1 + 4π, and a completely different value for 0.5, 0.5 + 2π, 0.5 + 4π...

It makes sense for a neuron to saturate and respond with some constant for very large or very small inputs, but the idea that a neuron responds identically to π/2 and π/2 + 1000π seems bad to me, because neurons in the next layer will also treat these results as the same, even though the sources were completely different (π/2 vs. π/2 + 1000π).
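
A quick numerical check illustrates this aliasing (a minimal NumPy sketch; the example inputs are arbitrary):

```python
import numpy as np

x = np.array([0.1, 0.5, np.pi / 2])

# Shift the inputs by whole periods of 2*pi and compare activations
for k in range(3):
    shifted = x + 2 * k * np.pi
    print(f"k={k}: sin  -> {np.round(np.sin(shifted), 4)}")
    print(f"      relu -> {np.round(np.maximum(0.0, shifted), 4)}")

# The sin outputs are identical for every k, so the next layer cannot
# tell 0.1 apart from 0.1 + 2*pi; the relu outputs stay distinct as
# the inputs grow.
```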

viceriel