I am currently trying to find a way to scale the slope of a sigmoid activation function within a neural network in MATLAB by some scalar value. For example, instead of using tanh(x) I would use tanh(3x) as my activation function (if this seems poorly motivated, it is in fulfillment of an assignment...). I know that I can write a custom activation function by modifying the apply.m file within the folder '+tansig', but I would like to know whether it is possible to circumvent this step by simply multiplying by the scalar 3 in an additional 'layer' of the network that comes right before the activation function is applied.
1 Answer
No, you will have to write your own activation function, e.g. @(x) logsig(3*x) (or @(x) tanh(3*x) for your tanh case), but you can assign it to the whole layer at once.
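As a minimal sketch of the apply.m route the question mentions, assuming the package-folder layout of recent toolbox versions (where +tansig/apply.m has the signature apply(n,param)); keep a backup of the shipped file, and note that whichever file in the same package computes the derivative needs the matching chain-rule factor for training to remain correct:

    % +tansig/apply.m -- sketched modification, not the shipped file
    function a = apply(n, param)
      % The original body computes tanh(n) in a numerically stable form:
      %   a = 2 ./ (1 + exp(-2*n)) - 1;
      % Scaling the net input by 3 yields tanh(3*n):
      a = 2 ./ (1 + exp(-2*(3*n))) - 1;
    end

If you save the scaled version under a new package name instead (say, a hypothetical '+tansig3' on the MATLAB path), the whole layer can then be switched to it with one assignment:

    net.layers{2}.transferFcn = 'tansig3';   % hypothetical custom name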
The term "layer" refers to a set of neurons that are equally deep in a network. Each input of a single neuron (within one layer) is summed up and the result is fed into the activation function. Therefore, you cannot add an input that multiplies the original inputs before applying the activation function. If you want to shift the input, you can alter the bias term, but still, it is an addition.
Note that scaling the activation function is usually not necessary, because the learned weights will do this automatically. Nevertheless, if the assignment requires it, just do the quick workaround with the anonymous function handle above.
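To see why the weights can absorb the factor, note that tanh(3*(w*x + b)) = tanh((3*w)*x + (3*b)), so training can learn the scaling by adjusting the weights and bias itself. A quick numerical check with arbitrary values:

    w = 0.7; b = -0.2; x = 1.5;          % arbitrary weight, bias, and input
    lhs = tanh(3*(w*x + b));             % slope-scaled activation
    rhs = tanh((3*w)*x + (3*b));         % factor folded into weights and bias
    abs(lhs - rhs)                       % zero up to floating-point round-off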
