I'm sorry, I've only just learned about neural networks and I don't yet understand their implementation. Suppose I want to build a backpropagation neural network that accepts several real numbers as input and produces two kinds of output: one is a real number, and the other is either a choice among A, B, and C, or just a choice between 0 and 1. What activation function should I use, and how do I structure and compute it?
- I'm not quite sure I understand your neural network setup. Do you want to have one output node that always produces a real number, and then another that optionally produces A/B/C instead of a value in [0,1]? – Alejandro Jun 22 '15 at 03:06
- Yes, like that. But I mentioned two cases: either the second output produces A/B/C, or the second output produces 0/1. – Yudh Jun 23 '15 at 16:13
- Is this possible if I use only backpropagation, or should I use a different kind of learning algorithm for each kind of output I need? – Yudh Jun 23 '15 at 16:27
- Is the decision to make the 2nd output node A/B/C based upon a numerical value? – Alejandro Jun 23 '15 at 16:34
- In the case where the 2nd output node's A/B/C choice is based on a numerical value, I think I can use the sigmoid transfer function and scale the result to the range of that numerical value. But in the case I want to know about, the choice at the 2nd output node isn't based on a numerical value. – Yudh Jun 25 '15 at 02:13
1 Answer
The activation function depends on the values of the input and output signals. Here are some examples of transfer functions: http://www.mathworks.com/help/nnet/ug/multilayer-neural-network-architecture.html. As I understand it, all your input and output values are positive numbers, so the purelin or logsig functions are probably the most suitable for your problem. When you form your input and output matrices, be careful with the ordering of the rows (the first row of the input matrix must correspond to the first row of the output matrix). Hope this helps.
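To make the mixed-output idea concrete, here is a minimal NumPy sketch (my own illustration, not from the answer above): one shared hidden layer with tanh, a linear (purelin-style) output head for the real number, and a softmax head over three units for the A/B/C choice. Backpropagation handles both heads at once by summing their error signals at the hidden layer. All names, sizes, and the single training example are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4 real inputs, 8 hidden units, 1 real output, 3 class scores (A/B/C).
n_in, n_hid, n_classes = 4, 8, 3
W1 = rng.normal(0.0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
Wr = rng.normal(0.0, 0.5, (n_hid, 1));    br = np.zeros(1)          # regression head
Wc = rng.normal(0.0, 0.5, (n_hid, n_classes)); bc = np.zeros(n_classes)  # class head

def softmax(z):
    e = np.exp(z - z.max())           # subtract max for numerical stability
    return e / e.sum()

def forward(x):
    h = np.tanh(x @ W1 + b1)          # shared hidden layer (tansig-style)
    y_real = h @ Wr + br              # linear head: any real number
    y_prob = softmax(h @ Wc + bc)     # softmax head: probabilities over A/B/C
    return h, y_real, y_prob

# Fit one made-up example: target real value 2.5, target class "B" (index 1).
x = rng.normal(size=n_in)
t_real, t_class = 2.5, 1
lr = 0.05
for _ in range(2000):
    h, y_real, y_prob = forward(x)
    # Output-side error signals: squared error for the real head,
    # cross-entropy for the softmax head (both give simple "prediction - target" forms).
    d_real = y_real - t_real
    d_class = y_prob.copy(); d_class[t_class] -= 1.0
    # Backpropagate both error signals through the shared hidden layer.
    d_h = Wr @ d_real + Wc @ d_class
    d_z1 = d_h * (1.0 - h * h)        # tanh derivative
    Wr -= lr * np.outer(h, d_real); br -= lr * d_real
    Wc -= lr * np.outer(h, d_class); bc -= lr * d_class
    W1 -= lr * np.outer(x, d_z1);   b1 -= lr * d_z1

_, y_real, y_prob = forward(x)
print(float(y_real[0]))                        # close to 2.5
print(["A", "B", "C"][int(np.argmax(y_prob))]) # "B"
```

The key point for the question: ordinary backpropagation works unchanged; only the output activations (and the matching loss terms) differ per head. For a 0/1 second output you would swap the softmax head for a single logsig (sigmoid) unit.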

– New In Programming
- Those activation functions produce real values. In my case, I want the first output to produce a real value and the second output to produce one of A/B/C, or alternatively just 0/1. Is this possible with backpropagation? – Yudh Jun 23 '15 at 16:24
- It is possible. If your values are real numbers, I think you can use the tansig transfer function for the first layer, but you have to scale all values between -1 and 1, or even better between -0.9 and 0.9. You can see how to do that here: http://www.mathworks.com/matlabcentral/answers/154075-how-to-scale-normalize-values-in-a-matrix-to-be-between-1-and-1. – New In Programming Jun 24 '15 at 07:01
- Also, if A, B, C are all positive values but their dependence on the input values is nonlinear, you should use the logsig transfer function in the second layer; if the dependence is linear, you can use the purelin transfer function. In the case of outputs 0 and 1, you can use logsig as the transfer function in the second layer. If you are new to neural networks, start with an MLP (Multi-Layer Perceptron) type of network first. – New In Programming Jun 24 '15 at 07:06
- What if A/B/C is not a number and the decision to choose among those values is not based upon a numerical value? Is that possible? – Yudh Jun 25 '15 at 02:52
- Can you give an example of the A/B/C types you want to use? Is this a classification task? – New In Programming Jun 25 '15 at 08:25