How to "design a back propagation neural network which can fit the function y = 9x + 3x^2 + 8x^3 + 2x^4 + 2 with 1 input, 1 output, 1 hidden layer with 4 neurons"?


This is way too broad of a topic. You should read some deep learning textbooks if you want to learn the fundamentals. – Marco Bonelli Jun 12 '22 at 13:19
2 Answers
Neural networks need some data to work on. So, create a dataset with y = f(x)
x      y
=========================================
0      9*0 + 3*0^2 + 8*0^3 + 2*0^4 + 2 =   2
1      9*1 + 3*1^2 + 8*1^3 + 2*1^4 + 2 =  24
2      9*2 + 3*2^2 + 8*2^3 + 2*2^4 + 2 = 128
...    ...
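
A minimal sketch of generating such a dataset in Python with NumPy; the sampling range [-2, 2] and the 1000 points are arbitrary choices for illustration, not part of the question:

    import numpy as np

    def f(x):
        # The target function from the question: y = 9x + 3x^2 + 8x^3 + 2x^4 + 2
        return 9*x + 3*x**2 + 8*x**3 + 2*x**4 + 2

    # Arbitrary sampling range and size, just for illustration
    x = np.linspace(-2, 2, 1000).reshape(-1, 1)  # shape (1000, 1): one input feature
    y = f(x)                                     # shape (1000, 1): one output value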
Then, using Keras, build a sequential model with an input, hidden, and output layer, as sketched below.
Follow keras.io for more details on how to build, compile, and train a model on a given dataset.
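
For instance, a minimal sketch with tf.keras; the activation, optimizer, loss, and epoch count here are assumptions, not prescribed by the question:

    import numpy as np
    from tensorflow import keras

    # 1 input -> hidden layer with 4 neurons -> 1 output
    model = keras.Sequential([
        keras.Input(shape=(1,)),
        keras.layers.Dense(4, activation="tanh"),  # hidden layer; activation is an assumption
        keras.layers.Dense(1),                     # linear output for regression
    ])
    model.compile(optimizer="adam", loss="mse")

    # x, y are the arrays generated above
    model.fit(x, y, epochs=500, verbose=0)
    print(model.predict(np.array([[1.0]])))  # compare against f(1) = 24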
More than that, if you already know f(x), then you don't need any model; you can apply it directly. The main objective of an NN, or any ML model, is to estimate f(x) for a given set of data points.

Given the constraints on the architecture, I expect you are actually being asked to hard-code some activation functions. Note that a tiny neural net with 4 hidden neurons will never learn a function like this well with just ReLUs/sigmoids.
However, if we give the i-th hidden unit the activation f_i(x) := x^i, then you will be able to learn this kind of function with your tiny network. You can even guess the weights manually!
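
A sketch of that idea in plain NumPy (per-unit activations are easiest to show by hand): give the i-th hidden unit the activation x^i, and the output weights can be read straight off the polynomial coefficients.

    import numpy as np

    # Hidden layer: 4 units, where unit i applies f_i(x) = x**i (input weight 1, bias 0)
    def hidden(x):
        return np.array([x**i for i in range(1, 5)])  # [x, x^2, x^3, x^4]

    # Output layer: linear, with weights read off y = 9x + 3x^2 + 8x^3 + 2x^4 + 2
    w_out = np.array([9.0, 3.0, 8.0, 2.0])
    b_out = 2.0

    def network(x):
        return w_out @ hidden(x) + b_out

    for x in (0.0, 1.0, 2.0):
        print(x, network(x))  # 2.0, 24.0, 128.0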
