
Is there a way (using just the R programming language) to implement an ANN algorithm with a custom learning function (instead of backpropagation)? All of the R packages I have tested (nnet, neuralnet, AMORE) offer a choice of learning functions for training the weights, but none of them seem to let you plug in a custom function (say, hill climbing as an example).

I'd prefer to use R over another language, so if anybody knows of any package that can help, let me know.

Thanks!

jfalkson
  • Note that `nnet` doesn't use backprop; it uses a quasi-Newton algorithm (basically BFGS) to optimise the weights. – Hong Ooi Mar 03 '14 at 06:12

1 Answer


OBSERVATION:

Hill climbing is an optimization algorithm that searches among neighbouring candidate solutions, whereas backpropagation is a training algorithm. ANN packages typically use training methods that adjust the weights based on the error between the network's output and the target output; they do not optimize by replacing weights with those of a better neighbour. That's why you'll only find options for learning functions that train the weights, but none for training via hill climbing or similar methods. This is by design.

SOLUTION:

Implement the mathematics yourself: represent the input vectors, output vectors, and weight matrices directly, compute the forward pass with matrix operations, and iterate on the weights with your custom optimizer (hill climbing or otherwise). This works in any language, including R.
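As a minimal sketch of that idea (my own illustration, not from any package; all function and variable names are made up): a one-hidden-layer network whose weights are trained by random-mutation hill climbing, where a perturbation to the weights is kept only if it lowers the loss.

```r
set.seed(42)

# Forward pass: X is an input matrix; W1, W2 are weight matrices
# whose first row acts as a bias (hence the cbind(1, ...) columns).
forward <- function(X, W1, W2) {
  H <- tanh(cbind(1, X) %*% W1)   # hidden layer activations
  cbind(1, H) %*% W2              # linear output layer
}

mse <- function(Y, Yhat) mean((Y - Yhat)^2)

# Hill climbing: perturb all weights with Gaussian noise and keep
# the candidate only when the loss improves.
hill_climb_train <- function(X, Y, n_hidden = 5, iters = 5000, step = 0.1) {
  W1 <- matrix(rnorm((ncol(X) + 1) * n_hidden, sd = 0.5), ncol = n_hidden)
  W2 <- matrix(rnorm((n_hidden + 1) * ncol(Y), sd = 0.5), ncol = ncol(Y))
  best <- mse(Y, forward(X, W1, W2))
  for (i in seq_len(iters)) {
    C1 <- W1 + matrix(rnorm(length(W1), sd = step), nrow = nrow(W1))
    C2 <- W2 + matrix(rnorm(length(W2), sd = step), nrow = nrow(W2))
    loss <- mse(Y, forward(X, C1, C2))
    if (loss < best) { W1 <- C1; W2 <- C2; best <- loss }
  }
  list(W1 = W1, W2 = W2, loss = best)
}

# Toy example: learn XOR
X <- matrix(c(0,0, 0,1, 1,0, 1,1), ncol = 2, byrow = TRUE)
Y <- matrix(c(0, 1, 1, 0), ncol = 1)
fit <- hill_climb_train(X, Y)
round(forward(X, fit$W1, fit$W2), 2)
```

Swapping in a different optimizer (simulated annealing, a genetic algorithm, etc.) only means changing how the candidate weights `C1`, `C2` are generated and accepted; the forward pass stays the same.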

If you'd rather not implement it from scratch, have a look at a simple hill-climbing implementation in MATLAB. I'm sure it could be rewritten in R.

SACn