
To fit a classification model in R, I have been using library(kerasR). To control the learning rate, kerasR says to call:

compile(optimizer = Adam(lr = 0.001, beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-08,
                         decay = 0, clipnorm = -1, clipvalue = -1),
        loss    = 'binary_crossentropy',
        metrics = c('categorical_accuracy'))

But it gives me an error like this:

Error in modules$keras.optimizers$Adam(lr = lr, beta_1 = beta_2, beta_2 = beta_2, : attempt to apply non-function

I also tried keras_compile and still get the same error. I can change the optimizer in compile, but the largest learning rate it accepts is 0.01, and I want to try 0.2.

model <- keras_model_sequential()

model %>%
  layer_dense(units = 512, activation = 'relu', input_shape = ncol(X_train)) %>%
  layer_dropout(rate = 0.2) %>%
  layer_dense(units = 128, activation = 'relu') %>%
  layer_dropout(rate = 0.1) %>%
  layer_dense(units = 2, activation = 'sigmoid') %>%
  compile(
    optimizer = 'Adam',
    loss      = 'binary_crossentropy',
    metrics   = c('categorical_accuracy')
  )
iHermes

1 Answer


I think the issue is that you are mixing two different libraries, kerasR and keras. You should use only one of them. keras_model_sequential() is from keras, while the Adam() function you are calling is from kerasR. You can find the differences between the two libraries here: https://www.datacamp.com/community/tutorials/keras-r-deep-learning#differences
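
If you wanted to stay with kerasR instead, an all-kerasR version would look roughly like the sketch below, which builds the model with kerasR's object-based API and compiles it with keras_compile and kerasR's own Adam. This follows the style of the kerasR vignette and is not from the original question, so treat it as an illustration:

library(kerasR)

# Build the same architecture with kerasR's Sequential/add pattern
model <- Sequential()
model$add(Dense(units = 512, input_shape = ncol(X_train)))
model$add(Activation('relu'))
model$add(Dropout(rate = 0.2))
model$add(Dense(units = 128))
model$add(Activation('relu'))
model$add(Dropout(rate = 0.1))
model$add(Dense(units = 2))
model$add(Activation('sigmoid'))

# kerasR's Adam() works here because every piece comes from the same library
keras_compile(model,
              optimizer = Adam(lr = 0.001),
              loss      = 'binary_crossentropy',
              metrics   = c('categorical_accuracy'))

Otherwise, the keras-only route is below.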

The following code works for me, using only the keras library.

library(keras)
model <- keras_model_sequential()

model %>%
  layer_dense(units = 512, activation = 'relu', input_shape = ncol(X_train)) %>%
  layer_dropout(rate = 0.2) %>%
  layer_dense(units = 128, activation = 'relu') %>%
  layer_dropout(rate = 0.1) %>%
  layer_dense(units = 2, activation = 'sigmoid') %>%
  compile(
    optimizer = optimizer_adam(lr = 0.2),
    loss      = 'binary_crossentropy',
    metrics   = c('accuracy')
  )
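
Once compiled, the model is trained with keras's fit() in the usual way. A minimal sketch (epochs and batch_size are placeholder values; X_train and a two-column one-hot y_train are assumed from the question):

model %>% fit(
  x = as.matrix(X_train),   # features as a numeric matrix
  y = y_train,              # one-hot labels with two columns
  epochs = 10,
  batch_size = 32,
  validation_split = 0.2
)

Note that newer versions of the keras R package renamed the lr argument of optimizer_adam() to learning_rate; lr still works there but triggers a deprecation warning.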

Reza
  • Is there a way to handle imbalanced data? Negative/Positive = 4.3. Somewhere it says bias_initializer = log(Negative/Positive), but I cannot assign a plain number to the bias. – iHermes Aug 30 '20 at 03:16
  • I am not sure how imbalance is related to initialization? Is this for the last layer? – Reza Aug 30 '20 at 04:31
  • Not exactly; I was looking for a way to deal with the imbalanced-data problem, and a post mentioned initialization. – iHermes Aug 30 '20 at 04:57
  • If you want to change the bias initializer of the last layer: `layer_dense(units = 2, activation = 'sigmoid', bias_initializer = initializer_constant(log(Negative/Positive)))` (a fuller sketch follows after these comments) – Reza Aug 30 '20 at 07:38
  • Can you take a look at this? https://stackoverflow.com/questions/63696288/stabilize-neural-network-prediction-for-class-probability – iHermes Sep 02 '20 at 00:33
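
To make the bias-initializer suggestion from the comments concrete, here is a minimal sketch. Negative and Positive stand in for the class counts; the values below are hypothetical, chosen so Negative/Positive ≈ 4.3 as in the comment:

library(keras)

# Hypothetical class counts with Negative/Positive ≈ 4.3
Negative <- 4300
Positive <- 1000

model <- keras_model_sequential() %>%
  layer_dense(units = 512, activation = 'relu', input_shape = ncol(X_train)) %>%
  layer_dense(units = 2, activation = 'sigmoid',
              # start the output bias at log(Negative/Positive), per the comment
              bias_initializer = initializer_constant(log(Negative / Positive)))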