
Are there any plans to implement a leaky ReLU in the Deep Learning module of H2O? I am a beginner with neural nets, but in the limited amount of model building and parameter tuning I have done, I have found ReLUs to generalize better, and I was wondering whether even better performance might be obtained by using leaky ReLUs to avoid the dying ReLU problem.

bio.rf
    This is not a coding question and doesn't belong on Stack Overflow. If you have questions about the H2O roadmap, you can send an email to https://groups.google.com/forum/#!forum/h2ostream – Erin LeDell Sep 12 '17 at 17:45
  • Thank you for pointing out the google group – bio.rf Sep 12 '17 at 20:51

1 Answer


This is not a direct answer to your question, because the product roadmap is not really something we can comment on. However, if you are worried about the dying ReLU problem in H2O, why not use ExpRectifier, which is H2O's exponential linear unit (ELU) and does not suffer from the dying ReLU problem? In fact, this paper reports that ELU outperforms the ReLU variants. The only drawback is that it is more computationally expensive, since it involves an exponential in the calculation.
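
As a rough sketch of how this could be tried from R (this is only an illustration, assuming an H2O build/version that actually exposes the ExpRectifier activation; h2o.deeplearning() and its activation parameter are real, but support for the "ExpRectifier" value may depend on your version, and the iris data and layer sizes below are arbitrary):

    library(h2o)
    h2o.init()

    # Built-in demo data so the example is self-contained
    iris_hex <- as.h2o(iris)

    model <- h2o.deeplearning(
      x = 1:4,                      # predictor columns
      y = "Species",                # response column
      training_frame = iris_hex,
      activation = "ExpRectifier",  # ELU-style unit; may raise an error if your version does not support it
      hidden = c(32, 32),
      epochs = 20
    )

If that activation string is rejected, it is a sign your installed version only offers the standard Rectifier/Tanh/Maxout options.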

Lan
  • Thank you for the pointer regarding the use of ExpRectifier. In the Flow UI, I do not see an option for ExpRectifier. I only see Linear, Tanh and MaxOut, along with their dropout counterparts. I am using version 3.14.0.2, installed via R. – bio.rf Sep 13 '17 at 22:59