I came across the following explanation of xavier_initializer(). It says var(w_i) = 1/N_avg, where N_avg is based on the number of input neurons in a practical implementation.

https://prateekvjoshi.com/2016/03/29/understanding-xavier-initialization-in-deep-neural-networks/

However, in the following example there are no neurons. I calculated the variance of W. Does anybody know how its variance is determined by xavier_initializer()? Thanks!

$ cat main.py
#!/usr/bin/env python
# vim: set noexpandtab tabstop=2 shiftwidth=2 softtabstop=-1 fileencoding=utf-8:

import tensorflow as tf
W = tf.get_variable("W", shape=[5], initializer=tf.contrib.layers.xavier_initializer())
init = tf.global_variables_initializer()
import numpy
with tf.Session() as sess:
    sess.run(init)
    print(numpy.var(W.eval()))
$ ./main.py 
0.166031

1 Answer

If I modify your code like this, do you still have the problem?

import numpy as np
import tensorflow as tf

W = tf.get_variable("W", shape=[5], initializer=tf.contrib.layers.xavier_initializer())
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    w_value = sess.run(W)
    print(w_value)
    print(np.var(w_value))
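For what it's worth, the number in the question is consistent with Xavier's uniform scheme. A sketch in plain NumPy, assuming (based on TF's internal fan computation) that a 1-D shape like [5] is treated as fan_in = fan_out = shape[0]:

```python
import numpy as np

# Assumption: for a 1-D variable of shape [5], TF uses
# fan_in = fan_out = 5 when computing the Xavier bounds.
fan_in = fan_out = 5

# xavier_initializer() defaults to uniform sampling in [-limit, limit]
# with limit = sqrt(6 / (fan_in + fan_out)).
limit = np.sqrt(6.0 / (fan_in + fan_out))

# Variance of U(-limit, limit) is limit**2 / 3 = 6/10/3 = 0.2.
theoretical_var = 6.0 / (fan_in + fan_out) / 3.0

# numpy.var uses the biased estimator (ddof=0), so the *expected*
# sample variance of only n = 5 draws is 0.2 * (n-1)/n = 0.16,
# which is why a single run can print something like 0.166.
rng = np.random.default_rng(0)
samples = rng.uniform(-limit, limit, size=(100_000, 5))

print(theoretical_var)             # approx. 0.2
print(samples.var(axis=1).mean())  # approx. 0.16
```

So the 0.166031 in the question is one noisy draw of that biased 5-sample variance, scattered around 0.16, not a direct readout of the distribution's variance 0.2.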