What Is The Default Variable Initializer In Tensorflow?
What is the default method of variable initialization used when tf.get_variable() is called without any specification for the initializer? The docs just say 'None'.
Solution 1:
If initializer is None (the default), the default initializer passed in the variable scope will be used. If that one is None too, a glorot_uniform_initializer will be used.
The glorot_uniform_initializer function initializes values from a uniform distribution.
This function is documented as:
The Glorot uniform initializer, also called Xavier uniform initializer.
It draws samples from a uniform distribution within [-limit, limit], where limit is sqrt(6 / (fan_in + fan_out)), fan_in is the number of input units in the weight tensor, and fan_out is the number of output units in the weight tensor.

Reference: http://jmlr.org/proceedings/papers/v9/glorot10a/glorot10a.pdf
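To make the formula concrete, here is a minimal pure-Python sketch of Glorot/Xavier uniform initialization (not TensorFlow's actual implementation; the function name and shapes are illustrative):

```python
import math
import random

def glorot_uniform(fan_in, fan_out, size):
    """Sketch of Glorot/Xavier uniform initialization: draw `size`
    samples from U[-limit, limit], limit = sqrt(6 / (fan_in + fan_out))."""
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [random.uniform(-limit, limit) for _ in range(size)]

# For a weight matrix with 128 inputs and 64 outputs,
# the bound is sqrt(6 / 192) ≈ 0.1768.
limit = math.sqrt(6.0 / (128 + 64))
samples = glorot_uniform(128, 64, size=1000)
assert all(-limit <= s <= limit for s in samples)
```

Keeping the variance of activations roughly constant across layers is the motivation for scaling the bound by both fan_in and fan_out, as described in the Glorot & Bengio paper linked above.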