
What Is The Default Variable Initializer In Tensorflow?

What is the default variable initialization method used when tf.get_variable() is called without specifying an initializer? The docs just say 'None'.

Solution 1:

From the documentation:

If initializer is None (the default), the default initializer passed in the variable scope will be used. If that one is None too, a glorot_uniform_initializer will be used.

The glorot_uniform_initializer function initializes values from a uniform distribution.
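For illustration, here is a minimal sketch (assuming TensorFlow 1.x, or the tf.compat.v1 module in TensorFlow 2.x; the shape [256, 128] is just an example) showing that omitting the initializer is equivalent to passing tf.glorot_uniform_initializer() explicitly:

```python
import tensorflow.compat.v1 as tf  # or `import tensorflow as tf` on TF 1.x
tf.disable_v2_behavior()

# No initializer given: tf.get_variable falls back to glorot_uniform_initializer.
w_default = tf.get_variable("w_default", shape=[256, 128])

# Passing the same initializer explicitly gives an equivalently distributed variable.
w_explicit = tf.get_variable(
    "w_explicit",
    shape=[256, 128],
    initializer=tf.glorot_uniform_initializer(),
)
```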

This function is documented as:

The Glorot uniform initializer, also called Xavier uniform initializer.

It draws samples from a uniform distribution within [-limit, limit], where limit is sqrt(6 / (fan_in + fan_out)), fan_in is the number of input units in the weight tensor, and fan_out is the number of output units in the weight tensor.

Reference: http://jmlr.org/proceedings/papers/v9/glorot10a/glorot10a.pdf
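As a quick sanity check (again a sketch assuming tf.compat.v1 and an example shape of [256, 128]), the sampled values of a variable created without an initializer should stay within the Glorot limit sqrt(6 / (fan_in + fan_out)):

```python
import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# Created without an initializer, so glorot_uniform_initializer is used.
w = tf.get_variable("w", shape=[256, 128])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    values = sess.run(w)

fan_in, fan_out = 256, 128                 # input and output units of the weight tensor
limit = np.sqrt(6.0 / (fan_in + fan_out))  # 0.125 for this shape
print("observed range:", values.min(), values.max())
print("expected bound: +/-", limit)
assert np.all(np.abs(values) <= limit)
```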
