The keyword arguments used for passing initializers to layers will depend on the layer. Usually it is simply kernel_initializer and bias_initializer:
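For example, a Dense layer accepts both keyword arguments (a minimal sketch; the layer size of 64 is arbitrary and an existing model is assumed):

    from keras.layers import Dense

    # assumes an existing Sequential model named `model`
    model.add(Dense(64,
                    kernel_initializer='random_uniform',
                    bias_initializer='zeros'))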

The following built-in initializers are available as part of the keras.initializers module:

Initializer

    keras.initializers.Initializer()

Initializer base class: all initializers inherit from this class.


Zeros

    keras.initializers.Zeros()

Initializer that generates tensors initialized to 0.

Ones

    keras.initializers.Ones()

Initializer that generates tensors initialized to 1.


Constant

    keras.initializers.Constant(value=0)

Initializer that generates tensors initialized to a constant value.

Arguments

• value: float; the value of the generated tensors.
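A common use is starting biases at a small positive value (a sketch; the value 0.1 is arbitrary and an existing model is assumed):

    from keras import initializers
    from keras.layers import Dense

    # every bias in this layer starts at 0.1 instead of the default 0
    model.add(Dense(64, bias_initializer=initializers.Constant(value=0.1)))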

RandomNormal

    keras.initializers.RandomNormal(mean=0.0, stddev=0.05, seed=None)

Initializer that generates tensors with a normal distribution.

Arguments

• mean: a python scalar or a scalar tensor. Mean of the random values to generate.
• stddev: a python scalar or a scalar tensor. Standard deviation of the random values to generate.
• seed: A Python integer. Used to seed the random generator.

RandomUniform

    keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)

Initializer that generates tensors with a uniform distribution.

Arguments

• minval: A python scalar or a scalar tensor. Lower bound of the range of random values to generate.
• maxval: A python scalar or a scalar tensor. Upper bound of the range of random values to generate. Defaults to 1 for float types.
• seed: A Python integer. Used to seed the random generator.

TruncatedNormal

    keras.initializers.TruncatedNormal(mean=0.0, stddev=0.05, seed=None)

Initializer that generates a truncated normal distribution.

These values are similar to values from a RandomNormal, except that values more than two standard deviations from the mean are discarded and redrawn. This is the recommended initializer for neural network weights and filters.

Arguments

• mean: a python scalar or a scalar tensor. Mean of the random values to generate.
• stddev: a python scalar or a scalar tensor. Standard deviation of the random values to generate.
• seed: A Python integer. Used to seed the random generator.
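Since this is the recommended initializer for weights and filters, a typical use looks like the following (a sketch; the parameters shown are just the defaults and an existing model is assumed):

    from keras import initializers
    from keras.layers import Conv2D

    # 3x3 convolution kernels drawn from a truncated normal distribution
    model.add(Conv2D(32, (3, 3),
                     kernel_initializer=initializers.TruncatedNormal(mean=0.0, stddev=0.05)))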

VarianceScaling

    keras.initializers.VarianceScaling(scale=1.0, mode='fan_in', distribution='normal', seed=None)

Initializer capable of adapting its scale to the shape of weights.

With distribution="normal", samples are drawn from a truncated normal distribution centered on zero, with stddev = sqrt(scale / n) where n is:

• number of input units in the weight tensor, if mode = "fan_in"
• number of output units, if mode = "fan_out"
• average of the numbers of input and output units, if mode = "fan_avg"

With distribution="uniform", samples are drawn from a uniform distribution within [-limit, limit], with limit = sqrt(3 * scale / n).

Arguments

• scale: Scaling factor (positive float).
• mode: One of "fan_in", "fan_out", "fan_avg".
• distribution: Random distribution to use. One of "normal", "uniform".
• seed: A Python integer. Used to seed the random generator.

Raises

• ValueError: In case of an invalid value for the "scale", "mode" or "distribution" arguments.
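The fan-based initializers described below can be expressed as special cases of VarianceScaling. As an illustration (a sketch, assuming the stddev/limit formulas given for glorot_uniform and he_normal in this document):

    from keras import initializers

    # matches glorot_uniform: limit = sqrt(3 * 1.0 / ((fan_in + fan_out) / 2))
    #                               = sqrt(6 / (fan_in + fan_out))
    glorot_like = initializers.VarianceScaling(scale=1.0, mode='fan_avg',
                                               distribution='uniform')

    # matches he_normal: stddev = sqrt(2.0 / fan_in)
    he_like = initializers.VarianceScaling(scale=2.0, mode='fan_in',
                                           distribution='normal')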

Orthogonal

    keras.initializers.Orthogonal(gain=1.0, seed=None)

Initializer that generates a random orthogonal matrix.

Arguments

• gain: Multiplicative factor to apply to the orthogonal matrix.
• seed: A Python integer. Used to seed the random generator.


Identity

    keras.initializers.Identity(gain=1.0)

Initializer that generates the identity matrix.

Only use for 2D matrices. If the desired matrix is not square, it gets padded with zeros for the additional rows/columns.

Arguments

• gain: Multiplicative factor to apply to the identity matrix.
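For example (a sketch; the layer sizes are arbitrary and an existing model whose previous layer outputs 64 units is assumed):

    from keras import initializers
    from keras.layers import Dense

    # the 64x64 kernel starts as the (scaled) identity matrix
    model.add(Dense(64, kernel_initializer=initializers.Identity(gain=1.0)))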
lecun_uniform

    keras.initializers.lecun_uniform(seed=None)

LeCun uniform initializer.

It draws samples from a uniform distribution within [-limit, limit] where limit is sqrt(3 / fan_in) where fan_in is the number of input units in the weight tensor.

Arguments

• seed: A Python integer. Used to seed the random generator.

Returns

An initializer.


glorot_normal

    keras.initializers.glorot_normal(seed=None)

Glorot normal initializer, also called Xavier normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)) where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.

Arguments

• seed: A Python integer. Used to seed the random generator.

Returns

An initializer.


glorot_uniform

    keras.initializers.glorot_uniform(seed=None)

Glorot uniform initializer, also called Xavier uniform initializer.

It draws samples from a uniform distribution within [-limit, limit] where limit is sqrt(6 / (fan_in + fan_out)) where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.

Arguments

• seed: A Python integer. Used to seed the random generator.

Returns

An initializer.
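As a quick numeric check of the limit formula (a sketch with arbitrary layer sizes):

    import math

    fan_in, fan_out = 256, 64  # e.g. a Dense layer mapping 256 inputs to 64 units
    limit = math.sqrt(6 / (fan_in + fan_out))
    print(limit)  # ~0.137, so weights are drawn uniformly from [-0.137, 0.137]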


he_normal

    keras.initializers.he_normal(seed=None)

He normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / fan_in) where fan_in is the number of input units in the weight tensor.

Arguments

• seed: A Python integer. Used to seed the random generator.

Returns

An initializer.


lecun_normal

    keras.initializers.lecun_normal(seed=None)

LeCun normal initializer.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(1 / fan_in) where fan_in is the number of input units in the weight tensor.

Arguments

• seed: A Python integer. Used to seed the random generator.

Returns

An initializer.


he_uniform

    keras.initializers.he_uniform(seed=None)

He uniform variance scaling initializer.

It draws samples from a uniform distribution within [-limit, limit] where limit is sqrt(6 / fan_in) where fan_in is the number of input units in the weight tensor.

Arguments

• seed: A Python integer. Used to seed the random generator.

Returns

An initializer.


An initializer may be passed as a string (it must match one of the available initializers above), as a callable, or as an Initializer instance:

    from keras import initializers

    model.add(Dense(64, kernel_initializer=initializers.random_normal(stddev=0.01)))

    # also works; will use the default parameters
    model.add(Dense(64, kernel_initializer='random_normal'))

Using custom initializers

If passing a custom callable, then it must take the arguments shape (shape of the variable to initialize) and dtype (dtype of generated values):
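For example, a minimal custom initializer might look like this (a sketch; `my_init` is an illustrative name and an existing model is assumed):

    from keras import backend as K
    from keras.layers import Dense

    # `shape` is the shape of the variable to initialize,
    # `dtype` the dtype of the generated values
    def my_init(shape, dtype=None):
        return K.random_normal(shape, dtype=dtype)

    model.add(Dense(64, kernel_initializer=my_init))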