dense

tfsnippet.layers.dense(*args, **kwargs)

Fully-connected layer.

Roughly speaking, the dense layer is defined as:

output = activation_fn(
    normalizer_fn(tf.matmul(input, weight_norm_fn(kernel)) + bias))
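
As a hand-rolled sketch, the computation above with weight_norm and normalizer_fn disabled is roughly equivalent to the following plain TensorFlow 1.x code (the input shape, unit count, and tf.nn.relu activation are illustrative assumptions, not part of the API):

    import tensorflow as tf

    # Sketch of the dense computation without normalizer_fn / weight_norm;
    # not the actual implementation.
    input = tf.placeholder(tf.float32, [None, 784])  # at least 2-d
    units = 100
    kernel = tf.get_variable('kernel', shape=[784, units], dtype=tf.float32)
    bias = tf.get_variable('bias', shape=[units], dtype=tf.float32,
                           initializer=tf.zeros_initializer())
    output = tf.nn.relu(tf.matmul(input, kernel) + bias)  # activation_fn
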
Parameters:
  • input (Tensor) – The input tensor, at least 2-d.
  • units (int) – Number of output units.
  • activation_fn – The activation function.
  • normalizer_fn – The normalizer function.
  • weight_norm (bool or (tf.Tensor) -> tf.Tensor) –

    If True, apply weight_norm() on kernel. use_scale will be True if normalizer_fn is not specified, and False otherwise. The axis reduction will be determined by the layer. (See the usage sketch at the end of this section.)

    If it is a callable function, it will be used to normalize kernel instead of weight_norm(). In this case, the user must ensure the correctness of the axis reduction themselves.

  • gated (bool) – Whether or not to apply a gate on the output, i.e., output = activation_fn(output) * sigmoid(gate). (See the usage sketch at the end of this section.)
  • gate_sigmoid_bias (Tensor) – The bias added to gate before applying the sigmoid activation.
  • kernel (Tensor) – Use this tensor as the kernel, instead of creating a new variable.
  • kernel_initializer – The initializer for kernel. Defaults to default_kernel_initializer(...) if not specified.
  • kernel_regularizer – The regularizer for kernel.
  • kernel_constraint – The constraint for kernel.
  • use_bias (bool or None) – Whether or not to use bias. If True, bias will always be used. If None, bias will be used only if normalizer_fn is not given. If False, bias will never be used. Default is None.
  • bias (Tensor) – Use this tensor as the bias, instead of creating a new variable.
  • bias_initializer – The initializer for bias.
  • bias_regularizer – The regularizer for bias.
  • bias_constraint – The constraint for bias.
  • trainable (bool) – Whether or not the variables are trainable.
  • name (str) – Default name of the variable scope; it will be uniquified. If not specified, a name is generated according to the class name.
  • scope (str) – The name of the variable scope.
Returns:
    The output tensor.

Return type:
    tf.Tensor
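
A minimal usage sketch, assuming a TensorFlow 1.x graph-mode setup (the shapes, unit counts, and activations below are illustrative, not prescribed by the API):

    import tensorflow as tf
    import tfsnippet as spt

    x = tf.placeholder(tf.float32, [None, 784])

    # Plain fully-connected layer with ReLU activation.
    h = spt.layers.dense(x, units=500, activation_fn=tf.nn.relu)

    # Weight-normalized kernel; since normalizer_fn is not specified,
    # use_scale will be True.
    h = spt.layers.dense(h, units=500, activation_fn=tf.nn.leaky_relu,
                         weight_norm=True)

    # Gated output: activation_fn(output) * sigmoid(gate + gate_sigmoid_bias).
    y = spt.layers.dense(h, units=10, activation_fn=tf.nn.tanh, gated=True)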