Module activation

§Activation Layers

Users who want a selectable activation function should consider Activation, which provides an abstraction over Gelu, HardSigmoid, LeakyRelu, PRelu, Relu, Sigmoid, SwiGlu, and Tanh.

The GLU activation layer has shape-changing behavior that is not compatible with this common API, so it is not included in the abstraction wrappers.
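
For illustration, a minimal usage sketch of the wrapper (not taken from this documentation): it assumes the burn crate with the ndarray backend feature enabled, that ActivationConfig exposes a unit-like Relu variant, and that init and forward follow Burn's usual Config/Module pattern; the import paths, variant shapes, and signatures are assumptions and may differ between versions.

```rust
// Hypothetical sketch, not verified against a specific Burn release:
// the ActivationConfig variant shape and the import paths are assumptions.
use burn::backend::NdArray;
use burn::nn::activation::{Activation, ActivationConfig};
use burn::tensor::Tensor;

fn main() {
    type B = NdArray;
    let device = Default::default();

    // Pick the activation at configuration time; swapping the variant swaps
    // the function without touching the rest of the model.
    let config = ActivationConfig::Relu; // assumed unit-like variant
    let layer: Activation<B> = config.init(&device);

    // All wrapped activations are element-wise, so the output shape matches
    // the input shape.
    let input = Tensor::<B, 2>::from_floats([[-1.0, 0.5], [2.0, -0.25]], &device);
    let output = layer.forward(input);
    println!("{output}");
}
```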

Structs§

GLU
Applies the gated linear unit function.
Gelu
Applies the Gaussian Error Linear Units function element-wise. See also gelu.
HardSigmoid
Hard Sigmoid layer.
HardSigmoidConfig
Configuration to create a Hard Sigmoid layer using the init function.
LeakyRelu
Leaky ReLU layer.
LeakyReluConfig
Configuration to create a Leaky ReLU layer using the init function.
PRelu
Parametric ReLU layer.
PReluConfig
Configuration to create a Parametric ReLU layer using the init function.
PReluRecord
The record type for the module.
PReluRecordItem
The record item type for the module.
Relu
Applies the rectified linear unit function element-wise. See also relu.
Sigmoid
Applies the sigmoid function element-wise. See also sigmoid.
SwiGlu
Applies the SwiGLU, or Swish Gated Linear Unit, to the input tensor. The SwiGLU activation function is defined as: SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer). A construction sketch follows this struct list.
SwiGluConfig
Configuration to create a SwiGlu activation layer using the init function.
SwiGluRecord
The record type for the module.
SwiGluRecordItem
The record item type for the module.
Tanh
Applies the tanh activation function element-wise. See also tanh.
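
As referenced in the SwiGlu entry above, here is a minimal construction sketch (an illustration, not verified against a specific Burn release): it assumes the burn crate with the ndarray backend feature, a SwiGluConfig::new(d_input, d_output) constructor, and Burn's usual init/forward pattern; the import paths and signatures are assumptions and may differ between versions.

```rust
// Hypothetical sketch: import paths and config constructor are assumptions.
use burn::backend::NdArray;
use burn::nn::{SwiGlu, SwiGluConfig};
use burn::tensor::Tensor;

fn main() {
    type B = NdArray;
    let device = Default::default();

    // Two linear projections from d_input = 4 to d_output = 8 back the gate.
    let swiglu: SwiGlu<B> = SwiGluConfig::new(4, 8).init(&device);

    // SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer),
    // so the last dimension changes from d_input to d_output.
    let x = Tensor::<B, 2>::from_floats([[0.1, -0.2, 0.3, -0.4]], &device);
    let y = swiglu.forward(x);
    assert_eq!(y.dims(), [1, 8]);
}
```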

Enums§

Activation
Activation Layer Wrapper.
ActivationConfig
Activation Configuration.
ActivationRecord
The record type for the module.
ActivationRecordItem
The record item type for the module.