Module activation
Activation Layers
Users who want a selectable activation function should consider Activation, which provides an abstraction over the activation layers in this module (a minimal usage sketch of the shared forward pattern follows this paragraph).
The GLU activation layer has shape-changing behavior that is not compatible with the common API, and is therefore not included in the abstraction wrapper.
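As a minimal sketch of that shared pattern, assuming a Burn-style API (the burn::nn import paths, the NdArray backend, and the small example tensor are illustrative assumptions, not requirements):

```rust
use burn::backend::NdArray;
use burn::nn::{Gelu, Relu};
use burn::tensor::Tensor;

fn main() {
    let device = Default::default();

    // A small 2 x 3 input; element-wise activations preserve this shape.
    let x: Tensor<NdArray, 2> =
        Tensor::from_floats([[-1.0, 0.0, 2.0], [3.0, -0.5, 1.5]], &device);

    // Each element-wise layer exposes the same forward(Tensor) -> Tensor call,
    // which is the common API the Activation wrapper abstracts over.
    let relu_out = Relu::new().forward(x.clone());
    let gelu_out = Gelu::new().forward(x);

    println!("{relu_out}");
    println!("{gelu_out}");
}
```

GLU, by contrast, splits its input along a dimension and halves that dimension's size, which is the shape change that keeps it outside the wrapper.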
Structs
- GLU - Applies the gated linear unit function.
- Gelu - Applies the Gaussian Error Linear Units function element-wise. See also gelu.
- HardSigmoid - Hard Sigmoid layer.
- HardSigmoidConfig - Configuration to create a Hard Sigmoid layer using the init function.
- LeakyRelu - Leaky ReLu layer.
- LeakyReluConfig - Configuration to create a Leaky Relu layer using the init function.
- PRelu - Parametric Relu layer.
- PReluConfig - Configuration to create a Parametric Relu layer using the init function.
- PReluRecord - The record type for the module.
- PReluRecordItem - The record item type for the module.
- Relu - Applies the rectified linear unit function element-wise. See also relu.
- Sigmoid - Applies the sigmoid function element-wise. See also sigmoid.
- SwiGlu - Applies the SwiGLU or Swish Gated Linear Unit to the input tensor. The SwiGLU activation function is defined as: SwiGLU(x) = Swish(W_inner * x + b_inner) * (W_outer * x + b_outer). (A usage sketch follows this list.)
- SwiGluConfig - Configuration to create a SwiGlu activation layer using the init function.
- SwiGluRecord - The record type for the module.
- SwiGluRecordItem - The record item type for the module.
- Tanh - Applies the tanh activation function element-wise. See also tanh.
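As referenced in the SwiGlu entry above, here is a rough usage sketch connecting the formula to the config/init pattern this listing describes; the SwiGluConfig::new(d_input, d_output) arguments, the init call, and the import paths are assumptions, not confirmed signatures:

```rust
use burn::backend::NdArray;
use burn::nn::SwiGluConfig;
use burn::tensor::Tensor;

fn main() {
    let device = Default::default();

    // Assumed constructor: 8 input features projected to 16 output features.
    // The layer owns the two linear projections (W_inner, b_inner) and
    // (W_outer, b_outer) from the formula in the SwiGlu entry above.
    let swiglu = SwiGluConfig::new(8, 16).init::<NdArray>(&device);

    // A batch of 2 vectors with 8 features each.
    let x: Tensor<NdArray, 2> = Tensor::zeros([2, 8], &device);

    // y = Swish(x * W_inner + b_inner) * (x * W_outer + b_outer), shape [2, 16].
    let y = swiglu.forward(x);
    println!("{:?}", y.dims());
}
```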
Enums
- Activation - Activation Layer Wrapper (a usage sketch follows this list).
- ActivationConfig - Activation Configuration.
- ActivationRecord - The record type for the module.
- ActivationRecordItem - The record item type for the module.
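Finally, a rough sketch of how the Activation wrapper and its configuration enum might be used together; the ActivationConfig::Relu variant name, the init call, and the module path are assumptions inferred from this listing, not confirmed API:

```rust
use burn::backend::NdArray;
use burn::nn::activation::ActivationConfig;
use burn::tensor::Tensor;

fn main() {
    let device = Default::default();

    // Assumption: the config enum exposes one variant per wrapped layer
    // (a unit variant for Relu here) and follows the usual init pattern.
    let act = ActivationConfig::Relu.init::<NdArray>(&device);

    let x: Tensor<NdArray, 2> = Tensor::from_floats([[-1.0, 0.0, 2.0]], &device);

    // The wrapper dispatches to the selected layer's forward call; the output
    // keeps the input's shape, which is why the shape-changing GLU is excluded.
    let y = act.forward(x);
    println!("{y}");
}
```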