Module burn::nn::transformer
Transformer module
Structs§
- Applies the position-wise feed-forward network to the input tensor, as described in the paper Attention Is All You Need.
- Configuration to create a position-wise feed-forward layer using the init function.
- The record type for the module.
- The record item type for the module.
- The transformer decoder module as described in the paper Attention Is All You Need.
- Autoregressive cache for the Transformer Decoder layer.
- Configuration to create a Transformer Decoder layer using the init function.
- Transformer Decoder forward pass input argument.
- Transformer Decoder layer module.
- The record type for the module.
- The record item type for the module.
- The record type for the module.
- The record item type for the module.
- The transformer encoder module as described in the paper Attention Is All You Need.
- Autoregressive cache for the Transformer Encoder layer.
- Configuration to create a Transformer Encoder layer using the init function.
- Transformer Encoder forward pass input argument.
- Transformer encoder layer module.
- The record type for the module.
- The record item type for the module.
- The record type for the module.
- The record item type for the module.