Module attention
burn 0.20.0
In burn::tensor::backend::ops
Module with attention operations.
Functions
naive_attention
Computes softmax(QKᵗ / √d) · V using separate kernels. Serves as a fallback when FlashAttention is not used.
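The computation above can be sketched in plain Rust. This is an illustration of the math only, not burn's actual implementation or API: it uses `Vec<Vec<f64>>` matrices and performs the matmul, scaling, softmax, and final matmul as separate steps, mirroring the "separate kernels" structure.

```rust
// Illustrative naive attention on Vec-based matrices (not burn's API).

fn matmul(a: &[Vec<f64>], b: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let (n, k, m) = (a.len(), b.len(), b[0].len());
    let mut out = vec![vec![0.0; m]; n];
    for i in 0..n {
        for j in 0..m {
            for t in 0..k {
                out[i][j] += a[i][t] * b[t][j];
            }
        }
    }
    out
}

fn transpose(a: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let (n, m) = (a.len(), a[0].len());
    let mut out = vec![vec![0.0; n]; m];
    for i in 0..n {
        for j in 0..m {
            out[j][i] = a[i][j];
        }
    }
    out
}

// Numerically stable row-wise softmax: subtract the row max before exp.
fn softmax_rows(a: &mut [Vec<f64>]) {
    for row in a.iter_mut() {
        let max = row.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
        let mut sum = 0.0;
        for x in row.iter_mut() {
            *x = (*x - max).exp();
            sum += *x;
        }
        for x in row.iter_mut() {
            *x /= sum;
        }
    }
}

/// softmax(Q Kᵗ / √d) · V, computed step by step.
fn naive_attention(q: &[Vec<f64>], k: &[Vec<f64>], v: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let d = q[0].len() as f64;
    let mut scores = matmul(q, &transpose(k)); // Q Kᵗ
    for row in scores.iter_mut() {
        for x in row.iter_mut() {
            *x /= d.sqrt(); // scale by √d
        }
    }
    softmax_rows(&mut scores); // attention weights
    matmul(&scores, v) // weighted sum of V rows
}

fn main() {
    let q = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let k = q.clone();
    let v = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    let out = naive_attention(&q, &k, &v);
    println!("{out:?}");
}
```

Because each row of the softmax output sums to 1, every output row is a convex combination of the rows of `V`; materializing the full score matrix is what makes this approach memory-hungry compared to FlashAttention, which fuses these steps.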