Function burn::tensor::activation::relu
pub fn relu<const D: usize, B>(tensor: Tensor<B, D>) -> Tensor<B, D>
where
    B: Backend,
Applies the rectified linear unit function element-wise, as described in the paper Deep Learning using Rectified Linear Units (ReLU).
y = max(0, x)
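
A minimal usage sketch, assuming the NdArray backend and a recent Burn release in which Tensor::from_floats takes a device argument; backend choice, device setup, and exact constructor may differ in your version.

use burn::backend::NdArray;
use burn::tensor::{activation::relu, Tensor};

fn main() {
    // Assumed: NdArray backend with its default device.
    let device = Default::default();
    // A 1-D tensor mixing negative, zero, and positive values.
    let x = Tensor::<NdArray, 1>::from_floats([-1.5, 0.0, 2.0], &device);
    // relu clamps negative entries to zero and leaves non-negative entries unchanged.
    let y = relu(x);
    println!("{y}"); // expected values: [0.0, 0.0, 2.0]
}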