Function burn::tensor::activation::prelu
pub fn prelu<const D: usize, B>(
    tensor: Tensor<B, D>,
    alpha: Tensor<B, 1>,
) -> Tensor<B, D>
where
    B: Backend,
Applies the Parametric ReLU (PReLU) activation function as described in the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.
PReLU(x) = max(0, x) + alpha * min(0, x)
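For example, with a slope alpha = 0.25, an input of -2.0 maps to 0.25 * (-2.0) = -0.5, while a positive input such as 3.0 passes through unchanged.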
tensor is assumed to be of shape [batch_size, channels, …]
alpha is assumed to be of shape [channels] (one slope per channel) or [1] (a single slope shared across all channels)
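A minimal usage sketch, assuming the NdArray backend (available when Burn's "ndarray" feature is enabled) and a recent Burn release where Tensor::from_floats takes a device argument; adjust the backend and constructors to your setup.

use burn::backend::NdArray;
use burn::tensor::{activation, Tensor};

fn main() {
    // Assumed backend: NdArray; any type implementing Backend works.
    type B = NdArray;
    let device = Default::default();

    // Input of shape [batch_size = 1, channels = 2, width = 3] with some negative entries.
    let tensor = Tensor::<B, 3>::from_floats(
        [[[-1.0, 0.0, 2.0], [3.0, -4.0, 5.0]]],
        &device,
    );

    // One slope per channel: shape [channels] = [2].
    let alpha = Tensor::<B, 1>::from_floats([0.25, 0.1], &device);

    // Positive values pass through; negative values are scaled by their channel's alpha.
    let out = activation::prelu(tensor, alpha);
    println!("{out}");
}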