Function prelu
pub fn prelu<const D: usize, B>(
tensor: Tensor<B, D>,
alpha: Tensor<B, 1>,
) -> Tensor<B, D>
where
    B: Backend,
Applies the Parametric ReLU (PReLU) activation function, as described in the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.
- The tensor is assumed to be of shape [batch_size, channels, ...]. alpha is assumed to be of shape [channels] or [1].
PReLU(x) = max(0, x) + alpha * min(0, x)
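The elementwise rule and the per-channel broadcasting of alpha can be sketched in plain Rust, without the tensor types above. This is a hypothetical illustration on a flat row-major [batch_size, channels] buffer, not the library's implementation:

```rust
// PReLU(x) = max(0, x) + alpha * min(0, x), applied elementwise.
// alpha has one entry per channel, or a single entry broadcast to all channels.
fn prelu(x: &[f32], alpha: &[f32], channels: usize) -> Vec<f32> {
    x.iter()
        .enumerate()
        .map(|(i, &v)| {
            // For a row-major [batch_size, channels] buffer, i % channels
            // is the channel index; a length-1 alpha applies everywhere.
            let a = alpha[if alpha.len() == 1 { 0 } else { i % channels }];
            v.max(0.0) + a * v.min(0.0)
        })
        .collect()
}

fn main() {
    // Two samples, two channels, alpha = [0.25, 0.5]:
    // positive inputs pass through; negatives are scaled by their channel's alpha.
    let x = [1.0_f32, -2.0, -4.0, 3.0];
    let out = prelu(&x, &[0.25, 0.5], 2);
    println!("{:?}", out); // [1.0, -1.0, -1.0, 3.0]
}
```

With alpha fixed at 0 this reduces to ReLU, and with a single small constant alpha it reduces to Leaky ReLU; PReLU differs only in that alpha is a learned parameter.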