nn.thresholded_relu
Applies the thresholded rectified linear unit (Thresholded ReLU) activation function element-wise to a given tensor.
The Thresholded ReLU function is defined as f(x) = x if x > alpha, f(x) = 0 otherwise, where x is the input element.
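Its signature can be inferred from the argument and return types documented below; this is a sketch assuming the generic NNTrait pattern used elsewhere in the library, and the exact trait path may differ across versions:

```rust
fn thresholded_relu(tensor: @Tensor<T>, alpha: @T) -> Tensor<T>;
```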
Args
* tensor (@Tensor&lt;T&gt;) - A snapshot of the input tensor to which the Thresholded ReLU function will be applied.
* alpha (@T) - A snapshot of a fixed point scalar that defines the alpha threshold of the Thresholded ReLU function.
Returns
A new fixed point tensor with the same shape as the input tensor and the Thresholded ReLU function applied element-wise.
Type Constraints
Constrain input and output types to fixed point tensors.
Examples
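A minimal sketch of calling thresholded_relu on a 2x2 FP8x23 tensor. The import paths, the FP8x23NN implementation, and the new_unscaled constructor follow the conventions of other Orion NN examples and are assumptions that may vary by library version:

```rust
use core::array::{ArrayTrait, SpanTrait};

use orion::operators::tensor::{TensorTrait, Tensor, FP8x23Tensor};
use orion::operators::nn::{NNTrait, FP8x23NN};
use orion::numbers::{FP8x23, FixedTrait};

fn thresholded_relu_example() -> Tensor<FP8x23> {
    // Input tensor: [[0.0, 1.0], [2.0, 3.0]]
    let tensor = TensorTrait::<FP8x23>::new(
        shape: array![2, 2].span(),
        data: array![
            FixedTrait::new_unscaled(0, false),
            FixedTrait::new_unscaled(1, false),
            FixedTrait::new_unscaled(2, false),
            FixedTrait::new_unscaled(3, false),
        ]
            .span(),
    );

    // alpha = 1.0: elements <= alpha are zeroed, elements > alpha pass through unchanged.
    let alpha = FixedTrait::new_unscaled(1, false);

    NNTrait::thresholded_relu(@tensor, @alpha)
}
```

With alpha = 1.0, the elements 0.0 and 1.0 do not exceed the threshold and map to 0, while 2.0 and 3.0 pass through, yielding [[0, 0], [2, 3]].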