Rectified Linear Unit
The rectified linear unit (ReLU) is a mathematical function that outputs a value equal to the input for positive inputs and outputs 0 otherwise.
We can define ReLU using the following equation:

ReLU(x) = max(0, x)
The derivative of ReLU is 1 for positive inputs, 0 for negative inputs, and undefined at 0.
Implementing ReLU in NumPy
import numpy as np
def relu(x):
    # Elementwise maximum: returns x where x > 0, and 0 otherwise
    return np.maximum(0, x)
print(f"max(0,-20) = {relu(-20)}")
max(0,-20) = 0
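The derivative can be implemented in the same style. This is a minimal sketch, and `relu_grad` is a hypothetical helper name; note that the sketch returns 0 at exactly 0 by convention, since the true derivative is undefined there.

```python
import numpy as np

def relu_grad(x):
    # 1 where x > 0, 0 elsewhere; the value at exactly 0 is a
    # convention, as the true derivative is undefined at that point
    return (x > 0).astype(float)

print(relu_grad(np.array([-20.0, 0.0, 20.0])))  # → [0. 0. 1.]
```

Because the comparison `x > 0` is vectorized, this works on scalars and arrays alike, just like the `relu` function above.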