Rectified Linear Unit

The rectified linear unit (ReLU) is a mathematical function that returns its input for positive inputs and 0 otherwise.

ReLU function

We can define ReLU using the following equation:

$$f(x) = \max(0, x)$$

The derivative of ReLU is 1 for positive inputs and 0 for negative inputs; at x = 0 it is undefined.

$$
f'(x) =
\begin{cases}
1 & \text{if } x > 0 \\
0 & \text{if } x < 0
\end{cases}
$$

Implementing ReLU in numpy

import numpy as np

def relu(x):
    # np.maximum(0, x) keeps positive entries and clips the rest to 0
    return np.maximum(0, x)

print(f"max(0,-20) = {relu(-20)}")
# prints: max(0,-20) = 0
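
Because np.maximum operates elementwise, the same relu function also handles whole arrays, which is how it would typically be applied to a layer of pre-activations. A quick sketch, continuing from the snippet above (the values in z are arbitrary):

z = np.array([-1.5, 0.0, 2.0, -0.3, 4.0])  # arbitrary pre-activations
print(relu(z))
# prints: [0. 0. 2. 0. 4.]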
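
The derivative given earlier can be sketched in the same vectorized style. This is only an illustration: relu_grad is a hypothetical name, and returning 0 at x == 0 is a common convention in practice, since the true derivative is undefined there.

def relu_grad(x):
    # 1 where x > 0, 0 where x < 0; by convention also 0 at x == 0,
    # where the derivative is mathematically undefined
    return np.where(x > 0, 1.0, 0.0)

print(relu_grad(np.array([-2.0, 0.0, 3.0])))
# prints: [0. 0. 1.]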
