Comprehensive Guide to Constructing Piecewise Linear Functions Using Hidden Units with ReLU Activations

Subin Alex
2 min read · Aug 3, 2024


Overview

This guide summarizes the concepts and problem-solving steps involved in constructing piecewise linear functions using hidden units with ReLU activations. It includes detailed thought processes, mathematical formulations and key insights.

Key Concepts

1. Piecewise Linear Functions:
- Functions composed of multiple linear segments.
- Each segment is defined over a specific interval and has its own slope.
2. ReLU Activation:
- The Rectified Linear Unit (ReLU) is defined as a(x) = max(0, x).
- It outputs zero for negative inputs and passes non-negative inputs through unchanged.
3. Hidden Units:
- Used in neural networks to transform input data.
- Each hidden unit applies a ReLU activation to a linear combination of its inputs.
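These building blocks can be sketched in a few lines of NumPy (the parameter names theta0 and theta1 are illustrative):

```python
import numpy as np

def relu(x):
    """ReLU activation: max(0, x), applied element-wise."""
    return np.maximum(0.0, x)

def hidden_unit(x, theta0, theta1):
    """A hidden unit: ReLU applied to a linear combination of the input."""
    return relu(theta0 + theta1 * x)

x = np.linspace(-1.0, 1.0, 5)
print(relu(x))                      # negative entries clipped to zero
print(hidden_unit(x, -0.5, 1.0))    # ramp that switches on at x = 0.5
```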

Creating a Function with Four Linear Regions

Given Information

- Slopes: [1, 1, -1]
- Joints: [1/6, 2/6, 4/6]

Solution Steps

1. Define Hidden Unit Activations:
- h_1(x) = max(0, x - 1/6)
- h_2(x) = max(0, x - 2/6)
- h_3(x) = max(0, 4/6 - x)
2. Combine Activations:
- The final function is a linear combination of the hidden unit activations:
f(x) = φ_0 + φ_1 h_1(x) + φ_2 h_2(x) + φ_3 h_3(x)
3. Determine Coefficients:
- Solve for φ_0, φ_1, φ_2, φ_3 so that f(x) has the required slope in each of the four regions; each coefficient sets the change in slope at its unit's joint.
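The construction can be sketched in NumPy. The coefficients below (φ_0 = 0 and φ_1 = φ_2 = φ_3 = 1) are illustrative rather than a solution for the given slopes; the point is that the slope of f in each region is determined by which units are active there:

```python
import numpy as np

def f(x, phi0=0.0, phi1=1.0, phi2=1.0, phi3=1.0):
    """Four linear regions from three ReLU hidden units with joints at
    1/6, 2/6, and 4/6. Note h3 ramps *downward*: it is active for x < 4/6."""
    h1 = np.maximum(0.0, x - 1/6)
    h2 = np.maximum(0.0, x - 2/6)
    h3 = np.maximum(0.0, 4/6 - x)
    return phi0 + phi1 * h1 + phi2 * h2 + phi3 * h3

# Region slopes for these coefficients:
#   x < 1/6        : -phi3           = -1
#   1/6 <= x < 2/6 : phi1 - phi3     =  0
#   2/6 <= x < 4/6 : phi1+phi2-phi3  =  1
#   x >= 4/6       : phi1 + phi2     =  2
```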

Creating Functions with Three and Five Linear Regions

Part 1: Three Linear Regions Using Two Hidden Units

1. Define Hidden Unit Activations:
- h_1(x) = max(0, x - p)
- h_2(x) = max(0, x - q)
2. Combine Activations:
f(x) = φ_0 + φ_1 h_1(x) + φ_2 h_2(x)
3. Intervals and Behavior:
- For x < p:
f(x) = φ_0
Both h_1(x) and h_2(x) are zero because x is less than both p and q.
- For p ≤ x < q:
f(x) = φ_0 + φ_1 (x - p)
Here, h_1(x) is active and h_2(x) is still zero.
- For x ≥ q:
f(x) = φ_0 + φ_1 (x - p) + φ_2 (x - q)
Here, both h_1(x) and h_2(x) are active.
4. Set φ_1 to Ensure the Correct Slope:
φ_1 = 1 / (q - p)
This choice makes f(x) rise by exactly 1 over the middle region: from φ_0 at x = p to φ_0 + 1 at x = q.
5. Graphical Representation:
- Visualize the function to confirm the behavior.
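A minimal sketch of the three-region construction, assuming φ_0 = 0 and the illustrative choice φ_2 = -φ_1 (which cancels the first ramp so the function is flat for x ≥ q):

```python
import numpy as np

p, q = 0.2, 0.6          # illustrative joint positions
phi0 = 0.0               # offset, assumed zero here
phi1 = 1.0 / (q - p)     # lifts f from phi0 to phi0 + 1 over [p, q]
phi2 = -phi1             # illustrative: flattens f after q

def f(x):
    h1 = np.maximum(0.0, x - p)
    h2 = np.maximum(0.0, x - q)
    return phi0 + phi1 * h1 + phi2 * h2

# f is 0 for x < p, rises linearly to 1 on [p, q], and stays at 1 after q.
```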

Part 2: Five Linear Regions Using Four Hidden Units

1. Define More Hidden Units:
- Add one hidden unit per joint: five linear regions require four joints, and hence four hidden units.
2. Generalize the Process:
- Define each activation as h_k(x) = max(0, x - joint_k) and combine them exactly as before.
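The generalization can be sketched as a single helper that sums one ReLU ramp per joint; the joint positions and coefficients in the example are illustrative:

```python
import numpy as np

def piecewise(x, phi0, phis, joints):
    """phi0 + sum_k phis[k] * max(0, x - joints[k]).
    With n joints this yields n + 1 linear regions; phis[k] is the
    change in slope at joints[k]."""
    y = np.full_like(np.asarray(x, dtype=float), phi0)
    for phi, j in zip(phis, joints):
        y = y + phi * np.maximum(0.0, x - j)
    return y

# Five regions from four joints; region slopes are 0, 1, -1, 1, 0.
x = np.linspace(0.0, 1.0, 101)
y = piecewise(x, 0.0, [1.0, -2.0, 2.0, -1.0], [0.2, 0.4, 0.6, 0.8])
```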

Correcting for True Oscillatory Behavior

Insight: Resetting Activations

1. Adjust Piecewise Definitions:
- Ensure each hidden unit's activation resets to zero at the next joint.
2. Define Correct Piecewise Functions:
- h_1(x) =
  - 0 if x < p
  - x - p if p ≤ x < q
  - 0 if x ≥ q
- h_2(x) =
  - 0 if x < q
  - x - q if x ≥ q
3. Combine Correctly:
f(x) = φ_0 + φ_1 h_1(x) + φ_2 h_2(x)
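These resetting activations can be sketched directly from the piecewise definitions using np.where (p, q, and the φ values below are illustrative). Note that h_1 as defined here is not a single ReLU of a linear function, since it drops back to zero at q:

```python
import numpy as np

p, q = 0.2, 0.6  # illustrative joints

def h1(x):
    # Ramp on [p, q), zero elsewhere: resets at q per the piecewise definition.
    return np.where((x >= p) & (x < q), x - p, 0.0)

def h2(x):
    # Standard ReLU ramp that switches on at q.
    return np.where(x >= q, x - q, 0.0)

def f(x, phi0=0.0, phi1=1.0, phi2=1.0):
    return phi0 + phi1 * h1(x) + phi2 * h2(x)
```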
