Module 3: Activation Functions (The On/Off Switch)

Course ID: DL-403
Subject: The Logic Switches

Activation functions are the “On/Off Switches” of a Neural Network. Without them, no matter how many layers you stack, the whole network collapses into a single linear transformation: just one big calculator doing weighted sums.
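You can verify that collapse with a minimal NumPy sketch (the weight shapes here are arbitrary, chosen just for illustration): two linear layers with no activation in between produce exactly the same output as one merged linear layer.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # layer 1 weights
W2 = rng.normal(size=(2, 4))  # layer 2 weights
x = rng.normal(size=3)        # one input vector

# Two linear layers with no activation between them...
two_layers = W2 @ (W1 @ x)
# ...collapse into a single linear layer with weights W2 @ W1.
one_layer = (W2 @ W1) @ x

print(np.allclose(two_layers, one_layer))  # True
```

The nonlinearities below are what break this collapse and let depth actually add expressive power.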


🏗️ Step 1: ReLU (The “Zero or Hero” Switch)

  • Input is negative? Output is 0.
  • Input is positive? Output is the input.
  • Use: Best for Hidden Layers.
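The two rules above translate directly into one line of NumPy (a minimal sketch; the function name `relu` is our own label, not a library call):

```python
import numpy as np

def relu(x):
    # Negative inputs become 0; positive inputs pass through unchanged.
    return np.maximum(0, x)

scores = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(scores))  # the negatives are zeroed, the positives survive
```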

🏗️ Step 2: Sigmoid (The “Probability” Switch)

  • Squashes any input into a value between 0 and 1.
  • Use: Best for Binary Output.
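The squashing formula is 1 / (1 + e^(-x)). A quick sketch (again, `sigmoid` is our own helper name): a score of 0 comes out as an undecided 0.5, while a large positive score approaches a confident 1.

```python
import numpy as np

def sigmoid(x):
    # Maps any real number into the open interval (0, 1).
    return 1 / (1 + np.exp(-x))

print(sigmoid(0.0))  # 0.5 -- an undecided "maybe"
print(sigmoid(4.0))  # close to 1 -- a confident "yes"
```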

🏗️ Step 3: Softmax (The “Team Player”)

  • Makes a group of scores add up to 100% (1.0).
  • Use: Best for Multi-class Output (Cat, Dog, Bird).
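A minimal sketch of the “team” behavior (the scores for Cat, Dog, and Bird are made up for illustration; subtracting the max score is a standard numerical-stability trick, not part of the math itself):

```python
import numpy as np

def softmax(scores):
    # Subtract the max score for numerical stability, then normalize
    # so the exponentiated scores sum to exactly 1.0.
    exps = np.exp(scores - np.max(scores))
    return exps / exps.sum()

logits = np.array([2.0, 1.0, 0.1])  # raw scores for Cat, Dog, Bird
probs = softmax(logits)
print(probs)        # roughly [0.66, 0.24, 0.10] -- Cat wins
print(probs.sum())  # 1.0
```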

🥅 Module 3 Review

  1. Activation Function: Decides whether a neuron “fires.”
  2. ReLU: Filters noise (negatives).
  3. Sigmoid: Turns scores into percentages.
  4. Softmax: Handles multiple categories.

:::tip Slow Learner Note

Think of ReLU as a “Filter.” It only lets the important signals pass through to the next layer!

:::