Gradient Descent

Gradient Descent is an optimization algorithm used to find the minimum of a function. In machine learning, we use it to minimize the loss (error) of a model by adjusting its parameters.

Intuition (Mountain analogy)

Imagine you are standing on a foggy mountain and want to reach the lowest point (the valley).

- You can’t see the whole mountain
- You only know the slope at your current position

So you:

- Check the slope
- Take a small step downhill
- Repeat until you reach the bottom

That’s Gradient Descent.

- Mountain height → Loss function
- Your position → Model parameters
- Slope → Gradient
- Step size → Learning rate

Why do we need Gradient Descent?

Most ML models learn by minimizing a loss function:

    Loss = f(θ)

where θ = the model parameters (weights, bias). The goal is to find the θ that minimizes the loss.

For complex models:

- There is no closed-form solution
- It is too expensive to try all possible parameter values
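To make the downhill-stepping loop concrete, here is a minimal Python sketch of gradient descent on a toy loss f(θ) = (θ − 3)². The loss, starting point, learning rate, and step count are all illustrative assumptions, not values from the post; the update itself is the standard rule θ ← θ − learning_rate · f′(θ).

    # A minimal sketch of gradient descent minimizing the toy loss
    # f(theta) = (theta - 3)^2. The loss, starting point, learning rate,
    # and step count are illustrative assumptions, not from the post.

    def loss(theta):
        return (theta - 3.0) ** 2

    def gradient(theta):
        # Analytic derivative of (theta - 3)^2 with respect to theta
        return 2.0 * (theta - 3.0)

    theta = 0.0           # your position on the mountain (initial parameter)
    learning_rate = 0.1   # step size

    for step in range(50):
        theta = theta - learning_rate * gradient(theta)  # small step downhill

    print(theta, loss(theta))  # theta approaches 3, the loss approaches 0

Running this prints a value close to 3, the minimizer. Note how the learning rate plays the role of the step size in the analogy: a larger value moves downhill faster but can overshoot the valley, while a very small one makes convergence slow.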