Instructions:
• Initial parameters α (intercept) and β (slope) are set randomly on each dataset load
• Click "Step" to perform exactly one gradient descent iteration
• Watch the regression line evolve on the normalized data plot (left)
• Track the optimization path on the cost surface (right)
• Use "Re-initialize" to generate new random starting parameters
• Choose from a sample dataset or civil engineering examples (all data is normalized)
Algorithm (Batch Gradient Descent):
Model: μ(x)=α+βx (on normalized data)
Cost: J(α,β) = (1/(2n)) Σᵢ₌₁ⁿ (α + βxᵢ − yᵢ)²
Update: Compute both gradients first, then update simultaneously: α ← α − η·∂J/∂α, β ← β − η·∂J/∂β
Learning rate η: Controls step size (default 0.1 works well for normalized data)
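The single "Step" iteration described above can be sketched as follows. This is a minimal illustration (function name and data are hypothetical, not the app's actual source): it computes both gradients from the current residuals before touching either parameter, mirroring the simultaneous update.

```python
import numpy as np

def gd_step(alpha, beta, x, y, eta=0.1):
    """One batch gradient-descent step for the model mu(x) = alpha + beta*x
    with cost J = (1/(2n)) * sum((alpha + beta*x_i - y_i)^2)."""
    n = len(x)
    residuals = alpha + beta * x - y            # prediction errors on all points
    grad_alpha = residuals.sum() / n            # dJ/d(alpha)
    grad_beta = (residuals * x).sum() / n       # dJ/d(beta)
    # Simultaneous update: both gradients are evaluated at the old (alpha, beta)
    alpha -= eta * grad_alpha
    beta -= eta * grad_beta
    cost = (residuals ** 2).sum() / (2 * n)     # cost J before the update
    grad_norm = np.hypot(grad_alpha, grad_beta) # ||gradient J||, the readout below
    return alpha, beta, cost, grad_norm
```

Clicking "Step" corresponds to one call of this function; the readouts (cost, gradient norm) are the values it returns.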

Visualization:
Left panel: Normalized data points, current regression line, residuals (dashed red lines)
Right panel: Cost surface contours, current position (red dot), optimization path
Readouts: Current iteration, normalized parameters, cost, and gradient norm
Data & Regression Line
Cost Surface
Ready to start gradient descent. Click "Step" to begin.
Iteration: 0
α: 0.300
β: -0.600
Cost J: 1.325
||∇J||: 1.627