Regularization

Regularization is a crucial technique to prevent overfitting in machine learning models. When models become too complex, they can memorize training data rather than learning generalizable patterns.

In civil engineering applications, regularization helps create more robust models for tasks like predicting structural responses, material behavior, or system performance under varying conditions.
How to Use:

• Adjust the polynomial degree slider to control model complexity
• Use L1 regularization (Lasso) to encourage sparse coefficients
• Use L2 regularization (Ridge) to shrink coefficient magnitudes
• Click anywhere on the plot to add new data points
• Use "Generate New Data" to create 5 random points
• Use "Clear All Data" to remove all points
• Observe how regularization helps prevent overfitting (see the code sketch below)
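
The kind of fit the sliders control can be approximated offline. Here is a minimal sketch, assuming scikit-learn and NumPy are available; the polynomial degree, penalty strength, and sample data are illustrative placeholders rather than the demo's actual settings, and ElasticNet is used as a stand-in for separate L1/L2 controls.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(20, 1))                    # 20 random points, like "Generate New Data"
y = np.sin(2 * np.pi * X.ravel()) + rng.normal(0, 0.3, 20)

# High-degree polynomial (prone to overfitting) plus a combined L1/L2 penalty.
# alpha sets the overall penalty strength; l1_ratio splits it between L1 and L2.
model = make_pipeline(
    PolynomialFeatures(degree=8),
    ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=10_000),
)
model.fit(X, y)
print(model.named_steps["elasticnet"].coef_)           # several terms typically end up exactly 0
```
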
Mathematical Foundation:

L1 Regularization (Lasso): Adds λ₁ Σ|βᵢ| to the loss function
• Encourages sparse solutions (some coefficients become exactly zero)
• Useful for feature selection
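
A quick check of this sparsity effect, as a sketch assuming scikit-learn (the data and the alpha value are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(0, 0.5, 100)   # only features 0 and 3 matter

lasso = Lasso(alpha=0.1).fit(X, y)
print(np.round(lasso.coef_, 3))   # the noise features are typically driven to exactly 0.0
```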

L2 Regularization (Ridge): Adds λ₂ Σβᵢ² to the loss function
• Shrinks coefficients towards zero but doesn't make them exactly zero
• Helps with multicollinearity
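
The multicollinearity point can be seen in a short sketch, again assuming scikit-learn; the near-duplicate feature is constructed purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(0, 0.01, 100)          # almost an exact copy of x1
X = np.column_stack([x1, x2])
y = x1 + rng.normal(0, 0.1, 100)

print(LinearRegression().fit(X, y).coef_)   # typically unstable: large, offsetting weights
print(Ridge(alpha=1.0).fit(X, y).coef_)     # both shrunk toward roughly 0.5
```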

The regularized loss becomes: Loss = MSE + λ₁ Σ|βᵢ| + λ₂ Σβᵢ²
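
Written out directly (a sketch of the formula above, not the demo's source code):

```python
import numpy as np

def regularized_loss(beta, X, y, lam1, lam2):
    """Loss = MSE + lam1 * sum(|beta_i|) + lam2 * sum(beta_i ** 2)."""
    residuals = X @ beta - y
    mse = np.mean(residuals ** 2)
    return mse + lam1 * np.sum(np.abs(beta)) + lam2 * np.sum(beta ** 2)
```

Setting lam1 = lam2 = 0 recovers the plain mean squared error; increasing either value penalizes large coefficients more heavily.
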
Understanding Regularization Effects:

• High polynomial degree: Creates complex curves that may overfit without regularization
• L1 regularization: Watch coefficients become exactly zero - automatic feature selection
• L2 regularization: Smooths the curve and reduces oscillations
• Bias-variance tradeoff: Higher regularization increases bias but reduces variance
• Optimal balance: Find the sweet spot between underfitting and overfitting (see the sketch after this list)
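
One common way to look for that sweet spot is to sweep the penalty strength and compare training and validation error. A sketch assuming scikit-learn; the degree and the alpha grid are arbitrary choices:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(40, 1))
y = np.sin(2 * np.pi * X.ravel()) + rng.normal(0, 0.3, 40)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

for alpha in [1e-4, 1e-2, 1e-1, 1.0, 10.0]:
    model = make_pipeline(PolynomialFeatures(degree=10), Ridge(alpha=alpha))
    model.fit(X_tr, y_tr)
    train_mse = mean_squared_error(y_tr, model.predict(X_tr))   # generally falls as alpha shrinks
    val_mse = mean_squared_error(y_va, model.predict(X_va))     # typically U-shaped in alpha
    print(f"alpha={alpha:<8} train MSE={train_mse:.3f}  val MSE={val_mse:.3f}")
```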