Ridge Regression
Linear regression with L2 regularization to prevent overfitting and handle multicollinearity.
When to use:
- Have correlated features (multicollinearity)
- More features than samples
- Want to prevent overfitting
- Linear relationships but need regularization
Strengths: Handles multicollinearity well, prevents overfitting, all features are kept (none zeroed out)
Weaknesses: Doesn't perform feature selection, still assumes linear relationships
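To illustrate the multicollinearity point, here is a minimal sketch using scikit-learn's `Ridge` (an assumption; the source doesn't name a library). With two near-duplicate features, ridge spreads the weight across both instead of producing unstable coefficients:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic data with two highly correlated features (multicollinearity).
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)  # near-duplicate of x1
X = np.column_stack([x1, x2])
y = 3.0 * x1 + rng.normal(scale=0.1, size=100)

model = Ridge(alpha=1.0)  # default regularization strength
model.fit(X, y)
print(model.coef_)  # weight is split roughly evenly across the correlated pair
```

Ordinary least squares on the same data could assign huge offsetting coefficients to the two columns; the L2 penalty keeps both small and stable.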
Model Parameters
Alpha (default: 1.0) Regularization strength. Higher values = stronger regularization = simpler model.
- Low (0.01-0.1): Weak regularization, close to Linear Regression
- Default (1.0): Balanced
- High (10-100): Strong regularization, very simple model
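Rather than picking alpha by hand, it can be selected by cross-validation over the weak-to-strong range above. A sketch using scikit-learn's `RidgeCV` (the specific library and data are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import RidgeCV

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.5, size=80)

# Candidate alphas spanning the weak-to-strong range described above.
model = RidgeCV(alphas=[0.01, 0.1, 1.0, 10.0, 100.0])
model.fit(X, y)
print(model.alpha_)  # the alpha chosen by cross-validation
```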
Fit Intercept (default: true) Whether to calculate intercept term.
Solver
- auto: Choose automatically (default)
- svd: Singular value decomposition (most stable)
- cholesky: Fast for many features
- lsqr: Good for large problems
- saga: Fast for large datasets
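On a well-conditioned problem the solvers converge to the same coefficients, so the choice is mainly about speed and stability. A quick check using scikit-learn's `solver` parameter (an assumption about the underlying library), comparing the two direct solvers:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=50)

# Both direct solvers should reach (numerically) the same solution.
svd = Ridge(alpha=1.0, solver="svd").fit(X, y)
chol = Ridge(alpha=1.0, solver="cholesky").fit(X, y)
print(np.allclose(svd.coef_, chol.coef_, atol=1e-6))
```

In practice `auto` is a safe default; reach for `svd` when the data may be ill-conditioned, and the iterative solvers (`lsqr`, `saga`) when the dataset is large.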