Regression
Predict continuous numeric values from tabular feature columns
Regression models predict a numeric output for each input row. At inference time, provide tabular data with the same feature columns used during training — the model returns a continuous predicted value.
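The round trip described above can be sketched as follows. This is a minimal illustration using pandas and scikit-learn with made-up column names (`sqft`, `rooms`, `price`); the key point is that inference rows must carry the same feature columns used during training.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Tabular training data: feature columns plus a numeric target column.
train = pd.DataFrame({
    "sqft":  [850, 1200, 1500, 2000],
    "rooms": [2, 3, 3, 4],
    "price": [150_000, 210_000, 260_000, 340_000],
})
feature_cols = ["sqft", "rooms"]

model = LinearRegression()
model.fit(train[feature_cols], train["price"])

# At inference time, provide rows with the same feature columns;
# the model returns one continuous predicted value per row.
new_rows = pd.DataFrame({"sqft": [1100, 1750], "rooms": [3, 4]})
preds = model.predict(new_rows[feature_cols])
```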
Available Models
- Linear Regression – Baseline linear model, fast and interpretable
- Ridge Regression – Linear regression with L2 regularization
- Lasso Regression – Linear regression with L1 regularization and feature selection
- ElasticNet Regression – Combined L1 + L2 regularization
- Polynomial Regression – Linear regression on polynomial feature expansions
- Decision Tree – Single tree-based regressor
- Random Forest – Ensemble of decision trees for robust predictions
- Support Vector Regression – SVR with configurable kernel functions
- K-Nearest Neighbors – Predict by averaging similar training examples
- Extra Trees – Randomized ensemble regressor
- AdaBoost – Boosting ensemble for regression
- Gradient Boosting – Sequential boosting that fits each tree to the previous trees' residuals
- XGBoost – Optimized gradient boosting
- LightGBM – Fast gradient boosting with leaf-wise growth
- CatBoost – Gradient boosting with categorical feature support
- Multi-Layer Perceptron – Neural network for complex regression
- Huber Regression – Robust regression resistant to outliers
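Most of the models above follow scikit-learn's common fit/predict estimator interface, which makes it straightforward to compare several candidates on the same data. The sketch below is illustrative only, using synthetic data and a handful of the listed models; XGBoost, LightGBM, and CatBoost come from separate packages (`xgboost`, `lightgbm`, `catboost`) but expose compatible estimators.

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso, HuberRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Synthetic tabular data: 5 feature columns, linear signal plus small noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, -2.0, 0.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=200)

# A few of the listed models behind the same estimator interface.
models = {
    "Ridge": Ridge(alpha=1.0),
    "Lasso": Lasso(alpha=0.01),
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
    "Huber": HuberRegressor(),
}

# Cross-validated R^2 for each candidate; higher is better (1.0 is perfect).
results = {
    name: cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    for name, model in models.items()
}
for name, score in results.items():
    print(f"{name}: mean R^2 = {score:.3f}")
```

Swapping one estimator for another requires no other code changes, which is why trying several of the models above on the same feature columns is cheap.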