Logistic Regression
Linear classifier for binary and multi-class problems
Logistic Regression predicts class probabilities using a linear decision boundary, making it one of the most interpretable and efficient classifiers available.
When to use:
- Baseline classification with interpretable feature weights
- Binary or multi-class problems where a linear decision boundary is adequate
- When you need calibrated probabilities with fast inference
Input: Tabular data with the feature columns defined during training
Output: Predicted class label and class probabilities
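The basic train-then-predict flow can be sketched as follows. This is a minimal illustration assuming a scikit-learn-style API; the feature values and labels are made up.

```python
from sklearn.linear_model import LogisticRegression

# Toy tabular data: two feature columns, binary target (illustrative values).
X_train = [[0.0, 1.0], [1.0, 1.5], [2.0, 0.5], [3.0, 2.5]]
y_train = [0, 0, 1, 1]

model = LogisticRegression()  # defaults: penalty=l2, C=1.0, solver=lbfgs
model.fit(X_train, y_train)

X_new = [[2.5, 1.0]]
label = model.predict(X_new)        # predicted class label
proba = model.predict_proba(X_new)  # class probabilities, one column per class
print(label[0], proba[0])
```

Note that `predict_proba` returns one probability per class, and the probabilities in each row sum to 1.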
Model Settings (set during training, used at inference)
Penalty (default: l2)
Regularization norm applied during training. l2 shrinks all coefficients toward zero, l1 drives the coefficients of irrelevant features exactly to zero (implicit feature selection), elasticnet combines both, and none applies no regularization. Note that l1 and elasticnet require a solver that supports them, such as saga.
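The sparsity effect of l1 can be seen on synthetic data where most features are noise. This sketch assumes a scikit-learn-style API; the data is fabricated, and liblinear is used here only because it also supports l1 and converges quickly on small problems.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
# Only the first two features carry signal; the other eight are pure noise.
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

l1_model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
l2_model = LogisticRegression(penalty="l2", C=0.1).fit(X, y)

n_zero_l1 = int(np.sum(l1_model.coef_ == 0))
n_zero_l2 = int(np.sum(l2_model.coef_ == 0))
print(n_zero_l1, n_zero_l2)  # l1 typically zeros the noise features; l2 only shrinks them
```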
C (default: 1.0)
Inverse regularization strength. Lower values apply stronger regularization and reduce overfitting; higher values fit the training data more closely.
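Because C is the inverse of the regularization strength, decreasing it shrinks the learned coefficients. A quick check on synthetic data (scikit-learn-style API assumed, data fabricated):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
X = rng.standard_normal((200, 5))
# Noisy linear signal so the classes are not perfectly separable.
y = (X[:, 0] - X[:, 1] + 0.5 * rng.standard_normal(200) > 0).astype(int)

strong = LogisticRegression(C=0.01).fit(X, y)  # strong regularization
weak = LogisticRegression(C=100.0).fit(X, y)   # weak regularization

norm_strong = float(np.linalg.norm(strong.coef_))
norm_weak = float(np.linalg.norm(weak.coef_))
print(norm_strong, norm_weak)  # coefficient norm shrinks as C decreases
```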
Solver (default: lbfgs)
Optimization algorithm. lbfgs is the default for small datasets; saga supports all penalties and scales better to large datasets.
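Solver and penalty must be compatible: for example, elasticnet is only available with saga, which also expects an l1_ratio mixing parameter. A sketch assuming a scikit-learn-style API (toy data, illustrative values):

```python
from sklearn.linear_model import LogisticRegression

X = [[0.0, 1.0], [1.0, 0.0], [1.5, 1.5], [2.0, 0.5]]
y = [0, 0, 1, 1]

# lbfgs handles the default l2 penalty.
lbfgs_model = LogisticRegression(solver="lbfgs").fit(X, y)

# elasticnet requires saga, plus l1_ratio to set the l1/l2 mix.
enet_model = LogisticRegression(solver="saga", penalty="elasticnet",
                                l1_ratio=0.5, max_iter=5000).fit(X, y)
preds = enet_model.predict(X)
print(lbfgs_model.predict(X), preds)
```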
Max Iterations (default: 100)
Maximum number of solver iterations. Increase this value if the model did not converge during training.
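Non-convergence is usually signaled by a ConvergenceWarning, and the fitted model's n_iter_ attribute reports how many iterations the solver actually used. A sketch assuming a scikit-learn-style API, with fabricated data and a deliberately tiny iteration budget:

```python
import warnings
import numpy as np
from sklearn.exceptions import ConvergenceWarning
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 20))
y = (X.sum(axis=1) > 0).astype(int)

# Too few iterations: the solver stops early and warns.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    short = LogisticRegression(max_iter=2).fit(X, y)
warned = any(issubclass(w.category, ConvergenceWarning) for w in caught)

# Raising max_iter lets the solver run to convergence.
converged = LogisticRegression(max_iter=1000).fit(X, y)
print(warned, short.n_iter_, converged.n_iter_)
```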
Class Weight (default: null)
Set to balanced to automatically adjust weights inversely proportional to class frequencies — useful for imbalanced datasets.
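The balanced weighting follows the formula weight(c) = n_samples / (n_classes * count(c)), so the minority class gets proportionally larger weight. A sketch with fabricated class counts, assuming scikit-learn's `compute_class_weight` utility:

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Imbalanced labels: 90 samples of class 0, 10 of class 1 (synthetic counts).
y = np.array([0] * 90 + [1] * 10)

weights = compute_class_weight(class_weight="balanced",
                               classes=np.array([0, 1]), y=y)
# Manual version of the same formula: n_samples / (n_classes * count(c)).
manual = len(y) / (2 * np.bincount(y))
print(weights, manual)  # minority class 1 receives the larger weight
```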
Inference Settings
No dedicated inference-time settings. Predictions are fully determined by the trained model.