MARK C. answered 12/29/25
Common approaches for tuning hyperparameters in XGBoost and Random Forest:
Methods:
- Grid Search - try every combination of the specified parameter values; exhaustive but expensive (a minimal sketch follows this list)
- Random Search - randomly sample parameter combinations from specified ranges; usually finds good settings with far fewer trials than a full grid
- Bayesian Optimization - pick each new candidate based on the results of previous trials (e.g., with Optuna or Hyperopt)
- Cross-Validation - not a search strategy on its own; k-fold CV is how each candidate parameter set is scored inside all of the methods above
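For example, grid search combined with k-fold CV is only a few lines with scikit-learn's GridSearchCV. This is a minimal sketch, assuming scikit-learn is installed and using a synthetic dataset in place of your own:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Placeholder data; substitute your own X, y
X, y = make_classification(n_samples=500, n_features=20, random_state=42)

# Small grid: all 3 x 3 = 9 combinations are evaluated with 5-fold CV
param_grid = {
    "n_estimators": [100, 200, 500],
    "max_depth": [None, 10, 30],
}

grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid=param_grid,
    cv=5,                 # k-fold cross-validation per combination
    scoring="accuracy",
    n_jobs=-1,            # parallelize across CPU cores
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```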
Key hyperparameters to tune:
Random Forest (a random-search sketch follows this list):
- n_estimators: number of trees (start with 100-500; more trees mainly cost training time)
- max_depth: maximum tree depth (None for unlimited, or 10-50)
- min_samples_split: minimum samples required to split a node (2-10)
- min_samples_leaf: minimum samples required in a leaf node (1-5)
- max_features: number of features considered at each split (sqrt, log2, or a fraction)
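Here is how those ranges translate into a random search. Again a sketch, assuming scikit-learn and SciPy, with placeholder data standing in for yours:

```python
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Distributions mirror the ranges suggested above
param_dist = {
    "n_estimators": randint(100, 501),       # 100-500
    "max_depth": [None, 10, 20, 30, 50],
    "min_samples_split": randint(2, 11),     # 2-10
    "min_samples_leaf": randint(1, 6),       # 1-5
    "max_features": ["sqrt", "log2", 0.5],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=42),
    param_distributions=param_dist,
    n_iter=50,            # number of random combinations sampled
    cv=5,
    scoring="accuracy",
    n_jobs=-1,
    random_state=42,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```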
XGBoost (a Bayesian-optimization sketch follows this list):
- n_estimators: number of boosting rounds (100-1000)
- learning_rate: step-size shrinkage (0.01-0.3); lower values generally need more boosting rounds
- max_depth: tree depth (3-10)
- min_child_weight: minimum sum of instance weight (hessian) needed in a child (1-10)
- subsample: fraction of training samples used per tree (0.5-1.0)
- colsample_bytree: fraction of features used per tree (0.5-1.0)
- gamma: minimum loss reduction required to make a further split (0-5)
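And a Bayesian-optimization version of the same search using Optuna, which proposes each new trial based on earlier scores. This is a sketch assuming recent optuna and xgboost versions; swap in your own data and trial budget:

```python
import optuna
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

def objective(trial):
    # Search space mirrors the ranges suggested above
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 100, 1000),
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "min_child_weight": trial.suggest_int("min_child_weight", 1, 10),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
        "colsample_bytree": trial.suggest_float("colsample_bytree", 0.5, 1.0),
        "gamma": trial.suggest_float("gamma", 0.0, 5.0),
    }
    model = XGBClassifier(**params, eval_metric="logloss", random_state=42)
    # Score each candidate with 5-fold CV, as in the methods above
    return cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)  # each new trial is guided by earlier results
print(study.best_params, study.best_value)
```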