This interactive demo showcases XGBoost (Extreme Gradient Boosting) for both classification and regression tasks. It lets you explore gradient boosting with regularization controls and visualizations that show how a prediction builds up in real time.
How to Use: Select data → Configure target → Set XGBoost parameters → Enter new point → Run prediction!
Start with sample datasets or upload your own CSV/Excel files.
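If you want to reproduce the demo's workflow outside the app, here is a minimal sketch of the same steps using the `xgboost` scikit-learn wrapper. The dataset, split ratio, and sample point are illustrative stand-ins, not the demo's actual data:

```python
# Minimal sketch of the demo workflow: load data, pick a target,
# train an XGBoost classifier, and predict a new point.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Stand-in for "Select data" / "Target Column": any table with a target works.
data = load_iris(as_frame=True)
X, y = data.data, data.target

# Stand-in for "Data Split Configuration": an 80/20 train/validation split.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

# Stand-in for "XGBoost Parameters": values near the demo's defaults.
model = XGBClassifier(n_estimators=100, learning_rate=0.3, max_depth=6)
model.fit(X_train, y_train)

# Stand-in for "Enter new point -> Run prediction".
new_point = X_val.iloc[[0]]
print(model.predict(new_point))
```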
Interface overview:
- Sample Datasets / Target Column: choose a built-in dataset (or an uploaded file) and the column to predict; the Data Preview panel shows the first 5 rows.
- XGBoost Parameters: sliders for the core hyperparameters, such as the learning rate (0.01-1.0) and the sampling ratios (0.5-1.0).
- Data Split Configuration: a train/validation split slider (0.6-0.9), with an option to display train/validation set information.
- Run Prediction: trains the ensemble and predicts the new point.
- XGBoost Results & Visualization: progress and loss charts, a Select Tree to Visualize dropdown, and an XGBoost Process panel showing how the prediction builds up tree by tree. A code sketch of this wiring follows below.
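The parameter panel maps directly onto XGBoost keyword arguments, and the Loss Chart corresponds to the per-iteration metrics the library records. A sketch, assuming a recent xgboost (≥ 1.6, where `eval_metric` is a constructor argument) and an illustrative dataset:

```python
# Sketch: demo sliders -> XGBoost keyword arguments; Loss Chart -> evals_result().
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

model = XGBClassifier(
    n_estimators=100,      # number of trees (the demo allows up to 100)
    learning_rate=0.3,     # eta; the demo slider covers 0.01-1.0
    max_depth=6,           # default tree depth
    min_child_weight=1,    # regularization controls from the panel
    subsample=0.8,         # assumed values within the 0.5-1.0 slider range
    colsample_bytree=0.8,
    eval_metric="logloss",
)
model.fit(X_train, y_train, eval_set=[(X_train, y_train), (X_val, y_val)], verbose=False)

# Per-iteration losses: the data behind a training/validation loss chart.
history = model.evals_result()
train_loss = history["validation_0"]["logloss"]
val_loss = history["validation_1"]["logloss"]
print(f"final train logloss {train_loss[-1]:.4f}, val logloss {val_loss[-1]:.4f}")
```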
XGBoost Tips:
- Progress Chart: shows how predictions evolve as each tree is added to the ensemble.
- Loss Chart: monitor training and validation loss to understand model convergence.
- Individual Tree Visualization: select any tree to see its structure and contribution.
- Feature Importance: displays which features are most influential across all trees.
- Parameter Tuning: try different numbers of trees (up to 100) and learning rates (0.01-1.0).
- Learning Rate (eta): the default of 0.3 works well; lower values need more trees but generalize better.
- Tree Depth: the default max depth is 6; deeper trees can capture more complex patterns but may overfit.
- Regularization: min_child_weight, subsample, and colsample_bytree help prevent overfitting.
- Tree Analysis: use the tree selector to understand how each tree contributes to the prediction. API counterparts for several of these tips are sketched below.
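The Progress Chart, Feature Importance, and Tree Analysis tips all have direct library counterparts. A self-contained sketch (the dataset and parameter values are again illustrative):

```python
# Sketch: API counterparts to the Progress Chart, Feature Importance,
# and individual-tree tips above.
from sklearn.datasets import load_breast_cancer
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
model = XGBClassifier(n_estimators=20, learning_rate=0.3, max_depth=3).fit(X, y)

# Progress Chart: the prediction after only the first k trees, via iteration_range.
for k in (1, 5, 10, 20):
    p = model.predict_proba(X[:1], iteration_range=(0, k))[0, 1]
    print(f"after {k:2d} trees: P(class 1) = {p:.3f}")

# Feature Importance: gain-based scores aggregated across all trees.
gain = model.get_booster().get_score(importance_type="gain")
print(sorted(gain.items(), key=lambda kv: kv[1], reverse=True)[:5])

# Tree Analysis: inspect any single tree's structure as a DataFrame
# (xgboost.plot_tree draws it graphically if graphviz is installed).
trees = model.get_booster().trees_to_dataframe()
print(trees[trees["Tree"] == 0].head())
```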