In statistics and machine learning, ensemble methods use multiple learning algorithms to obtain better predictive performance than could be obtained from any of the constituent learning algorithms alone (Wiki).
Boosting (Wiki)
- The Strength of Weak Learnability (1990) Robert E. Schapire
- Arcing the Edge (1997) Leo Breiman
- A Short Introduction to Boosting (1999) Yoav Freund, Robert E. Schapire
- Boosting Algorithms as Gradient Descent (2000) Llew Mason, Jonathan Baxter, Peter Bartlett, Marcus Frean
- AdaBoost (Wiki) (a minimal sketch follows this section)
  - Boosting the margin: A new explanation for the effectiveness of voting methods (1998) Robert E. Schapire, Yoav Freund, Peter Bartlett, Wee Sun Lee
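The boosting papers above are theory-focused; as a concrete companion, here is a minimal sketch of discrete AdaBoost built from decision stumps. It is illustrative only, assuming NumPy, scikit-learn, and synthetic data; it is not code from any of the listed papers.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy data with labels in {-1, +1}
X, y = make_classification(n_samples=500, random_state=0)
y = 2 * y - 1

n_rounds = 50
w = np.full(len(X), 1.0 / len(X))  # start with uniform example weights
stumps, alphas = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1)  # weak learner: a decision stump
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = w[pred != y].sum()                     # weighted training error
    if err >= 0.5:                               # no better than chance: stop
        break
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))  # this stump's vote weight
    w *= np.exp(-alpha * y * pred)               # upweight misclassified examples
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

# Final classifier: sign of the weighted vote over all stumps
votes = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print("training accuracy:", (np.sign(votes) == y).mean())
```

Each round reweights the training set so the next stump concentrates on the examples the current ensemble gets wrong, and more accurate stumps receive a larger vote weight.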
Random Forests (Wiki, CRAN, PyPI)
- Random Forests (2001) Leo Breiman
- Overview of Random Forest Methodology and Practical Guidance with Emphasis on Computational Biology and Bioinformatics (2012) Anne-Laure Boulesteix, Silke Janitza, Jochen Kruppa, Inke R. König
- Variable selection using random forests (2010) Robin Genuer, Jean-Michel Poggi, Christine Tuleau-Malot
- Bias in random forest variable importance measures: Illustrations, sources and a solution (2007) Carolin Strobl, Anne-Laure Boulesteix, Achim Zeileis, Torsten Hothorn (see the sketch after this section)
- Package: ranger (CRAN)
- Package: randomForest (CRAN, Paper) (2002) Andy Liaw, Matthew Wiener. Based on work by Leo Breiman and Adele Cutler
- Package: grf (Code, CRAN)
  - Generalized Random Forests (2016) Susan Athey, Julie Tibshirani, Stefan Wager
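As a hands-on companion to the variable-importance papers above (Genuer et al. 2010; Strobl et al. 2007), the sketch below contrasts impurity-based importances with permutation importances on held-out data. It assumes scikit-learn's RandomForestClassifier and permutation_importance with synthetic data; the listed R packages (ranger, randomForest) offer similar importance measures.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10,
                           n_informative=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(X_train, y_train)

# Impurity-based importances are computed on training data and can be
# biased toward variables with many possible split points (Strobl et al. 2007).
print("impurity-based:", np.round(rf.feature_importances_, 3))

# Permutation importance on held-out data is a common, less biased alternative.
perm = permutation_importance(rf, X_test, y_test, n_repeats=20, random_state=0)
print("permutation:   ", np.round(perm.importances_mean, 3))
```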
Gradient Boosting, Stochastic Gradient Boosting (Wiki)
- Greedy Function Approximation: A Gradient Boosting Machine (1999) Jerome H. Friedman
- Stochastic Gradient Boosting (1999) Jerome H. Friedman
- XGBoost: A Scalable Tree Boosting System (2016) Tianqi Chen, Carlos Guestrin
- Out-of-Core GPU Gradient Boosting (2020) Rong Ou
- Package: XGBoost (Wiki, Code) (usage sketch after this list)
- Package: CatBoost
- Package: LightGBM
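To make the gradient boosting entries concrete, here is a minimal training sketch using XGBoost's core Python API with early stopping on a validation set. The data and hyperparameters are illustrative only, not recommendations from the papers above; eta and subsample correspond to the shrinkage and row-subsampling ideas in Friedman (1999).

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

dtrain = xgb.DMatrix(X_train, label=y_train)
dvalid = xgb.DMatrix(X_valid, label=y_valid)

params = {
    "objective": "binary:logistic",
    "eta": 0.1,        # shrinkage (learning rate)
    "max_depth": 4,
    "subsample": 0.8,  # row subsampling, as in the stochastic variant
}

# Add trees until the validation loss stops improving for 20 rounds
booster = xgb.train(params, dtrain, num_boost_round=500,
                    evals=[(dvalid, "valid")],
                    early_stopping_rounds=20, verbose_eval=False)
print("best iteration:", booster.best_iteration)
```

CatBoost and LightGBM expose their own training APIs with comparable learning-rate, depth, and early-stopping controls.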