This article explores the impact of hyperparameters on random forests, both in terms of predictive performance and of how the resulting trees look when visualized. It compares the performance of a default random forest with that of tuned decision trees, and examines the effects of hyperparameters such as `n_estimators`, `max_depth`, and `ccp_alpha` using visualizations of individual trees, their predictions, and their errors.
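As a quick illustration of the hyperparameters named above, the sketch below compares a default scikit-learn `RandomForestClassifier` with one whose `n_estimators`, `max_depth`, and `ccp_alpha` are constrained, then plots one individual tree from each forest. The synthetic dataset and the specific hyperparameter values are illustrative assumptions, not taken from the article.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import plot_tree
import matplotlib.pyplot as plt

# Illustrative synthetic data (not the article's dataset)
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Default random forest: 100 trees, unlimited depth, no pruning
rf_default = RandomForestClassifier(random_state=0)
rf_default.fit(X_train, y_train)

# Constrained forest: fewer, shallower, cost-complexity-pruned trees
# (values chosen only to show the effect of each hyperparameter)
rf_constrained = RandomForestClassifier(
    n_estimators=20,   # fewer trees in the ensemble
    max_depth=4,       # limit the depth of each tree
    ccp_alpha=0.01,    # apply cost-complexity pruning to each tree
    random_state=0,
)
rf_constrained.fit(X_train, y_train)

print("default     :", rf_default.score(X_test, y_test))
print("constrained :", rf_constrained.score(X_test, y_test))

# Visualize one individual tree from each forest
fig, axes = plt.subplots(1, 2, figsize=(14, 5))
plot_tree(rf_default.estimators_[0], ax=axes[0], max_depth=2, filled=True)
plot_tree(rf_constrained.estimators_[0], ax=axes[1], filled=True)
axes[0].set_title("Tree from default forest (top levels only)")
axes[1].set_title("Tree from constrained forest")
plt.show()
```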
Additive Decision Trees are a variation of standard decision trees, constructed in a way that often makes them more accurate, more interpretable, or both. This article explains the intuition behind Additive Decision Trees and how they can be constructed.