Tags: feature importance*


  1. This article explains permutation feature importance (PFI), a popular method for understanding feature importance in explainable AI. The author walks through calculating PFI from scratch using Python and XGBoost, discussing the rationale behind the method and its limitations. (A minimal PFI sketch follows this list.)
  2. A discussion of the efficiency of Random Forest algorithms for PCA and feature importance. By Christopher Karg for Towards Data Science.
  3. Cool question - and yes, you're right that you can use the summary command to inspect feature_importances for some of the models (e.g. RandomForestClassifier). Other models, however, may not support the same type of summary.

    You should also check out the FieldSelector algorithm, which is really useful for this problem. Under the hood, it uses ANOVA and F-tests to estimate the linear dependency between variables. Although it's univariate (not capturing any interactions between features), it can still provide a good baseline for choosing a handful of features from hundreds. (A scikit-learn sketch of both approaches follows this list.)
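The permutation importance procedure described in the first bookmark can be sketched in a few lines: train a model, measure a baseline score, then shuffle one column at a time and record how much the score drops. This is a minimal, illustrative version only; the breast-cancer dataset, the accuracy metric, and the XGBoost settings below are assumptions, not taken from the article.

```python
# Minimal permutation feature importance (PFI) sketch.
# Dataset, metric, and hyperparameters are placeholders.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = XGBClassifier(n_estimators=100, eval_metric="logloss")
model.fit(X_train, y_train)

# Baseline score on the untouched test set.
baseline = accuracy_score(y_test, model.predict(X_test))

rng = np.random.default_rng(0)
importances = {}
for col in X_test.columns:
    scores = []
    for _ in range(5):  # repeat the shuffle to reduce variance
        X_perm = X_test.copy()
        X_perm[col] = rng.permutation(X_perm[col].values)
        scores.append(accuracy_score(y_test, model.predict(X_perm)))
    # Importance = drop in score when this feature's values are shuffled.
    importances[col] = baseline - np.mean(scores)

for col, imp in sorted(importances.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{col}: {imp:.4f}")
```

For the third bookmark, the summary command and FieldSelector are toolkit-specific commands; the sketch below is a plain scikit-learn stand-in for the same two ideas: reading feature_importances_ from a model that exposes it, and univariate ANOVA F-test selection (SelectKBest with f_classif). The dataset and parameters are again placeholders.

```python
# Plain scikit-learn sketch of the two ideas in the answer above:
# 1) model-based importances, 2) univariate ANOVA F-test selection.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# 1) Only some models expose feature_importances_ (e.g. tree ensembles).
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, imp in sorted(zip(X.columns, rf.feature_importances_),
                        key=lambda kv: -kv[1])[:5]:
    print(f"{name}: {imp:.4f}")

# 2) Univariate F-test selection scores each feature on its own,
#    so interactions between features are not captured.
selector = SelectKBest(score_func=f_classif, k=5).fit(X, y)
print("F-test picks:", list(X.columns[selector.get_support()]))
```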


