klotz: shap*

The SHAP explainability algorithm or one of its implementations, such as the Python `shap` library. The SHAP algorithm is based on Shapley values from cooperative game theory.


  1. This article explores the use of Isolation Forest for anomaly detection and how SHAP (KernelSHAP and TreeSHAP) can be applied to explain the anomalies detected, providing insights into which features contribute to anomaly scores.
  2. This article explores how stochastic regularization in neural networks can improve performance on unseen categorical data, especially high-cardinality categorical features. It uses visualizations and SHAP values to understand how entity embeddings respond to this regularization technique.
  3. Generating counterfactual explanations got a lot easier with CFNOW, but what are counterfactual explanations, and how can I use them?
  4. (untitled bookmark, 2021-10-08, by klotz)
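The first bookmark pairs Isolation Forest with SHAP to explain which features drive an anomaly score. A minimal sketch of that idea, using scikit-learn's `IsolationForest` and a crude leave-one-feature-out attribution in place of real SHAP (the `shap` package's KernelSHAP/TreeSHAP would give principled Shapley-value attributions; the data and variable names here are illustrative only):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic data with one planted anomaly in feature 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[0] = [8.0, 0.0, 0.0]

forest = IsolationForest(random_state=0).fit(X)
score = forest.decision_function(X[:1])[0]  # lower = more anomalous

# Leave-one-feature-out attribution: replace each feature of the
# anomalous point with its column mean and measure how much the
# anomaly score recovers; a large recovery means that feature
# contributed heavily. This is a rough stand-in for SHAP values.
contrib = []
for j in range(X.shape[1]):
    X_mod = X[:1].copy()
    X_mod[0, j] = X[:, j].mean()
    contrib.append(forest.decision_function(X_mod)[0] - score)

print(int(np.argmax(contrib)))  # feature 0 should dominate
```

With the `shap` library, `shap.TreeExplainer(forest)` plays the same role but distributes the score across features according to Shapley values rather than this one-at-a-time ablation.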


SemanticScuttle - klotz.me: Tags: shap
