added feature importance answer for boosting (#166)
* added feature importance answer for boosting

* Update theory.md

reverted

---------

Co-authored-by: Roman Polyansky <[email protected]>
Co-authored-by: Alexey Grigorev <[email protected]>
3 people authored May 6, 2024
1 parent 111cc12 commit 15737bc
Showing 1 changed file with 5 additions and 1 deletion.
theory.md: 5 additions & 1 deletion

@@ -719,7 +719,11 @@ Yes, different frameworks provide different options to make training faster, usi

**Feature importance in gradient boosting trees  —  what are possible options? ‍⭐️**

Answer here
With CatBoost you can use the built-in `get_feature_importance` method to compute SHAP values: https://arxiv.org/abs/1905.04610v1

This helps you understand which features could be excluded to get better results; the higher the importance value, the more the feature contributes.
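
A minimal sketch of how this could look, assuming a `CatBoostClassifier` and placeholder `X_train`, `y_train` data that are not part of the original answer:

```python
from catboost import CatBoostClassifier, Pool

# X_train, y_train are placeholders for your training data.
model = CatBoostClassifier(iterations=200, verbose=False)
model.fit(X_train, y_train)

train_pool = Pool(X_train, y_train)

# SHAP values: one row per object, one column per feature,
# plus a final column with the expected (base) prediction.
shap_values = model.get_feature_importance(train_pool, type='ShapValues')

# Default global importance for a quick per-feature ranking.
importances = model.get_feature_importance(train_pool)
```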

You can also add a random noise column (sampled from a normal distribution) to your data, compute the feature importances, and exclude every feature whose importance falls below that of the noise column.
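
A rough sketch of the noise-column trick under the same assumptions (`X_train` as a pandas DataFrame, `y_train`, and the `random_noise` column name are illustrative, not from the original answer):

```python
import numpy as np
import pandas as pd
from catboost import CatBoostClassifier, Pool

rng = np.random.default_rng(42)

# Add a column of pure N(0, 1) noise next to the real features.
X_noisy = X_train.copy()
X_noisy['random_noise'] = rng.normal(size=len(X_noisy))

model = CatBoostClassifier(iterations=200, verbose=False)
model.fit(X_noisy, y_train)

importances = pd.Series(
    model.get_feature_importance(Pool(X_noisy, y_train)),
    index=X_noisy.columns,
)

# Keep only the features that beat pure noise.
noise_level = importances['random_noise']
selected = importances[importances > noise_level].index.tolist()
print(selected)
```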

<br/>

