Random Forest Classifier: Set feature importances?


On an RFC (RandomForestClassifier) model, I am trying to figure out how the feature importances change my classification when I perturb my data, along the lines of

features(no perturbation)= features(perturbed data)-features(perturbation)

Then I would use features(no perturbation) on my already fitted model. Do you know if it is possible to manually set or change the feature importances of an RFC model? I tried searching for this but found no results.

Thank you.

CodePudding user response:

The general convention in scikit-learn is that attributes inferred from your data during training end with a trailing underscore (_). The feature_importances_ attribute respects that convention as well: it represents impurity-based importances and is computed from your training set statistics, so it is not something you set directly.
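A minimal sketch of this, assuming a toy dataset from sklearn.datasets rather than your own data: the importances can be read after fitting, but assigning to the attribute does not reconfigure the model, since it is a read-only property derived from the fitted trees.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=200, n_features=5, random_state=0)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)

    # Impurity-based importances computed from the fitted trees; they sum to 1.
    print(clf.feature_importances_)

    # feature_importances_ is a read-only property derived from the estimators,
    # so assigning to it is expected to raise rather than change the model.
    try:
        clf.feature_importances_ = clf.feature_importances_ * 0.5
    except AttributeError as exc:
        print("Cannot set feature_importances_:", exc)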

You do have the option to act on the weight given to the different samples through the sample_weight argument of fit, as well as to weight your classes through the class_weight parameter.
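A short sketch of where those two knobs plug in, again on a toy dataset; the specific weights below are purely illustrative, not a recipe for undoing a perturbation:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=200, n_features=5, weights=[0.8, 0.2],
                               random_state=0)

    # class_weight rebalances the classes when the trees are grown.
    clf = RandomForestClassifier(n_estimators=100, class_weight="balanced",
                                 random_state=0)

    # sample_weight up- or down-weights individual rows at fit time, which in
    # turn changes the impurity statistics the importances are computed from.
    sample_weight = np.where(y == 1, 2.0, 1.0)
    clf.fit(X, y, sample_weight=sample_weight)

    print(clf.feature_importances_)

Changing these weights and refitting is the supported way to influence the resulting importances, rather than overwriting the attribute on an already fitted model.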
