Is there any way to get variable importance with Keras?

渐次进展 2020-12-23 17:51

I am looking for a proper or best way to get variable importance in a neural network created with Keras. The way I currently do it is that I just take the weights (not the biases)…
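
For reference, the weight-based approach described above might look roughly like the sketch below, which sums the absolute first-layer weights per input feature as a crude importance score. The model architecture and the 8-feature input shape are made up for illustration, not taken from the question.

    # Hypothetical sketch of weight-based importance: sum the absolute
    # weights of the first Dense layer for each input feature.
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential([
        Dense(16, activation='relu', input_shape=(8,)),  # 8 input features (assumed)
        Dense(1),
    ])
    model.compile(optimizer='adam', loss='mse')
    # ... train the model on your data ...

    first_layer_weights = model.layers[0].get_weights()[0]  # shape: (n_features, n_units)
    importance = np.abs(first_layer_weights).sum(axis=1)    # one score per input feature
    print(importance)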

4 Answers
  •  时光取名叫无心
    2020-12-23 18:17

    *Edited to include relevant code to implement permutation importance.

    I answered a similar question at Feature Importance Chart in neural network using Keras in Python. It implements what Teque5 mentioned above, namely permutation importance: shuffling each variable among your samples and measuring the effect on the score, using the ELI5 package.

    from keras.models import Sequential
    from keras.wrappers.scikit_learn import KerasRegressor
    import eli5
    from eli5.sklearn import PermutationImportance
    
    def base_model():
        # Define and compile the Keras model here
        model = Sequential()
        ...
        return model
    
    X = ...
    y = ...
    
    # Wrap the Keras model so it exposes the scikit-learn estimator API
    my_model = KerasRegressor(build_fn=base_model, **sk_params)
    my_model.fit(X, y)
    
    # Shuffle each feature in turn and measure how much the score drops
    perm = PermutationImportance(my_model, random_state=1).fit(X, y)
    eli5.show_weights(perm, feature_names=X.columns.tolist())
    
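    If you prefer not to depend on ELI5, the same idea can be sketched by hand: shuffle one column at a time and record how much the model's score drops. This is a minimal sketch, assuming X is a pandas DataFrame and my_model is the fitted KerasRegressor from the snippet above.
    
    import numpy as np
    
    baseline = my_model.score(X, y)  # scikit-learn-style score (higher is better)
    rng = np.random.RandomState(1)
    importances = {}
    for col in X.columns:
        X_perm = X.copy()
        # Break the link between this feature and the target by shuffling it
        X_perm[col] = rng.permutation(X_perm[col].values)
        importances[col] = baseline - my_model.score(X_perm, y)  # drop in score
    
    # Larger drops indicate more important features
    for col, imp in sorted(importances.items(), key=lambda kv: -kv[1]):
        print(col, round(imp, 4))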
