Is there any way to get variable importance with Keras?

渐次进展 2020-12-23 17:51

I am looking for a proper, or best, way to get variable importance in a neural network created with Keras. The way I currently do it is to just take the weights (not the biases) ...

4 Answers
  •  天命终不由人
    2020-12-23 18:04

    Since everything gets mixed together as it passes through the network, the first layer alone can't tell you the importance of each variable. Later layers can increase or decrease that importance, and can even make one variable affect the importance of another. On top of that, every single neuron in the first layer assigns each variable a different weight, so it's not that straightforward.

    I suggest you call model.predict(inputs) with inputs that are arrays of zeros, setting only the variable you want to study to 1.

    That way, you see the model's output for each variable in isolation. Even so, this still won't help with cases where one variable increases the importance of another variable.
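
    A minimal sketch of that probing idea (not code from the answer itself), assuming a model that takes a flat vector of n_features inputs; the tiny Sequential model below is only a hypothetical stand-in for your own trained network:

        import numpy as np
        from tensorflow import keras

        # Hypothetical stand-in model; replace with your own trained Keras model.
        n_features = 10
        model = keras.Sequential([
            keras.layers.Dense(16, activation="relu", input_shape=(n_features,)),
            keras.layers.Dense(1),
        ])

        # Prediction for the all-zeros input, used as a reference point.
        baseline = model.predict(np.zeros((1, n_features)), verbose=0)

        # Probe each variable in isolation: set only that feature to 1
        # and measure how far the prediction moves from the baseline.
        scores = {}
        for i in range(n_features):
            x = np.zeros((1, n_features))
            x[0, i] = 1.0
            pred = model.predict(x, verbose=0)
            scores[i] = float(np.abs(pred - baseline).sum())

        # Larger shifts suggest the variable has more influence on its own.
        for i, s in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"feature {i}: {s:.4f}")

    As the answer notes, this only probes each variable on its own; interactions between variables are not captured.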
