How to calculate prediction uncertainty using Keras?

Asked by 再見小時候 on 2020-12-13 00:08

I would like to calculate the NN model's certainty/confidence (see What my deep model doesn't know) - when the NN tells me an image represents "8", I would like to know how certain it is.

4 Answers
  • 2020-12-13 00:35

    Your model uses a softmax activation, so the simplest way to obtain some kind of uncertainty measure is to look at the output softmax probabilities:

    probs = model.predict(some_input_data)[0]  # softmax vector for the first sample
    

    The probs array will then be a 10-element vector of numbers in the [0, 1] range that sum to 1.0, so they can be interpreted as probabilities. For example the probability for digit 7 is just probs[7].

    Then, with this information, you can do some post-processing: typically the predicted class is the one with the highest probability, but you can also look at the class with the second-highest probability, and so on.
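
    For instance, a minimal sketch of such post-processing, assuming probs has been computed as above for a 10-class model:

    import numpy as np

    top_two = np.argsort(probs)[::-1][:2]  # classes ordered by descending probability
    best, second = top_two
    print("Predicted %d with p=%.3f (runner-up: %d with p=%.3f)"
          % (best, probs[best], second, probs[second]))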

  • 2020-12-13 00:49

    Made a few changes to the top voted answer. Now it works for me.

    It's a way to estimate model uncertainty. For other sources of uncertainty, I found https://eng.uber.com/neural-networks-uncertainty-estimation/ helpful.

    import numpy as np
    import keras.backend as K

    # Build a function that runs the model with the learning phase forced to 1,
    # so dropout stays active at prediction time.
    f = K.function([model.layers[0].input, K.learning_phase()],
                   [model.layers[-1].output])


    def predict_with_uncertainty(f, x, n_iter=10):
        result = []

        for i in range(n_iter):
            result.append(f([x, 1])[0])  # [0]: K.function returns a list of outputs

        result = np.array(result)        # shape: (n_iter, batch_size, n_outputs)

        prediction = result.mean(axis=0)
        uncertainty = result.var(axis=0)
        return prediction, uncertainty
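
    A minimal usage sketch (assuming a trained classifier model and a test batch x_test; the names are illustrative):

    # Monte Carlo dropout: aggregate several stochastic forward passes.
    prediction, uncertainty = predict_with_uncertainty(f, x_test, n_iter=20)

    print(prediction.shape)   # (batch_size, n_outputs): mean output across passes
    print(uncertainty.shape)  # (batch_size, n_outputs): per-output variance across passes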
    
  • 2020-12-13 00:58

    If you want to implement the dropout approach to measure uncertainty, you should do the following:

    1. Implement a function which applies dropout during test time as well:

      import keras.backend as K

      # Passing K.learning_phase() as an extra input lets us force the learning
      # phase to 1 at call time, which keeps dropout active during prediction.
      f = K.function([model.layers[0].input, K.learning_phase()],
                     [model.layers[-1].output])
      
    2. Use this function as an uncertainty predictor, e.g. in the following manner:

      import numpy

      def predict_with_uncertainty(f, x, n_iter=10):
          # One slot per stochastic forward pass; the last axis is the size of
          # the model's output layer (not the input shape).
          result = numpy.zeros((n_iter, x.shape[0], model.output_shape[-1]))

          for i in range(n_iter):
              result[i] = f([x, 1])[0]  # learning phase 1 keeps dropout active

          prediction = result.mean(axis=0)
          uncertainty = result.var(axis=0)
          return prediction, uncertainty
      

    Of course, you may use a different function to compute the uncertainty.
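
    For example, a common alternative uncertainty score is the predictive entropy of the mean softmax output across the stochastic passes. A minimal sketch, assuming the same f and x as above (the names are illustrative):

      import numpy as np

      def predict_entropy(f, x, n_iter=10):
          # Average the softmax outputs of n_iter dropout-enabled forward passes,
          # then score each sample by the entropy of that mean distribution.
          probs = np.stack([f([x, 1])[0] for _ in range(n_iter)]).mean(axis=0)
          entropy = -np.sum(probs * np.log(probs + 1e-12), axis=-1)
          return probs, entropy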

  • 2020-12-13 00:58

    A simpler way is to set training=True on any dropout layers you want to run during inference as well (this essentially tells the layer to operate as if it were always in training mode, so dropout is applied during both training and inference).

    import keras
    
    inputs = keras.Input(shape=(10,))
    x = keras.layers.Dense(3)(inputs)
    outputs = keras.layers.Dropout(0.5)(x, training=True)  # dropout stays active at inference
    
    model = keras.Model(inputs, outputs)
    

    The code above is from this issue.
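
    With dropout left on at inference, repeated predictions on the same input differ, so they can be aggregated just as in the other answers. A minimal sketch, assuming some preprocessed input batch x (the name is illustrative):

    import numpy as np

    # Every predict call is stochastic because Dropout runs with training=True.
    samples = np.stack([model.predict(x) for _ in range(20)])

    prediction = samples.mean(axis=0)   # averaged output
    uncertainty = samples.var(axis=0)   # spread across the stochastic passes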
