How to use k-fold cross validation in a neural network

甜味超标 2020-12-04 08:48

We are writing a small ANN which is supposed to categorize 7000 products into 7 classes based on 10 input variables.

In order to do this we have to use k-fold cross validation.

2 Answers
  •  孤城傲影
    2020-12-04 09:24

    1. Divide your data into K non-overlapping folds. Each fold should contain an equal number of items from each of the m classes (stratified cross-validation): if you have 100 items from class A and 50 from class B and you do 2-fold validation, each fold should contain a random 50 items from A and 25 from B.

    2. For i in 1..k:

        • Designate fold i the test fold
        • Designate one of the remaining k-1 folds the validation fold (this can be random or a function of i; it doesn't really matter)
        • Designate all remaining folds the training folds
        • Do a grid search over the free parameters (e.g. learning rate, number of neurons in the hidden layer), training on the training folds and computing the loss on the validation fold. Pick the parameters that minimise validation loss
        • Evaluate test loss on fold i with the classifier trained using the winning parameters. Accumulate the results
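The stratified split in step 1 can be sketched in plain Python. This is one simple way to realise it (round-robin assignment of shuffled per-class indices); `stratified_folds` is a hypothetical helper, not something from the answer:

```python
import random
from collections import defaultdict

def stratified_folds(labels, k, seed=0):
    """Assign each item index to one of k non-overlapping folds,
    keeping class proportions roughly equal across folds."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    folds = [[] for _ in range(k)]
    for cls, idxs in by_class.items():
        rng.shuffle(idxs)                  # random within each class
        for pos, idx in enumerate(idxs):
            folds[pos % k].append(idx)     # deal out round-robin
    return folds

# The example from the answer: 100 items of class A, 50 of class B, k=2.
labels = ["A"] * 100 + ["B"] * 50
folds = stratified_folds(labels, k=2)
for f in folds:
    n_a = sum(labels[i] == "A" for i in f)
    n_b = sum(labels[i] == "B" for i in f)
    print(n_a, n_b)  # each fold: 50 from A, 25 from B
```

In practice a library routine such as scikit-learn's `StratifiedKFold` does the same job.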

    You have now collected aggregate results across all the folds. This is your final performance. If you're going to apply this for real, in the wild, use the best parameters from the grid search to train on all the data.
