Applying k-fold Cross Validation model using caret package

Unresolved · 3 answers · 1261 views

Asked by 上瘾入骨i, 2020-12-23 15:50

Let me start by saying that I have read many posts on Cross Validation and it seems there is much confusion out there. My understanding is simply this:

3 Answers
  • 2020-12-23 15:58

    An important thing to note here is not to confuse model selection with model error estimation.

    You can use cross-validation to estimate the model hyper-parameters (regularization parameter for example).

    Usually that is done with 10-fold cross-validation, because it is a good choice for the bias-variance trade-off (2-fold could produce models with high bias; leave-one-out CV can produce models with high variance/over-fitting).

    After that, if you don't have an independent test set, you can estimate an empirical distribution of some performance metric using cross-validation: once you have found the best hyper-parameters, you can use them to estimate the CV error.

    Note that in this step the hyper-parameters are fixed, but the model parameters may differ across the cross-validation models.
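
    The procedure above can be sketched in caret. This is only an illustrative sketch: the data frame `mydat`, its response column `resp`, and the choice of a lasso (`glmnet`) model are hypothetical stand-ins, not from the question.

    ```r
    # Sketch: tune a regularization parameter with 10-fold CV, then read
    # the CV estimate of performance for the selected value.
    # Assumes the caret and glmnet packages are installed; 'mydat' and
    # 'resp' are placeholder names.
    library(caret)

    ctrl <- trainControl(method = "cv", number = 10)

    fit <- train(resp ~ ., data = mydat,
                 method    = "glmnet",
                 trControl = ctrl,
                 tuneGrid  = expand.grid(alpha  = 1,   # lasso penalty
                                         lambda = 10^seq(-4, 0, length = 20)))

    fit$bestTune   # the regularization parameter chosen by CV
    fit$results    # CV performance metric for every lambda tried
    ```

    Here the hyper-parameter (lambda) is what CV selects; the fitted coefficients still differ from fold to fold, as noted above.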

  • 2020-12-23 16:03

    When you perform k-fold cross-validation you are already making a prediction for each sample, just over 10 different models (presuming k = 10). There is no need to make predictions on the complete data, as you already have predictions from the k different models.

    What you can do is the following:

    train_control <- trainControl(method = "cv", number = 10, savePredictions = TRUE)
    

    Then

    model <- train(resp ~ ., data = mydat, trControl = train_control, method = "rpart")
    

    If you want to see the observed values and predictions in a nice format, simply type:

    model$pred
    

    Also, for the second part of your question, caret should handle all the parameter tuning. You can manually tune parameters if you desire.
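
    A sketch of the manual route, reusing the `train_control` object from above (the `cp` candidate values here are arbitrary examples, and `mydat`/`resp` are the same placeholder names):

    ```r
    # Supply the candidate values of rpart's complexity parameter 'cp'
    # yourself via tuneGrid, instead of letting caret build its own grid.
    grid <- data.frame(cp = c(0.001, 0.01, 0.05, 0.1))

    model <- train(resp ~ ., data = mydat,
                   method    = "rpart",
                   trControl = train_control,
                   tuneGrid  = grid)

    model$bestTune   # the cp value that won the cross-validation
    ```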

  • 2020-12-23 16:23

    On the first page of the short introduction document for the caret package, it is mentioned that the optimal model is chosen across the parameters. As a starting point, one must understand that cross-validation is a procedure for selecting the best modeling approach rather than the model itself (see CV - Final model selection). caret provides a grid search option via tuneGrid, where you can provide a list of parameter values to test. The final model will carry the optimized parameters after training is done.
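
    To make that last point concrete, here is a hedged sketch (again with the hypothetical `mydat`/`resp` names): after `train()` finishes, the returned object keeps both the winning hyper-parameters and a final model refit on the full data using them.

    ```r
    # Assumes caret and rpart are installed; 'mydat' and 'resp' are
    # placeholders for your data and response column.
    library(caret)

    fit <- train(resp ~ ., data = mydat,
                 method    = "rpart",
                 trControl = trainControl(method = "cv", number = 10),
                 tuneGrid  = data.frame(cp = c(0.01, 0.05, 0.1)))

    fit$bestTune     # the cp value selected by 10-fold CV
    fit$finalModel   # rpart tree refit on all of 'mydat' with that cp
    ```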
