Azure ML Tune Model Hyper Parameters


Question


Here's a question posed at the end of a chapter in the 70-774 exam reference book.

If you connect a neural network with a Tune Model Hyperparameters module configured with Random Sweep and Maximum number of runs on random sweep = 1, how many neural networks are trained during the execution of the experiment? Why? If you connect a validation dataset to the third input of the Tune Model Hyperparameters module, how many neural networks are trained now?

And the answer is:

Without a validation dataset, 11 (10 from k-fold cross-validation + 1 trained on all the data with the best combination of hyperparameters). With the validation set, only 1 neural network is trained, so the best model is not trained using the validation set if you provide it.

Where does the 10 come from? As far as I understand, the numbers should be 2 and 1, respectively. Shouldn't it create n folds, where n is equal to the number of runs?


Answer 1:


When you use the Tune Model Hyperparameters module without a validation dataset, that is, when you connect data only to the 2nd input port, the module works in cross-validation mode. The best-parameter model is found by cross-validating over the provided dataset, and to do this the dataset is split into k folds. By default, the module splits the data into 10 folds. If you want a different number of folds, you can connect a Partition and Sample module to the 2nd input, select Assign to Folds, and set the desired number of folds. In many cases k = 5 is a reasonable choice.
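The Tune Model Hyperparameters module is configured in the Azure ML designer rather than in code, but the same counting logic can be illustrated with a scikit-learn analogy (this is only a sketch of the arithmetic, not Azure ML's actual implementation): a random sweep with 1 sampled hyperparameter combination and 10-fold cross-validation fits 1 × 10 = 10 models, and refitting on all the data adds 1 more, for 11 in total.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Toy dataset standing in for the module's training data input.
X, y = make_classification(n_samples=500, random_state=0)

search = RandomizedSearchCV(
    MLPClassifier(max_iter=200),
    param_distributions={"hidden_layer_sizes": [(10,), (50,), (100,)]},
    n_iter=1,      # analogous to "Maximum number of runs on random sweep" = 1
    cv=10,         # analogous to the module's default of 10 folds
    refit=True,    # final model retrained on all the data -> the "+1"
    random_state=0,
)
search.fit(X, y)   # 10 cross-validation fits + 1 refit = 11 networks trained
```

Changing cv here plays the same role as connecting a Partition and Sample module with Assign to Folds: it only changes the number of folds, not the number of sweep runs.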



Source: https://stackoverflow.com/questions/52705769/azure-ml-tune-model-hyper-parameters
