caret::train: specify further non-tuning parameters for mlpWeightDecay (RSNNS package)


It looks like you can pass your own learnFuncParams through train()'s "..." argument. caret checks whether you have provided your own set of parameters and will only override learnFuncParams[3] (which is the decay); the values you supply for elements 1, 2 and 4 are kept as-is.
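
For example, something along these lines should work (a minimal sketch; trainX, trainY, the layer size, the decay grid and the learnFuncParams values are placeholders, not taken from your question):

    library(caret)

    ## pass RSNNS's learnFuncParams through train()'s "..." argument;
    ## caret keeps elements 1, 2 and 4 and overwrites element 3 with the
    ## decay value it is currently tuning
    fit <- train(x = trainX, y = trainY,
                 method          = "mlpWeightDecay",
                 tuneGrid        = expand.grid(size = 5, decay = c(0, 0.01, 0.1)),
                 learnFuncParams = c(0.1, 0, 0, 0),  # element 3 gets replaced anyway
                 maxit           = 200)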

A very convenient way to find out what caret does is to type getModelInfo("mlpWeightDecay") and then scroll up to the $mlpWeightDecay$fit part. It shows how caret will call the real training function:

$mlpWeightDecay$fit
    if (any(names(theDots) == "learnFuncParams")) {
        prms <- theDots$learnFuncParams
        prms[3] <- param$decay
        warning("Over-riding weight decay value in the 'learnFuncParams' argument you passed in. Other values are retained")
    }

It checks if you've provided your own learnFuncParams. If you did, it uses your values but substitutes the decay value it is currently tuning. The warning is expected and can be ignored.

I think the error you got ("final tuning parameters could not be determined") has a different cause. Have you tried a lower learning rate?
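
A quick way to try that, sticking with the sketch above (again hypothetical values; the learning rate is assumed to be the first element of learnFuncParams, as is usual for the SNNS backpropagation variants):

    ## same call as before, but with a smaller learning rate in learnFuncParams[1]
    ## and more iterations to compensate
    fit_slow <- train(x = trainX, y = trainY,
                      method          = "mlpWeightDecay",
                      tuneGrid        = expand.grid(size = 5, decay = c(0, 0.01, 0.1)),
                      learnFuncParams = c(0.01, 0, 0, 0),
                      maxit           = 500)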
