How is the hidden layer size determined for MLPRegressor in SciKitLearn?

别说谁变了你拦得住时间么 submitted on 2021-02-05 09:43:02

Question


Let's say I'm creating a neural net using the following code:

from sklearn.neural_network import MLPRegressor

model = MLPRegressor(
  hidden_layer_sizes=(100,),
  activation='identity'
)
model.fit(X_train, y_train)

For the hidden_layer_sizes, I simply set it to the default. However, I don't really understand how it works. What is the number of hidden layers in my definition? Is it 100?


Answer 1:


From the docs:

hidden_layer_sizes : tuple, length = n_layers - 2, default (100,)

The ith element represents the number of neurons in the ith hidden layer.

It has length = n_layers - 2 because the number of hidden layers is the total number of layers, n_layers, minus 1 for the input layer and minus 1 for the output layer.

In your (default) case of (100,), it means one hidden layer of 100 units (neurons).
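You can confirm this by fitting a model and inspecting its `n_layers_` attribute, which counts the input and output layers as well as the hidden ones. A quick sketch (the synthetic `X`, `y` data here are placeholders for illustration, not from the question):

```python
import warnings

import numpy as np
from sklearn.exceptions import ConvergenceWarning
from sklearn.neural_network import MLPRegressor

# Tiny synthetic dataset just to make fit() run.
rng = np.random.default_rng(0)
X = rng.random((50, 4))
y = rng.random(50)

# Suppress convergence warnings from the deliberately short training run.
warnings.filterwarnings("ignore", category=ConvergenceWarning)

model = MLPRegressor(hidden_layer_sizes=(100,), activation='identity', max_iter=5)
model.fit(X, y)

# n_layers_ = input layer + hidden layers + output layer
# = 1 + 1 + 1 = 3, so hidden layers = n_layers_ - 2 = 1.
print(model.n_layers_)  # 3
```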

For 3 hidden layers of, say, 100, 50, and 25 units respectively, it would be

hidden_layer_sizes = (100, 50, 25)
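One way to see the resulting architecture is to inspect the fitted model's `coefs_` attribute, which holds one weight matrix per connection between consecutive layers. A sketch with synthetic data (4 input features, 1 output):

```python
import warnings

import numpy as np
from sklearn.exceptions import ConvergenceWarning
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((50, 4))   # 4 input features
y = rng.random(50)        # 1 output target

warnings.filterwarnings("ignore", category=ConvergenceWarning)

model = MLPRegressor(hidden_layer_sizes=(100, 50, 25), max_iter=5)
model.fit(X, y)

# One weight matrix per layer-to-layer connection:
# input->h1, h1->h2, h2->h3, h3->output
print([w.shape for w in model.coefs_])
# [(4, 100), (100, 50), (50, 25), (25, 1)]
```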

See the example in the docs (it is for MLPClassifier, but the logic is identical).



Source: https://stackoverflow.com/questions/55786860/how-is-the-hidden-layer-size-determined-for-mlpregressor-in-scikitlearn
