How to use the softmax activation function at the output layer, but ReLUs in the middle layers, in TensorFlow?


Question


I have a neural net with 3 hidden layers (so 5 layers in total). I want to use Rectified Linear Units in each of the hidden layers, but at the outermost layer I want to apply softmax to the logits. I want to use DNNClassifier. I have read the official TensorFlow documentation, where the activation_fn parameter is described as:

activation_fn: Activation function applied to each layer. If None, will use tf.nn.relu.

I know I can always write my own model and use any arbitrary combination of activation functions. But since DNNClassifier is more concise, I would like to resort to it. So far I have:

classifier = tf.contrib.learn.DNNClassifier(
    feature_columns=features_columns,
    hidden_units=[10, 20, 10],
    n_classes=3,
    # activation_fn=...  I want something like the line below:
    # activation_fn=[tf.nn.relu, tf.nn.relu, tf.nn.relu, tf.nn.softmax]
)

Answer 1:


Sorry to say, but this is not possible using a single DNNClassifier. As you show in your example, you can supply an activation_fn:

Activation function applied to each layer. If None, will use tf.nn.relu.

But you cannot supply a separate one for each layer. To solve your problem, you have to chain this classifier with another layer that applies the softmax activation function, or write your own model function.
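If you go down the custom-model route, a minimal sketch (assuming TensorFlow 1.x with tf.estimator and tf.layers) could look like the following. The feature key "x", the layer sizes mirroring hidden_units=[10, 20, 10], and the Adam optimizer are just assumptions for illustration, not anything DNNClassifier prescribes:

import tensorflow as tf

def model_fn(features, labels, mode):
    # Assumes a single dense input feature named "x" (illustrative choice).
    net = features["x"]
    # ReLU in every hidden layer.
    for units in [10, 20, 10]:
        net = tf.layers.dense(net, units, activation=tf.nn.relu)
    # Raw logits for 3 classes; no activation here.
    logits = tf.layers.dense(net, 3)
    # Softmax applied only at the outermost layer.
    probabilities = tf.nn.softmax(logits)

    predictions = {
        "classes": tf.argmax(logits, axis=1),
        "probabilities": probabilities,
    }
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode, predictions=predictions)

    # The cross-entropy loss expects the raw logits, not the softmax output.
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    train_op = tf.train.AdamOptimizer().minimize(
        loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

classifier = tf.estimator.Estimator(model_fn=model_fn)

Note that the loss is computed from the logits rather than the softmax probabilities; the softmax output is only exposed through the predictions dictionary.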



Source: https://stackoverflow.com/questions/42697341/how-to-use-softmax-activation-function-at-the-output-layer-but-relus-in-the-mid
