I am used to using tf.contrib.layers.fully_connected to build fully connected layers. Recently I ran into tf.layers.dense, apparently used where the first function could be. Are the two interchangeable?
They are essentially the same, the former calling the latter under the hood. However, tf.contrib.layers.fully_connected adds a few functionalities on top of dense, in particular the possibility to pass a normalization and an activation in the parameters, à la Keras. As noted by @wordforthewise, mind that fully_connected defaults its activation to tf.nn.relu, whereas dense defaults to a linear (identity) activation.
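For intuition, both functions compute the same affine map x·W + b and differ mainly in their default activation. A minimal NumPy sketch of that behavior (the function names and shapes here are illustrative, not the TF implementations):

```python
import numpy as np

def relu(z):
    # equivalent of tf.nn.relu
    return np.maximum(z, 0.0)

def dense(x, W, b, activation=None):
    # tf.layers.dense-style: activation defaults to None (linear)
    y = x @ W + b
    return activation(y) if activation else y

def fully_connected(x, W, b, activation=relu):
    # tf.contrib.layers.fully_connected-style: activation defaults to ReLU
    y = x @ W + b
    return activation(y) if activation else y

x = np.array([[1.0, -2.0]])
W = np.array([[1.0], [1.0]])
b = np.zeros(1)

print(dense(x, W, b))            # [[-1.]] : linear output passes through
print(fully_connected(x, W, b))  # [[0.]]  : ReLU clips the negative value
```

So a fully_connected call without an explicit activation_fn is only equivalent to dense if you pass activation=tf.nn.relu to the latter.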
More generally, the TF API offers (and mixes somewhat confusingly) low- and high-level APIs; more on that here.