There are two ways to specify an activation function in Keras:
1. Via the activation parameter of a Keras layer:
For example, the convolutional layers below set the activation function to ReLU:
from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D

model = Sequential()
model.add(Conv2D(kernel_size=(9, 9),
                 activation="relu",
                 filters=48,
                 strides=(4, 4),
                 input_shape=input_shape))  # input_shape must be defined beforehand, e.g. (227, 227, 3)
model.add(MaxPooling2D((3, 3), strides=(2, 2), padding='same'))
model.add(Conv2D(strides=(1, 1), kernel_size=(3, 3), activation="relu", filters=128))
model.add(Conv2D(strides=(1, 1), kernel_size=(3, 3), activation="relu", filters=128))
model.add(MaxPooling2D((3, 3), strides=(2, 2), padding='same'))
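If you want to confirm which activation each layer ended up with, one option (a minimal sketch, assuming the model built above) is to read each layer's activation attribute; pooling layers have none, so getattr is used:

# Minimal sketch: print the activation of every layer in the model above.
for layer in model.layers:
    act = getattr(layer, "activation", None)  # pooling layers have no activation attribute
    if act is not None:
        print(layer.name, act.__name__)       # e.g. conv2d relu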
2. Via a standalone Activation layer:
from keras.layers import Activation, Dense

model.add(Dense(64))
model.add(Activation('sigmoid'))
This is equivalent to passing the activation parameter directly (note: activation, not activity_regularizer, which takes a regularizer such as 'l1'):

model.add(Dense(64, activation='sigmoid'))
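As a quick sanity check of this equivalence (a sketch only; the input_shape=(10,) and the random input are illustrative assumptions, not from the original post), the two styles produce identical outputs once the Dense weights are shared:

import numpy as np
from keras.models import Sequential
from keras.layers import Activation, Dense

# Two small models: one with a separate Activation layer,
# the other with the activation parameter set directly.
m1 = Sequential([Dense(64, input_shape=(10,)), Activation('sigmoid')])
m2 = Sequential([Dense(64, activation='sigmoid', input_shape=(10,))])

# Copy the Dense weights so the comparison is meaningful.
m2.layers[0].set_weights(m1.layers[0].get_weights())

x = np.random.rand(1, 10).astype('float32')
# The two styles are computationally identical.
print(np.allclose(m1.predict(x), m2.predict(x)))  # True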
Source: https://blog.csdn.net/zfjBIT/article/details/91790183