Keras ValueError: No gradients provided for any variable


Question


I've read related threads but not been able to solve my problem.

I'm currently trying to get my model to run in order to classify 5000 different events, which all currently fall under the same category (so my "labels" dataset consists of 5000 1s).

I'm using one hot encoding for my labels data set:

import numpy as np

labels = np.loadtxt("/content/drive/My Drive/5000labels1.csv")

from keras.utils import to_categorical
labels=to_categorical(labels) # convert labels to one-hot encoding

I then define my model like so:

import tensorflow as tf
from tensorflow import keras

inputs = keras.Input(shape=(29,29,1))

x=inputs

x = keras.layers.Conv2D(16, kernel_size=(3,3), name='Conv_1')(x)
x = keras.layers.LeakyReLU(0.1)(x)      
x = keras.layers.MaxPool2D((2,2), name='MaxPool_1')(x)

x = keras.layers.Conv2D(16, kernel_size=(3,3), name='Conv_2')(x)
x = keras.layers.LeakyReLU(0.1)(x)
x = keras.layers.MaxPool2D((2,2), name='MaxPool_2')(x)

x = keras.layers.Conv2D(32, kernel_size=(3,3), name='Conv_3')(x)
x = keras.layers.LeakyReLU(0.1)(x)
x = keras.layers.MaxPool2D((2,2), name='MaxPool_3')(x)
x = keras.layers.Flatten(name='Flatten')(x)

x = keras.layers.Dense(64, name='Dense_1')(x)
x = keras.layers.ReLU(name='ReLU_dense_1')(x)
x = keras.layers.Dense(64, name='Dense_2')(x)
x = keras.layers.ReLU(name='ReLU_dense_2')(x)

outputs = keras.layers.Dense(4, activation='softmax', name='Output')(x)

model = keras.Model(inputs=inputs, outputs=outputs, name='VGGlike_CNN')
model.summary()

keras.utils.plot_model(model, show_shapes=True)

OPTIMIZER = tf.keras.optimizers.Adam(learning_rate=LR_ST)

model.compile(optimizer=OPTIMIZER,
              loss='categorical_crossentropy',
              metrics=['accuracy'],
              run_eagerly=False)

def lr_decay(epoch):
  if epoch < 10:
    return LR_ST
  else:
    return LR_ST * tf.math.exp(0.2 * (10 - epoch))

lr_scheduler = keras.callbacks.LearningRateScheduler(lr_decay)


model_checkpoint = keras.callbacks.ModelCheckpoint(
        filepath='mycnn_best',
        monitor='val_accuracy',
        save_weights_only=True, 
        save_best_only=True,
        save_freq='epoch')

callbacks = [ lr_scheduler, model_checkpoint ]    

print('X_train.shape = ',X_train.shape)

history = model.fit(X_train, epochs=50,
                    validation_data=X_test, shuffle=True, verbose=1,
                    callbacks=callbacks)

I get the error: "No gradients provided for any variable: ['Conv_1_2/kernel:0', 'Conv_1_2/bias:0', 'Conv_2_2/kernel:0', 'Conv_2_2/bias:0', 'Conv_3_2/kernel:0', 'Conv_3_2/bias:0', 'Dense_1_2/kernel:0', 'Dense_1_2/bias:0', 'Dense_2_2/kernel:0', 'Dense_2_2/bias:0', 'Output_2/kernel:0', 'Output_2/bias:0']. "

From what I've read, it seems most likely due to a problem with the loss function, but I don't understand what that problem could be. Eventually I want the network to classify events into one of 4 categories, so I used categorical cross-entropy to get a probability for each category.

Can anyone help me? If needed I can provide a link to the Google Colab notebook with my original code.

Thanks in advance!


Answer 1:


You're missing your target (the labels). model.fit needs the labels as well as the inputs, and validation_data expects a tuple of inputs and labels:

model.fit(X_train, y_train, ..., validation_data = (X_test, y_test))
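For completeness, a minimal sketch of how the call might look once the labels are split alongside the features. The original post never shows how X_train and X_test were produced, so the train_test_split usage and the variable name images are assumptions for illustration only:

from sklearn.model_selection import train_test_split

# Assumed split: keep the image array and the one-hot labels aligned
# so that each training example has a matching target row.
X_train, X_test, y_train, y_test = train_test_split(
    images, labels, test_size=0.2, shuffle=True)

history = model.fit(X_train, y_train,
                    epochs=50,
                    validation_data=(X_test, y_test),
                    shuffle=True, verbose=1,
                    callbacks=callbacks)

Without y_train, Keras has no targets to compute the loss against, so it cannot produce gradients for any variable, which is exactly the error reported above.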


Source: https://stackoverflow.com/questions/62641596/keras-valueerror-no-gradients-provided-for-any-variable
