Error when checking target: expected dense_3 to have shape (3,) but got array with shape (1,)

不思量自难忘° 2020-12-04 17:26

I am working on training a VGG16-like model in Keras, on a 3-class subset of Places205, and encountered the following error:

ValueError: Error when checking target: expected dense_3 to have shape (3,) but got array with shape (1,)

9 Answers
  • 2020-12-04 18:03

    The reason for this is that class_mode='binary' was used in the data generator feeding fit_generator() for a multi-class problem. Change it to 'categorical' and the error goes away; see the sketch below.
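
    For illustration, a minimal sketch of that change, assuming the images are read with ImageDataGenerator.flow_from_directory (the directory path, image size, batch size and the model itself are placeholders):

    from keras.preprocessing.image import ImageDataGenerator

    train_datagen = ImageDataGenerator(rescale=1./255)

    # One sub-folder per class under 'data/train' (3 classes here)
    train_generator = train_datagen.flow_from_directory(
        'data/train',               # placeholder path
        target_size=(224, 224),
        batch_size=32,
        class_mode='categorical')   # was 'binary'; 'categorical' yields one-hot targets of shape (3,)

    # 'model' is assumed to be an already compiled model with a 3-unit softmax output
    model.fit_generator(train_generator, steps_per_epoch=100, epochs=10)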

  • 2020-12-04 18:10

    Had the same issue. To solve it, simply change class_mode from 'binary' to 'categorical' in both train_generator and validation_generator; with 3 classes the problem is not binary.

  • 2020-12-04 18:15

    I also got the same error and solved it by setting class_mode to 'categorical' instead of 'binary'.

  • 2020-12-04 18:16

    The problem is with the shape of the labels "Y".
    The labels have shape (m,), i.e. integer class indices, and this will not work with:

    loss = "binary_crossentropy"
    

    If you don't want to change the shape of the labels, then use (see the sketch below):

    loss = "sparse_categorical_crossentropy"
    
  • 2020-12-04 18:18

    As mentioned by others, Keras expects one-hot encoded labels in multi-class problems (when training with categorical_crossentropy).

    Keras comes with a handy function to recode labels:

    print(train_labels)
    [1. 2. 2. ... 1. 0. 2.]
    
    print(train_labels.shape)
    (2000,)
    

    Recode the labels using to_categorical to get the correct target shape:

    from keras.utils import to_categorical
    train_labels = to_categorical(train_labels)
    
    print(train_labels)
    [[0. 1. 0.]
     [0. 0. 1.]
     [0. 0. 1.]
     ...
     [0. 1. 0.]
     [1. 0. 0.]
     [0. 0. 1.]]
    
    print(train_labels.shape)
    (2000, 3)  # i.e. 2000 observations, 3 classes one-hot encoded
    

    Other important things to change/check in the multi-class case (compared to binary classification):

    Set class_mode='categorical' in the data generator(s), e.g. flow_from_directory().

    Don't forget that the last dense layer must have one unit per label (or class):

    model.add(layers.Dense(3, activation='softmax'))
    

    Make sure that activation= and loss= are chosen to suit multi-class problems; usually this means activation='softmax' in the last layer and loss='categorical_crossentropy'. A minimal sketch tying these pieces together follows.
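
    The following is only an illustrative skeleton of the multi-class-specific pieces; the convolutional base is omitted and the input shape and layer sizes are placeholders:

    from keras import models, layers

    model = models.Sequential()
    model.add(layers.Flatten(input_shape=(7, 7, 512)))   # placeholder: output of a VGG16-like conv base
    model.add(layers.Dense(256, activation='relu'))
    model.add(layers.Dense(3, activation='softmax'))      # 3 classes -> 3 output units

    model.compile(optimizer='rmsprop',
                  loss='categorical_crossentropy',        # pairs with one-hot labels and softmax
                  metrics=['acc'])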

  • 2020-12-04 18:19

    For me, this worked.

    from keras.utils import to_categorical

    num_labels = 10  # number of classes in my case

    # pass the class count explicitly so train and test labels get the same width
    train_labels = to_categorical(train_labels, num_labels)
    test_labels = to_categorical(test_labels, num_labels)


    Specifying the number of labels as an argument while one-hot encoding them gave the targets the expected shape and let the model train effectively on my training set.
