keras-layer

Transfer learning: wrong Dense layer shape

Submitted by 谁说胖子不能爱 on 2019-12-24 06:10:50
Question: I am trying to apply transfer learning to my ANN for image classification. I found an example of it and would like to personalize the network. Here are the main blocks of code:

model = VGG19(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
batch_size = 16

for layer in model.layers[:5]:
    layer.trainable = False

x = model.output
x = Flatten()(x)
x = Dense(1024, activation="relu")(x)
x = Dense(1024, activation="relu")(x)
predictions = Dense(16, activation="sigmoid")(x)
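A minimal sketch of how such a head is typically tied back into a trainable model, assuming 16 mutually exclusive classes (in that case softmax with categorical_crossentropy is the usual pairing; the sigmoid above fits multi-label targets with binary_crossentropy instead):

from keras.applications.vgg19 import VGG19
from keras.layers import Dense, Flatten
from keras.models import Model

base = VGG19(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
for layer in base.layers[:5]:
    layer.trainable = False  # freeze the earliest conv layers

x = Flatten()(base.output)
x = Dense(1024, activation='relu')(x)
x = Dense(1024, activation='relu')(x)
predictions = Dense(16, activation='softmax')(x)  # softmax: single-label assumption

# Model() joins the frozen base and the new head into one trainable graph
model_final = Model(inputs=base.input, outputs=predictions)
model_final.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])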

How to chain/compose layers in keras 2 functional API without specifying input (or input shape)

Submitted by 烈酒焚心 on 2019-12-24 00:36:43
Question: I would like to be able to chain several layers together before specifying the input, something like the following:

# conv is just a layer, no application
conv = Conv2D(64, (3, 3), activation='relu', padding='same', name='conv')

# this doesn't work:
bn = BatchNormalization()(conv)

Note that I don't want to specify the input nor its shape if it can be avoided; I want to use this as a shared layer for multiple inputs at a later point. Is there a way to do that? The above gives the following error: …
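One known way to get this behavior (a sketch, not necessarily the thread's accepted answer): wrap the layers in a Sequential model with no input shape. A Keras model is itself callable like a layer, so the block can be applied to any compatible tensor later, and its weights are shared across every call site:

from keras.layers import BatchNormalization, Conv2D, Input
from keras.models import Model, Sequential

# the block owns its layers; no input or input shape is given yet
block = Sequential([
    Conv2D(64, (3, 3), activation='relu', padding='same'),
    BatchNormalization(),
])

# later: apply the same block (shared weights) to two different inputs
in_a = Input(shape=(32, 32, 3))
in_b = Input(shape=(32, 32, 3))
model = Model(inputs=[in_a, in_b], outputs=[block(in_a), block(in_b)])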

Specifying a seq2seq autoencoder. What does RepeatVector do? And what is the effect of batch learning on predicting output?

Submitted by 柔情痞子 on 2019-12-23 22:34:14
Question: I am building a basic seq2seq autoencoder, but I'm not sure if I'm doing it correctly.

model = Sequential()

# Encoder
model.add(LSTM(32, activation='relu', input_shape=(timesteps, n_features), return_sequences=True))
model.add(LSTM(16, activation='relu', return_sequences=False))
model.add(RepeatVector(timesteps))

# Decoder
model.add(LSTM(16, activation='relu', return_sequences=True))
model.add(LSTM(32, activation='relu', return_sequences=True))
model.add(TimeDistributed(Dense(n_features)))
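On the RepeatVector part of the title: it tiles the encoder's final code vector once per decoder timestep, turning a (batch, features) tensor into (batch, timesteps, features) so the decoder LSTM receives a sequence. A minimal check:

import numpy as np
from keras.layers import Input, RepeatVector
from keras.models import Model

inp = Input(shape=(16,))      # e.g. the 16-dim code from the encoder above
out = RepeatVector(5)(inp)    # tile it 5 times, one copy per timestep
m = Model(inp, out)
print(m.predict(np.zeros((2, 16))).shape)  # -> (2, 5, 16)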

keras model.get_weights is not returning results in expected dimensions

Submitted by 无人久伴 on 2019-12-23 22:01:07
Question: I am doing classification on the MNIST dataset using Keras. I am interested in doing some operations on the weight matrices generated by training, but some layers' weight matrices look as if those layers are not fully connected.

model = Sequential()
model.add(Dense(1000, input_shape=(train_x.shape[1],), activation='relu'))
model.add(Dense(1000, activation='relu'))
model.add(Dense(500, activation='relu'))
model.add(Dense(200, activation='relu'))
model.add(Dense(10, activation='softmax'))
model.compile(...)
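A likely source of the confusion (an assumption about what the asker is seeing): get_weights() on a Dense layer returns the list [kernel, bias], where the kernel has shape (input_dim, units) and the bias has shape (units,), i.e. transposed relative to a units × input_dim mental model. A quick way to inspect every layer:

# print each Dense layer's kernel and bias shapes after training
for layer in model.layers:
    kernel, bias = layer.get_weights()
    print(layer.name, 'kernel:', kernel.shape, 'bias:', bias.shape)
# e.g. the first layer prints kernel: (784, 1000) bias: (1000,) for MNIST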

ValueError: Error when checking model target: expected dense_4 to have shape (None, 4) but got array with shape (13252, 1)

Submitted by 会有一股神秘感。 on 2019-12-23 18:31:38
Question: Does anyone have any idea why this error is happening? Here is the error:

ValueError: Error when checking model target: expected dense_4 to have shape (None, 4) but got array with shape (13252, 1)

And here is the code:

from keras.models import Sequential
from keras.layers import *

model = Sequential()
model.add(Cropping2D(cropping=((0,0), (50,20)), input_shape=(160, 320, 3)))  # (None, 90, 320, 3)
model.add(Lambda(lambda x: x/127.5 - 1.))
model.add(Convolution2D(32, 3, 3))  # (None, 88, 318, …
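The mismatch itself is generic: the final dense_4 layer has 4 units, so Keras expects targets of shape (N, 4), while the labels supplied are integers of shape (13252, 1). A common fix, sketched here under the assumption of 4 mutually exclusive classes, is to one-hot encode the labels, or to keep integer labels and switch to the sparse loss:

from keras.utils import to_categorical

# integer class labels of shape (13252, 1) -> one-hot of shape (13252, 4)
y_onehot = to_categorical(y, num_classes=4)  # y is the asker's label array, assumed
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.fit(X, y_onehot)  # X is the image array, assumed

# alternative: keep the integer labels as-is
# model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')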

Copying weights from one Conv2D layer to another

Submitted by 只愿长相守 on 2019-12-23 12:23:55
Question: Context: I have trained a model on MNIST using Keras. My goal is to print images after the first layer, the first layer being a Conv2D layer. To go about this I'm creating a new model with a single Conv2D layer, into which I'll copy the weights from the trained network.

# Visualization for image after first convolution
model_temp = Sequential()
model_temp.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
trained_weights = model.layers[0].get_weights()[0]
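A minimal sketch of completing the copy, assuming model is the trained network and its first layer is the Conv2D: set_weights expects the full [kernel, bias] list, so passing only get_weights()[0] would raise an error:

import numpy as np

# copy both kernel and bias from the trained first layer in one go
model_temp.layers[0].set_weights(model.layers[0].get_weights())

# sanity check: the kernels now match
assert np.array_equal(model_temp.layers[0].get_weights()[0],
                      model.layers[0].get_weights()[0])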

How can I implement KL-divergence regularization for Keras?

Submitted by 眉间皱痕 on 2019-12-23 03:00:31
Question: This is a follow-up to the question Keras backend mean function: " 'float' object has no attribute 'dtype' "? I am trying to make a new regularizer for Keras. Here is my code:

import keras
from keras import initializers
from keras.models import Model, Sequential
from keras.layers import Input, Dense, Activation
from keras import regularizers
from keras import optimizers
from keras import backend as K

kullback_leibler_divergence = keras.losses.kullback_leibler_divergence

def kl
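For context, a hedged sketch of one standard way to phrase a KL penalty as a Keras regularizer: the sparse-autoencoder form, where rho is the target mean activation. rho and beta are illustrative assumptions, not values from the thread, and the layer's activations must lie in (0, 1), e.g. via a sigmoid:

from keras import backend as K
from keras.layers import Dense

def kl_regularizer(rho=0.05, beta=0.01):
    # penalize mean activations rho_hat for drifting away from the target rho
    def kl(activations):
        rho_hat = K.mean(activations, axis=0)
        return beta * K.sum(rho * K.log(rho / rho_hat)
                            + (1 - rho) * K.log((1 - rho) / (1 - rho_hat)))
    return kl

# usage: attach as an activity regularizer on a sigmoid layer
layer = Dense(64, activation='sigmoid', activity_regularizer=kl_regularizer())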

Understanding output of Dense layer for higher dimension

Submitted by 你。 on 2019-12-23 02:52:57
Question: I have no problem understanding the output shape of a Dense layer when it is preceded by a Flatten layer; the output shape matches my understanding, i.e. (batch size, units).

nn = keras.Sequential()
nn.add(keras.layers.Conv2D(8, kernel_size=(2,2), input_shape=(4,5,1)))
nn.add(keras.layers.Conv2D(1, kernel_size=(2,2)))
nn.add(keras.layers.Flatten())
nn.add(keras.layers.Dense(5))
nn.add(keras.layers.Dense(1))
nn.summary()

The output is: …
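The higher-dimension case the title asks about can be seen by dropping the Flatten: Dense applied to an N-D tensor acts on the last axis only and leaves every other axis untouched. A minimal check with the same toy shapes:

import keras

nn2 = keras.Sequential()
nn2.add(keras.layers.Conv2D(8, kernel_size=(2, 2), input_shape=(4, 5, 1)))  # -> (None, 3, 4, 8)
nn2.add(keras.layers.Dense(5))  # acts on the last axis only -> (None, 3, 4, 5)
nn2.summary()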

Mixing numerical and categorical data into keras sequential model with Dense layers

Submitted by 瘦欲@ on 2019-12-23 02:03:32
Question: I have a training set in a Pandas dataframe, and I pass this data frame into model.fit() as df.values. Here is some information about the df:

df.values.shape  # (981, 5)

df.values[0]
# array([163, 0.6, 83, 0.52,
#        array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, …
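The printout shows the fifth column holding an entire one-hot array inside each row, which makes df.values a ragged object array rather than the 2-D float matrix Keras expects. A hedged sketch of one common fix, assuming the embedded array really is the fifth column as the printout suggests: expand it into its own columns before calling fit:

import numpy as np

# split the object rows into the 4 scalar features and the embedded one-hot vector
numeric = np.asarray([row[:4] for row in df.values], dtype='float32')
onehot = np.stack([row[4] for row in df.values]).astype('float32')

# concatenate into a single numeric 2-D matrix
X = np.hstack([numeric, onehot])
model.fit(X, y, epochs=10)  # y is the label array, assumed to exist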