keras-layer

Copying weights of a specific layer - keras

Submitted by 主宰稳场 on 2019-12-06 02:24:47
According to this, the following copies the weights from one model to another: target_model.set_weights(model.get_weights()). What about copying the weights of a specific layer? Would this work? model_1.layers[0].set_weights(source_model.layers[0].get_weights()) model_2.layers[0].set_weights(source_model.layers[0].get_weights()) If I train model_1 and model_2, will they have separate weights? The documentation doesn't state whether get_weights makes a deep copy or not. If this doesn't work, how can it be achieved? Of course, it would be a copy of the weights. It does not make sense the…
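In Keras, get_weights() returns NumPy arrays holding the current values of the layer's variables, and set_weights() copies those values into the target layer's own variables, so the two models above would indeed end up with independent weights. A minimal pure-Python sketch of that mechanism, with a toy class standing in for a real Keras layer (ToyLayer and its storage are illustrative assumptions, not Keras API):

```python
import numpy as np

class ToyLayer:
    """Stands in for a Keras layer: owns its own weight array."""
    def __init__(self, weights):
        self._w = np.array(weights, dtype=float)

    def get_weights(self):
        # Like Keras, return copies of the current values.
        return [self._w.copy()]

    def set_weights(self, weights):
        # Like Keras, copy the values into this layer's own storage.
        self._w = np.array(weights[0], dtype=float)

source = ToyLayer([1.0, 2.0, 3.0])
model_1_layer = ToyLayer([0.0, 0.0, 0.0])
model_2_layer = ToyLayer([0.0, 0.0, 0.0])

model_1_layer.set_weights(source.get_weights())
model_2_layer.set_weights(source.get_weights())

# "Training" model_1 (here: just mutating its weights) does not
# touch model_2, because each layer owns its own array.
model_1_layer._w += 10.0

print(model_1_layer.get_weights()[0])  # [11. 12. 13.]
print(model_2_layer.get_weights()[0])  # [1. 2. 3.]
```

The key point is that no backend variable is ever shared between the two models; only values cross over.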

Cannot add layers to saved Keras Model. 'Model' object has no attribute 'add'

Submitted by 时光怂恿深爱的人放手 on 2019-12-06 01:15:20
Question: I saved a model using model.save(). I'm trying to reload the model, add a few layers, and tune some hyper-parameters; however, it throws an AttributeError. The model is loaded using load_model(). I think I'm misunderstanding how to add layers to a saved model. If someone can guide me here, it would be great. I'm a novice to deep learning and Keras, so my request may be silly. Snippet: prev_model = load_model('final_model.h5') # loading the previously saved model…
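The error happens because load_model() can return a functional Model, which is a fixed graph with no .add() method; only Sequential has one. The usual workaround is to treat the loaded model as a single callable layer inside a new model, e.g. new_model = Sequential(); new_model.add(prev_model); new_model.add(Dense(...)). A pure-Python sketch of that idea (the toy classes below are illustrative stand-ins, not real Keras classes):

```python
class FunctionalModel:
    """Stands in for a loaded Keras Model: callable, but no .add()."""
    def __call__(self, x):
        return x * 2

class SequentialModel:
    """Stands in for keras.Sequential: an ordered list of callables."""
    def __init__(self):
        self.layers = []

    def add(self, layer):
        self.layers.append(layer)

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

prev_model = FunctionalModel()
new_model = SequentialModel()
new_model.add(prev_model)       # reuse the whole loaded model as one layer
new_model.add(lambda x: x + 1)  # the "new layer" stacked on top
print(new_model(3))  # (3 * 2) + 1 = 7
```

Calling .add() on prev_model itself would fail for the same reason as in the question: the class simply does not define it.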

ValueError: Tensor Tensor(…) is not an element of this graph. When using global variable keras model

Submitted by 二次信任 on 2019-12-05 18:45:59
I'm running a web server using Flask, and the error comes up when I try to use vgg16, which is the global variable for Keras' pre-trained VGG16 model. I have no idea why this error arises, or whether it has anything to do with the TensorFlow backend. Here is my code: vgg16 = VGG16(weights='imagenet', include_top=True) def getVGG16Prediction(img_path): global vgg16 img = image.load_img(img_path, target_size=(224, 224)) x = image.img_to_array(img) x = np.expand_dims(x, axis=0) x = preprocess_input(x) pred = vgg16.predict(x) return x, sort(decode_predictions(pred, top=3)[0]) @app.route("…
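The "not an element of this graph" error is a TF1-era symptom: Flask handles each request in its own thread, where the model's graph is no longer the default. The commonly cited remedy is to capture the graph right after creating the model (graph = tf.get_default_graph()) and wrap each predict call in with graph.as_default():. That fix can't be demonstrated without TensorFlow, but one runnable detail of the snippet above can: the np.expand_dims call, which adds the batch dimension VGG16 expects.

```python
import numpy as np

# Stand-in for image.img_to_array(img) on a 224x224 RGB image.
img_array = np.zeros((224, 224, 3))

# predict() expects a batch, so a leading batch axis is added,
# exactly as in the questioner's code.
batch = np.expand_dims(img_array, axis=0)
print(batch.shape)  # (1, 224, 224, 3)
```

Without this step, predict() would reject the 3D array as having the wrong rank.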

Keras: Lambda layer function with multiple parameters

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-05 17:42:56
Question: I am trying to write a Lambda layer in Keras which calls a function connection that runs a loop for i in range(0, k), where k is fed in as an input to the function: connection(x, k). When I try to call the function in the functional API, I tried: k = 5 y = Lambda(connection)(x) and also: y = Lambda(connection)(x, k) but neither approach worked. How can I feed in the value of k without making it a global parameter? Answer 1: Just use y = Lambda(connection)((x, k)) and then var…
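Besides the tuple trick in the answer, Keras' Lambda layer also accepts an arguments dict of extra keyword arguments that are passed to the function on every call, i.e. Lambda(connection, arguments={'k': 5})(x), which avoids globals entirely. A pure-Python sketch of that mechanism, with ToyLambda standing in for keras.layers.Lambda (the class and the toy connection below are illustrative assumptions):

```python
def connection(x, k):
    # Toy version of the questioner's function: a loop over range(0, k).
    total = x
    for i in range(0, k):
        total = total + i
    return total

class ToyLambda:
    """Mimics Lambda(fn, arguments=...): forwards extra kwargs to fn."""
    def __init__(self, fn, arguments=None):
        self.fn = fn
        self.arguments = arguments or {}

    def __call__(self, x):
        return self.fn(x, **self.arguments)

y = ToyLambda(connection, arguments={'k': 5})(10)
print(y)  # 10 + (0+1+2+3+4) = 20
```

In real Keras the same call shape applies, except x is a tensor rather than a plain number.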

Add a resizing layer to a keras sequential model

Submitted by 我只是一个虾纸丫 on 2019-12-05 16:31:34
Question: How can I add a resizing layer to model = Sequential() using model.add(...), to resize an image from shape (160, 320, 3) to (224, 224, 3)? Answer 1: Normally you would use the Reshape layer for this: model.add(Reshape((224,224,3), input_shape=(160,320,3))) but since your target dimensions can't hold all the data from the input dimensions (224*224 != 160*320), this won't work. You can only use Reshape if the number of elements does not change. If you are fine with losing some data in your…
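The element-count constraint in the answer, plus what an actual resize would have to do instead, can be shown with plain NumPy. The nearest-neighbour indexing below is only a sketch of what a real resizing layer (e.g. a Lambda wrapping an image-resize op) performs; it is not the Keras implementation:

```python
import numpy as np

# Reshape fails here because the element counts differ:
assert 160 * 320 != 224 * 224

# Nearest-neighbour resize via integer index arithmetic: each output
# row/column picks the closest source row/column.
img = np.arange(160 * 320 * 3).reshape(160, 320, 3).astype(float)
rows = np.arange(224) * 160 // 224   # source row for each output row
cols = np.arange(224) * 320 // 224   # source column for each output column
resized = img[rows][:, cols]
print(resized.shape)  # (224, 224, 3)
```

Unlike Reshape, this genuinely resamples the image: some source pixels are dropped along the width and others duplicated along the height.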

Multi-Output Multi-Class Keras Model

Submitted by 和自甴很熟 on 2019-12-05 13:49:02
For each input I have, there is an associated 49x2 matrix. Here's what one input-output pair looks like: input: [Car1, Car2, Car3, ..., Car118] output: [[Label1 Label2] [Label1 Label2] ... [Label1 Label2]] Both Label1 and Label2 are label-encoded, with 1200 and 1300 different classes respectively. Just to make sure: is this what we call a multi-output multi-class problem? I tried to flatten the output, but I feared the model wouldn't understand that all similar labels share the same classes. Is there a Keras layer that handles output of this peculiar array shape? Generally, multi-class…
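One common way to phrase this in Keras is as a two-output functional model: one softmax head over 1200 classes and one over 1300, each applied at all 49 positions, with the 49x2 target split column-wise into two separate label arrays. A NumPy sketch of that target split, using random toy data in place of the real labels:

```python
import numpy as np

# 10 toy samples, each with a 49x2 label matrix (values are dummies).
y = np.random.randint(0, 1200, size=(10, 49, 2))

y_label1 = y[:, :, 0]   # targets for head 1 (1200 classes)
y_label2 = y[:, :, 1]   # targets for head 2 (1300 classes)
print(y_label1.shape, y_label2.shape)  # (10, 49) (10, 49)
```

With the targets split this way, each head keeps its own class space, which addresses the worry that flattening would mix the two label vocabularies.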

Keras Custom Layer 2D input -> 2D output

Submitted by 心不动则不痛 on 2019-12-05 13:31:22
I have a 2D input (or 3D if one considers the number of samples) and I want to apply a Keras layer that takes this input and outputs another 2D matrix. So, for example, if I have an input of size (ExV), the learning weight matrix would be (SxE) and the output (SxV). Can I do this with a Dense layer? EDIT (Nassim's request): The first layer is doing nothing; it's just there to give an input to the Lambda layer: from keras.models import Sequential from keras.layers.core import Reshape,Lambda from keras import backend as K from keras.models import Model input_sample = [ [[1,2,3,4,5],[6,7,8,9,10],[11,12…
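The requested mapping is a single matrix multiply: with input X of shape (E, V) and a learned W of shape (S, E), the output W @ X has shape (S, V). A Keras Dense layer applies its kernel to the last axis, so one would either transpose the input (Dense on shape (V, E) yields (V, S)) or wrap the matmul in a Lambda. The shape algebra itself, checked in NumPy (E, V, S below are arbitrary toy sizes):

```python
import numpy as np

E, V, S = 4, 5, 3           # toy dimensions for illustration
X = np.random.rand(E, V)    # the 2D input
W = np.random.rand(S, E)    # the learned weight matrix
out = W @ X                 # (S, E) @ (E, V) -> (S, V)
print(out.shape)  # (3, 5)
```

Whichever Keras construction is used, it must reduce to exactly this product for the stated shapes to come out right.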

How to merge two LSTM layers in Keras

Submitted by 廉价感情. on 2019-12-05 10:57:29
I'm working with Keras on a sentence-similarity task (using the STS dataset) and am having problems merging the layers. The data consists of 1184 sentence pairs, each scored between 0 and 5. Below are the shapes of my numpy arrays. I've padded each of the sentences to 50 words and run them through an embedding layer, using the GloVe embeddings with 100 dimensions. When merging the two networks I'm getting an error: Exception: Error when checking model input: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 1 arrays but instead…
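This exception usually means the model was built with one Input but fed a list of two arrays (or the reverse): a two-branch similarity model needs two Input layers and fit([left_sentences, right_sentences], scores). A pure-Python sketch of the bookkeeping Keras is enforcing (check_inputs is a toy stand-in for Keras' internal input validation, not a real API):

```python
import numpy as np

n_pairs, seq_len = 1184, 50
left = np.zeros((n_pairs, seq_len))    # padded left sentences
right = np.zeros((n_pairs, seq_len))   # padded right sentences
scores = np.zeros((n_pairs,))          # similarity scores, 0..5

def check_inputs(model_n_inputs, data):
    """Mimics the check that raised the exception above."""
    if len(data) != model_n_inputs:
        raise ValueError(
            f"Expected to see {model_n_inputs} arrays but got {len(data)}")
    return True

print(check_inputs(2, [left, right]))  # True: two branches, two arrays
```

Feeding [left, right] to a model declared with a single Input trips exactly this check, which matches the truncated error message in the question.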

Average weights in keras models

Submitted by China☆狼群 on 2019-12-05 07:56:48
How can I average the weights of Keras models when I train several models with the same architecture but different initialisations? Currently my code looks something like this: datagen = ImageDataGenerator(rotation_range=15, width_shift_range=2.0/28, height_shift_range=2.0/28) epochs = 40 lr = 1.234e-3 optimizer = Adam(lr=lr) main_input = Input(shape=(28,28,1), name='main_input') sub_models = [] for i in range(5): x = Conv2D(32, kernel_size=(3,3), strides=1)(main_input) x = BatchNormalization()(x) x = Activation('relu')(x) x = MaxPool2D(pool_size=2)(x) x = Conv2D(64, kernel_size=(3,3), strides=1)(x) x =…
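The usual recipe is: collect get_weights() from each trained model, average the lists elementwise, and call set_weights() with the result on a fresh model of the same architecture. A sketch of the averaging step, with hand-built NumPy arrays standing in for each model's weight list (real code would use [m.get_weights() for m in models]):

```python
import numpy as np

# Three "models", each with a kernel and a bias (toy values).
model_weights = [
    [np.full((2, 2), 1.0), np.full((2,), 2.0)],   # model 1
    [np.full((2, 2), 3.0), np.full((2,), 4.0)],   # model 2
    [np.full((2, 2), 5.0), np.full((2,), 6.0)],   # model 3
]

# zip(*...) pairs up corresponding arrays across models, so each
# weight tensor is averaged with its counterparts.
avg_weights = [np.mean(ws, axis=0) for ws in zip(*model_weights)]

# In real Keras: avg_model.set_weights(avg_weights)
print(avg_weights[0][0][0], avg_weights[1][0])  # 3.0 4.0
```

Note that averaging only makes sense when the architectures match exactly, so every position in the weight list lines up across models.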

How to merge keras sequential models with same input?

Submitted by ぐ巨炮叔叔 on 2019-12-05 05:16:23
I am trying to create my first ensemble model in Keras. I have 3 input values and a single output value in my dataset. from keras.optimizers import SGD,Adam from keras.layers import Dense,Merge from keras.models import Sequential model1 = Sequential() model1.add(Dense(3, input_dim=3, activation='relu')) model1.add(Dense(2, activation='relu')) model1.add(Dense(2, activation='tanh')) model1.compile(loss='mse', optimizer='Adam', metrics=['accuracy']) model2 = Sequential() model2.add(Dense(3, input_dim=3, activation='linear')) model2.add(Dense(4, activation='tanh')) model2.add(Dense(3, activation…
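The Merge layer imported above was removed from later Keras versions; the standard pattern now is the functional API: run the shared input through each branch and concatenate the branch outputs (keras.layers.concatenate) before the final head. The shape arithmetic of that merge, with NumPy arrays standing in for the two branch outputs (batch size and layer widths below are toy choices):

```python
import numpy as np

batch = 8
branch1_out = np.zeros((batch, 2))   # model1 ends in a 2-unit Dense
branch2_out = np.zeros((batch, 3))   # model2 ends in a 3-unit Dense

# Concatenating along the feature (last) axis, as concatenate() does.
merged = np.concatenate([branch1_out, branch2_out], axis=-1)
print(merged.shape)  # (8, 5): 2 + 3 features per sample
```

The merged (batch, 5) tensor would then feed the shared output layer, so only the combined model needs compiling, not each branch.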