keras-layer

Model with BatchNormalization: stagnant test loss

被刻印的时光 ゝ Submitted on 2019-12-23 01:44:10
Question: I wrote a neural network using Keras. It contains BatchNormalization layers. When I trained it with model.fit, everything was fine. When training it with TensorFlow as explained here, the training is fine, but the validation step always gives very poor performance, and it quickly saturates (the accuracy goes 5%, 10%, 40%, 40%, 40%...; the loss is stagnant too). I need to use TensorFlow because it allows more flexibility regarding the monitoring part of training. I strongly suspect it has
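A likely culprit in this setup is that BatchNormalization behaves differently at train and test time, and a hand-written TensorFlow loop neither feeds Keras' learning phase nor runs BN's moving-average update ops. Below is a minimal sketch of how that is typically handled; the model, shapes, and optimizer are placeholders, not the question's actual code.

```python
import tensorflow as tf
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense, BatchNormalization

# Placeholder model standing in for the question's network.
model = Sequential([
    Dense(64, activation='relu', input_shape=(20,)),
    BatchNormalization(),
    Dense(10, activation='softmax'),
])

x = model.inputs[0]
y_true = tf.placeholder(tf.float32, [None, 10])
loss = tf.reduce_mean(K.categorical_crossentropy(y_true, model.outputs[0]))

# BatchNormalization registers its moving-mean/variance updates in
# model.updates; a raw TF loop must run them alongside the train op,
# otherwise the inference-time statistics never move and validation stalls.
with tf.control_dependencies(model.updates):
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

sess = K.get_session()
sess.run(tf.global_variables_initializer())

# Train with learning_phase=1 (batch statistics), validate with 0
# (accumulated moving averages):
# sess.run(train_op, {x: xb, y_true: yb, K.learning_phase(): 1})
# sess.run(loss,     {x: xv, y_true: yv, K.learning_phase(): 0})
```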

TypeError: __init__() got an unexpected keyword argument 'trainable'

旧时模样 Submitted on 2019-12-22 18:32:10
Question: I am trying to load an RNN model architecture trained in Keras using keras.models.model_from_json, and I am getting the mentioned error:

```python
with open('model_architecture.json', 'r') as f:
    model = model_from_json(f.read(), custom_objects={'AttLayer': AttLayer})

# Load weights into the new model
model.load_weights('model_weights.h5')
```

Here is the custom layer I am using:

```python
class AttLayer(Layer):
    def __init__(self, attention_dim):
        self.init = initializers.get('normal')
        self.supports_masking = True
        self
```
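The usual cause of this error is that model_from_json reconstructs each layer from its saved config, which includes standard keys such as 'trainable' and 'name'; a custom __init__ that does not accept **kwargs chokes on them. A hedged sketch of the fix, with the build/call bodies omitted as in the excerpt above:

```python
from keras import initializers
from keras.engine.topology import Layer

class AttLayer(Layer):
    """Attention layer whose __init__ forwards extra kwargs to Layer.

    model_from_json passes deserialized config entries such as
    'trainable' and 'name' to __init__; accepting **kwargs and handing
    them to the base class avoids the TypeError.
    """
    def __init__(self, attention_dim, **kwargs):
        self.attention_dim = attention_dim
        self.init = initializers.get('normal')
        self.supports_masking = True
        super(AttLayer, self).__init__(**kwargs)

    def get_config(self):
        # Persist the custom argument so the layer round-trips
        # through JSON serialization.
        config = super(AttLayer, self).get_config()
        config['attention_dim'] = self.attention_dim
        return config
```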

ValueError: Tensor:(…) is not an element of this graph

独自空忆成欢 Submitted on 2019-12-22 14:16:36
Question: I am using Keras' pre-trained model, and the error came up when trying to get predictions. I have the following code in a Flask server:

```python
from NeuralNetwork import *

@app.route("/uploadMultipleImages", methods=["POST"])
def uploadMultipleImages():
    uploaded_files = request.files.getlist("file[]")
    getPredictionfunction = preTrainedModel["VGG16"]
    for file in uploaded_files:
        path = os.path.join(STATIC_PATH, file.filename)
        result = getPredictionfunction(path)
```

This is what I have in my NeuralNetwork.py
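A common explanation for this error in Flask apps is that the model is loaded in one thread (and thus one TensorFlow graph) while predict is called from a request handler running in another. One widely used workaround is to capture the default graph at load time and re-enter it per request. A minimal sketch, assuming a VGG16 model and a hypothetical load_and_preprocess helper:

```python
import tensorflow as tf
from keras.applications.vgg16 import VGG16

# Load the model once at import time and remember the graph it lives in.
model = VGG16(weights='imagenet')
graph = tf.get_default_graph()

def getPrediction(path):
    # Flask serves each request on a different thread; re-enter the
    # graph the model was created in before calling predict.
    img = load_and_preprocess(path)  # hypothetical preprocessing helper
    with graph.as_default():
        return model.predict(img)
```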

How to wrap a TensorFlow object as a Keras layer?

我是研究僧i Submitted on 2019-12-22 14:01:32
Question: I would like to implement Hierarchical Multiscale LSTM as a Keras layer. It was published here and implemented in TensorFlow here. My understanding is that there is a way to wrap such a TensorFlow object in Keras as a layer. I am not sure how complicated it is, but I think it is feasible. Can you help me with how to do it?

Answer 1: This is usually done by implementing a custom Layer. To be more specific, you should inherit from keras.engine.topology.Layer and provide a custom implementation for the
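A minimal sketch of the custom-layer contract the answer refers to, using a trivial dense projection as a stand-in for the HM-LSTM graph code; everything specific to the paper would live inside call():

```python
import tensorflow as tf
from keras.engine.topology import Layer

class TFOpLayer(Layer):
    """Toy layer wrapping arbitrary TensorFlow ops to illustrate the
    build/call/compute_output_shape contract; a real HM-LSTM wrapper
    would put its TF graph construction in call().
    """
    def __init__(self, units, **kwargs):
        self.units = units
        super(TFOpLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create trainable weights here so Keras tracks them.
        self.kernel = self.add_weight(name='kernel',
                                      shape=(int(input_shape[-1]), self.units),
                                      initializer='glorot_uniform',
                                      trainable=True)
        super(TFOpLayer, self).build(input_shape)

    def call(self, inputs):
        # Any TensorFlow code can go here.
        return tf.matmul(inputs, self.kernel)

    def compute_output_shape(self, input_shape):
        return input_shape[:-1] + (self.units,)
```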

Copying weights of a specific layer - Keras

☆樱花仙子☆ Submitted on 2019-12-22 10:39:20
Question: According to this, the following copies weights from one model to another:

```python
target_model.set_weights(model.get_weights())
```

What about copying the weights of a specific layer, would this work?

```python
model_1.layers[0].set_weights(source_model.layers[0].get_weights())
model_2.layers[0].set_weights(source_model.layers[0].get_weights())
```

If I train model_1 and model_2, will they have separate weights? The documentation doesn't state whether get_weights makes a deep copy or not. If this doesn't work,
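For what it's worth, get_weights() returns numpy arrays holding copies of the variable values, and set_weights() copies those values into each model's own variables, so the two target models end up independent. A small sketch, assuming source_model, model_1, and model_2 already exist with matching layer-0 shapes:

```python
import numpy as np

# get_weights() returns numpy copies; set_weights() writes the values
# into each model's own variables, so model_1 and model_2 do not share
# storage afterwards and will diverge under separate training.
w = source_model.layers[0].get_weights()
model_1.layers[0].set_weights(w)
model_2.layers[0].set_weights(w)

# Quick check: perturb model_1's copy and confirm model_2 is untouched.
before = model_2.layers[0].get_weights()[0].copy()
w1 = model_1.layers[0].get_weights()
w1[0] += 1.0
model_1.layers[0].set_weights(w1)
assert np.allclose(model_2.layers[0].get_weights()[0], before)
```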

How to merge multiple sequential models in Keras Python?

烈酒焚心 Submitted on 2019-12-22 07:21:20
Question: I'm building a model with multiple sequential models that I need to merge before training the dataset. It seems keras.engine.topology.Merge isn't supported in Keras 2.0 anymore. I tried keras.layers.Add and keras.layers.Concatenate and they don't work either. Here's my code:

```python
model = Sequential()
model1 = Sequential()
model1.add(Embedding(len(word_index) + 1, 300,
                     weights=[embedding_matrix],
                     input_length=40,
                     trainable=False))
model1.add(TimeDistributed(Dense(300, activation='relu')))
```
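In Keras 2 the old Merge layer is gone, and its replacement is the functional API rather than stacking merge layers onto a Sequential. A hedged sketch of combining two branches with concatenate; word_index and embedding_matrix are assumed to exist as in the question, and the rest of each branch is a guess:

```python
import keras.backend as K
from keras.layers import (Input, Embedding, TimeDistributed, Dense,
                          Lambda, concatenate)
from keras.models import Model

input1 = Input(shape=(40,))
input2 = Input(shape=(40,))

# word_index and embedding_matrix as defined in the question.
embed = Embedding(len(word_index) + 1, 300,
                  weights=[embedding_matrix],
                  input_length=40,
                  trainable=False)

def branch(x):
    # One "sequential model" rewritten as a functional branch.
    x = embed(x)
    x = TimeDistributed(Dense(300, activation='relu'))(x)
    return Lambda(lambda t: K.sum(t, axis=1))(x)

# concatenate() is the Keras 2 replacement for the removed Merge layer.
merged = concatenate([branch(input1), branch(input2)])
output = Dense(1, activation='sigmoid')(merged)
model = Model(inputs=[input1, input2], outputs=output)
```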

Skipping a layer in backpropagation in Keras

别来无恙 Submitted on 2019-12-21 16:33:46
Question: I am using Keras with the TensorFlow backend, and I am curious whether it is possible to skip a layer during backpropagation but have it execute in the forward pass. Here is what I mean:

```python
Lambda(lambda x: a(x))
```

I want to apply a to x in the forward pass, but I do not want a to be included in the derivation when the backprop takes place. I was trying to find a solution but I could not find anything. Can somebody help me out here?

Answer 1: UPDATE 2: In addition to tf.py_func, there is now an official
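One standard trick for this, independent of the truncated answer, is the straight-through-style identity: compute a(x) in the forward pass, but route gradients around it with stop_gradient. A minimal sketch:

```python
import keras.backend as K
from keras.layers import Lambda

def forward_only(a):
    """Apply `a` in the forward pass while backprop sees the identity.

    x + stop_gradient(a(x) - x) evaluates to a(x) forward, but the
    stop_gradient term contributes no gradient, so d(out)/dx == 1 and
    `a` is effectively skipped during backpropagation.
    """
    return Lambda(lambda x: x + K.stop_gradient(a(x) - x))

# Usage: layer = forward_only(my_op); y = layer(x)
```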

How to use Keras ImageDataGenerator with Siamese or Triplet networks

丶灬走出姿态 Submitted on 2019-12-21 04:32:03
Question: I'm trying to build both a Siamese neural network and a triplet neural network on a large custom dataset. Keras has ImageDataGenerator, which makes the generation of input data for a regular neural network very easy. I'm interested in using ImageDataGenerator or similar ways to train networks with 2 (Siamese) and 3 (triplet) inputs. In the Keras MNIST Siamese example, the input is generated by a pre-processing stage done by the create_pairs method. I don't think this kind of way fits for a
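One common approach is to wrap two (or three) flow_from_directory iterators, synchronized by a shared seed, in a Python generator that yields multi-input batches. A rough sketch for the Siamese case; the directory layout, target size, and label rule are placeholder assumptions:

```python
from keras.preprocessing.image import ImageDataGenerator

datagen = ImageDataGenerator(rescale=1. / 255)

def pair_generator(dir_a, dir_b, batch_size=32):
    # The shared seed keeps the two iterators shuffling in lockstep.
    gen_a = datagen.flow_from_directory(dir_a, target_size=(105, 105),
                                        batch_size=batch_size, seed=7)
    gen_b = datagen.flow_from_directory(dir_b, target_size=(105, 105),
                                        batch_size=batch_size, seed=7)
    while True:
        (xa, ya), (xb, yb) = next(gen_a), next(gen_b)
        # Placeholder label rule: 1 when both images share a class.
        same = (ya.argmax(axis=1) == yb.argmax(axis=1)).astype('float32')
        yield [xa, xb], same

# model.fit_generator(pair_generator('pairs/left', 'pairs/right'),
#                     steps_per_epoch=...)
```

A triplet version would follow the same pattern with a third iterator and a three-element input list.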

How do I get the weights of a layer in Keras?

孤街浪徒 Submitted on 2019-12-20 12:07:11
Question: I am using Windows 10, Python 3.5, and TensorFlow 1.1.0. I have the following script:

```python
import tensorflow as tf
import tensorflow.contrib.keras.api.keras.backend as K
from tensorflow.contrib.keras.api.keras.layers import Dense

tf.reset_default_graph()
init = tf.global_variables_initializer()
sess = tf.Session()
K.set_session(sess)  # Keras will use this session to initialize all variables

input_x = tf.placeholder(tf.float32, [None, 10], name='input_x')
dense1 = Dense(10, activation='relu')
```
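Once the layer has been called on an input (which is when its variables are created), its weights can be read either with the layer's get_weights() method or by evaluating the underlying variables through the backend. A sketch continuing the script above; note that the initializer op must be created after the layer has built its variables, which the original ordering gets wrong:

```python
import tensorflow as tf
import tensorflow.contrib.keras.api.keras.backend as K
from tensorflow.contrib.keras.api.keras.layers import Dense

tf.reset_default_graph()
sess = tf.Session()
K.set_session(sess)

input_x = tf.placeholder(tf.float32, [None, 10], name='input_x')
dense1 = Dense(10, activation='relu')
output = dense1(input_x)            # variables are created on this call

sess.run(tf.global_variables_initializer())  # after the variables exist

# Two equivalent reads: the layer's convenience method, or evaluating
# the underlying tf variable through the Keras session.
kernel, bias = dense1.get_weights()
kernel_too = K.get_value(dense1.kernel)
```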