keras-layer

How to set the input of a Keras layer with a TensorFlow tensor?

旧城冷巷雨未停 submitted on 2019-11-27 15:55:13
Question: In my previous question, I used Keras' Layer.set_input() to connect my TensorFlow pre-processing output tensor to my Keras model's input. However, this method was removed after Keras version 1.1.1. How can I achieve this in newer Keras versions? Example:

```python
# Tensorflow pre-processing
raw_input = tf.placeholder(tf.string)
### some TF operations on raw_input ###
tf_embedding_input = ...  # pre-processing output tensor

# Keras model
model = Sequential()
e = Embedding(max_features, 128, input
```
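A minimal sketch of the replacement pattern (names such as tf_embedding_input and max_features come from the question and are assumed to be defined elsewhere): newer Keras versions accept an existing TensorFlow tensor via Input(tensor=...), after which the model is built with the functional API.

```python
from keras.layers import Input, Embedding
from keras.models import Model

# Wrap the TensorFlow pre-processing output as a Keras input tensor.
inp = Input(tensor=tf_embedding_input)

# Continue with the same Embedding layer the question starts to define.
emb = Embedding(max_features, 128)(inp)
model = Model(inputs=inp, outputs=emb)
```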

Custom connections between layers Keras

半世苍凉 submitted on 2019-11-27 15:19:31
Question: I would like to manually define the connections between layers in a neural network using Keras with Python. By default, connections exist between all pairs of neurons. I need to make connections as in the picture below. How can this be done in Keras?

Answer 1: You can use the functional API model and separate four distinct groups:

```python
from keras.models import Model
from keras.layers import Dense, Input, Concatenate, Lambda

inputTensor = Input((8,))
```

First, we can use lambda layers to split this input in four: group1 =
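A hedged sketch of where the truncated answer is heading (the group sizes of two are an illustrative assumption; the real split depends on the asker's picture): slice the input with Lambda layers, give each group its own Dense layer so no connections cross groups, then concatenate.

```python
from keras.models import Model
from keras.layers import Dense, Input, Concatenate, Lambda

inputTensor = Input((8,))

# Slice the 8-unit input into four groups of two.
group1 = Lambda(lambda x: x[:, 0:2])(inputTensor)
group2 = Lambda(lambda x: x[:, 2:4])(inputTensor)
group3 = Lambda(lambda x: x[:, 4:6])(inputTensor)
group4 = Lambda(lambda x: x[:, 6:8])(inputTensor)

# One independent Dense unit per group: no cross-group connections.
outputs = [Dense(1)(g) for g in (group1, group2, group3, group4)]
outputTensor = Concatenate()(outputs)

model = Model(inputTensor, outputTensor)
```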

How to use a lambda layer in Keras?

限于喜欢 submitted on 2019-11-27 15:16:32
Question: I want to define a lambda layer to combine features with a cross product, then merge those models, just like the figure. What should I do? Testing model_1 first: get 128 dimensions from a Dense layer, use pywt to get two 64-dimension features (cA, cD), then return cA*cD. (Of course I want to combine the two models, but I'm trying model_1 first.)

```python
from keras.models import Sequential, Model
from keras.layers import Input, Convolution2D, MaxPooling2D
from keras.layers.core import Dense, Dropout, Activation, Flatten, Lambda
import pywt

def
```
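One caveat worth noting, with a minimal sketch under that assumption: a Lambda layer runs on symbolic tensors, so pywt (a NumPy library) cannot be called inside it directly; the wavelet transform would have to happen in pre-processing or via backend ops. As a stand-in for cA*cD, this splits the 128-dimension Dense output into two 64-dimension halves and returns their element-wise product.

```python
from keras.models import Model
from keras.layers import Input, Dense, Lambda

inp = Input((256,))                        # illustrative input size
feat = Dense(128, activation='relu')(inp)  # the 128 dimensions from Dense

def cross_product(x):
    cA, cD = x[:, :64], x[:, 64:]          # stand-ins for the pywt outputs
    return cA * cD                         # element-wise product, 64 dims

out = Lambda(cross_product, output_shape=(64,))(feat)
model_1 = Model(inp, out)
```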

Keras verbose training progress bar writing a new line on each batch

≯℡__Kan透↙ submitted on 2019-11-27 03:13:06
Question: I'm running a dense feed-forward neural net in Keras. There are class_weights for two outputs, and sample_weights for a third output. For some reason it prints the verbose progress display on a new line for each batch calculated, instead of updating the print on the same line as it's supposed to... Has this ever happened to you? How is it fixed? From the shell:

```
42336/747322 [====>.........................] - ETA: 79s - loss: 20.7154 - x1_loss: 9.5913 - x2_loss: 10.0536 - x3_loss: 1.0705 - x1_acc: 0.6930 - x2_acc:
```
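A common workaround, assuming the usual cause (the live progress bar rewrites its line with carriage returns, which some consoles and IDE output panes render as new lines): switch to verbose=2, which prints one summary line per epoch and no per-batch bar.

```python
# One line per epoch instead of an in-place progress bar.
model.fit(X_train, y_train,
          batch_size=128,   # illustrative values
          epochs=15,
          verbose=2)
```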

Dimension of shape in Conv1D

醉酒当歌 submitted on 2019-11-27 03:01:02
I have tried to build a CNN with one layer, but I have a problem with it. The interpreter raises:

```
ValueError: Error when checking model input: expected conv1d_1_input to have 3 dimensions, but got array with shape (569, 30)
```

This is the code:

```python
import numpy
from keras.models import Sequential
from keras.layers.convolutional import Conv1D

numpy.random.seed(7)
datasetTraining = numpy.loadtxt("CancerAdapter.csv", delimiter=",")
X = datasetTraining[:, 1:31]
Y = datasetTraining[:, 0]
datasetTesting = numpy.loadtxt("CancereEvaluation.csv", delimiter=",")
X_test = datasetTesting[:, 1:31]
Y
```
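A minimal sketch of the usual fix (layer sizes are illustrative): Conv1D expects 3-D input of shape (samples, steps, channels), so the (569, 30) matrix needs a trailing channel axis and input_shape=(30, 1).

```python
import numpy
from keras.models import Sequential
from keras.layers import Conv1D, Flatten, Dense

# Add a channels axis: (569, 30) -> (569, 30, 1).
X = numpy.expand_dims(X, axis=2)

model = Sequential()
model.add(Conv1D(32, 3, activation='relu', input_shape=(30, 1)))
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```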

TensorFlow allocation memory: Allocation of 38535168 exceeds 10% of system memory

天大地大妈咪最大 submitted on 2019-11-27 02:36:18
Question: I am trying to build a classifier using ResNet50 pre-trained weights. The code base is fully implemented in Keras, the high-level TensorFlow API. The complete code is posted at the GitHub link below. Source Code: Classification Using ResNet50 Architecture. The file size of the pre-trained model is 94.7 MB. I loaded the pre-trained file

```python
new_model = Sequential()
new_model.add(ResNet50(include_top=False, pooling='avg', weights=resnet_weight_paths))
```

and fit the model

```python
train_generator = data_generator
```
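A hedged guess at the usual remedy (not confirmed against the asker's repository): the warning means a single allocation, often one batch of activations, is large relative to system RAM, and shrinking the generator's batch size is the first thing to try. Here data_generator is assumed to be a Keras ImageDataGenerator, and the directory path is hypothetical.

```python
train_generator = data_generator.flow_from_directory(
    'data/train',             # hypothetical directory
    target_size=(224, 224),   # ResNet50's expected input size
    batch_size=8)             # smaller batches -> smaller allocations

new_model.fit_generator(train_generator,
                        steps_per_epoch=len(train_generator),
                        epochs=10)
```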

Using TensorFlow Layers in Keras

大兔子大兔子 submitted on 2019-11-27 02:17:48
Question: I've been trying to build a sequential model in Keras using the pooling layer tf.nn.fractional_max_pool. I know I could try making my own custom layer in Keras, but I'm trying to see if I can use the layer that already exists in TensorFlow. For the following code snippet:

```python
p_ratio = [1.0, 1.44, 1.44, 1.0]
model = Sequential()
model.add(ZeroPadding2D((2,2), input_shape=(1, 48, 48)))
model.add(Conv2D(320, (3, 3), activation=PReLU()))
model.add(ZeroPadding2D((1,1)))
model.add(Conv2D(320, (3, 3), activation
```
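A minimal sketch of the usual wrapper pattern, assuming it fits the asker's model: a raw TensorFlow op can sit inside a Sequential model via a Lambda layer. tf.nn.fractional_max_pool returns a tuple (output, row_pooling_sequence, col_pooling_sequence), so the wrapper must keep only element [0]; p_ratio comes from the question.

```python
import tensorflow as tf
from keras.layers import Lambda

def frac_max_pool(x):
    # Keep only the pooled tensor; discard the pooling sequences.
    # Note: fractional_max_pool assumes NHWC, so the data format may
    # need adjusting for the channels-first shapes in the question.
    return tf.nn.fractional_max_pool(x, p_ratio)[0]

model.add(Lambda(frac_max_pool))
```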

Reset weights in Keras layer

心不动则不痛 submitted on 2019-11-27 01:52:38
Question: I'd like to reset (randomize) the weights of all layers in my Keras (deep learning) model. The reason is that I want to be able to train the model several times with different data splits, without having to do the (slow) model recompilation every time. Inspired by this discussion, I'm trying the following code:

```python
# Reset weights
for layer in KModel.layers:
    if hasattr(layer, 'init'):
        input_dim = layer.input_shape[1]
        new_weights = layer.init((input_dim, layer.output_dim), name='{}_W'.format(layer
```
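A hedged sketch for newer Keras versions, where layer.init no longer exists: re-run each layer's own variable initializers in the backend session (a TF1-era pattern). KModel is the asker's compiled model; the hasattr checks skip layers without weights.

```python
import keras.backend as K

session = K.get_session()
for layer in KModel.layers:
    # Re-initialize kernels and biases in place; no recompilation needed.
    if hasattr(layer, 'kernel_initializer'):
        layer.kernel.initializer.run(session=session)
    if hasattr(layer, 'bias_initializer') and getattr(layer, 'use_bias', False):
        layer.bias.initializer.run(session=session)
```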

When does keras reset an LSTM state?

僤鯓⒐⒋嵵緔 submitted on 2019-11-27 00:57:54
I read all sorts of texts about it, and none seem to answer this very basic question. It's always ambiguous: in a stateful=False LSTM layer, does Keras reset states after each sequence, or after each batch?

Suppose I have X_train shaped as (1000, 20, 1), meaning 1000 sequences of 20 steps of a single value. If I run:

```python
model.fit(X_train, y_train, batch_size=200, nb_epoch=15)
```

will it reset states for every single sequence (1000 resets), or for every batch (5 resets)? Checking with some tests, I got to the following conclusion, which is according to the
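A small illustration of the distinction, under the question's shapes: with stateful=False, every sequence in a batch starts from a zero state and the 200 sequences of a batch are processed in parallel, so the "per sequence" and "per batch" readings coincide. Only stateful=True carries state across batches, and then you reset it yourself.

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(32, stateful=True, batch_input_shape=(200, 20, 1)))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')

for epoch in range(15):
    model.fit(X_train, y_train, batch_size=200, epochs=1, shuffle=False)
    model.reset_states()   # explicit reset, only needed because stateful=True
```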

Specify connections in NN (in keras)

北城余情 submitted on 2019-11-26 23:02:30
I am using Keras and TensorFlow 1.4. I want to explicitly specify which neurons are connected between two layers. Therefore I have a matrix A with a one wherever neuron i in the first layer is connected to neuron j in the second layer, and zeros elsewhere. My first attempt was to create a custom layer with a kernel the same size as A, containing non-trainable zeros where A has zeros and trainable weights where A has ones. The desired output would then be a simple dot product. Unfortunately I did not manage to figure out how to implement a kernel that is partly
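A hedged sketch of the standard workaround (A is assumed to be a 0/1 NumPy array of shape (input_dim, units)): instead of mixing trainable and non-trainable entries in one kernel, keep a fully trainable kernel and multiply it element-wise by the constant mask on every forward pass. Masked entries then receive zero gradient, so the forbidden connections never train or contribute.

```python
import keras.backend as K
from keras.layers import Layer

class MaskedDense(Layer):
    def __init__(self, units, mask, **kwargs):
        super(MaskedDense, self).__init__(**kwargs)
        self.units = units
        self.mask = K.constant(mask)   # fixed 0/1 connectivity matrix A

    def build(self, input_shape):
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[1], self.units),
                                      initializer='glorot_uniform',
                                      trainable=True)
        super(MaskedDense, self).build(input_shape)

    def call(self, inputs):
        # Zero out the masked connections before the dot product.
        return K.dot(inputs, self.kernel * self.mask)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)
```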