How to concatenate two layers in Keras?

Submitted by Anonymous (unverified) on 2019-12-03 02:14:01

Question:

I have an example of a neural network with two layers. The first layer takes two arguments and has one output. The second should take one argument as the result of the first layer and one additional argument. It should look like this:

x1  x2  x3
 \  /   /
  y1   /
   \  /
    y2

So, I created a model with two layers and tried to merge them, but on the line result.add(merged) it returns the error: The first layer in a Sequential model must get an "input_shape" or "batch_input_shape" argument.

Model:

first = Sequential()
first.add(Dense(1, input_shape=(2,), activation='sigmoid'))

second = Sequential()
second.add(Dense(1, input_shape=(1,), activation='sigmoid'))

result = Sequential()
merged = Concatenate([first, second])
ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)
result.add(merged)
result.compile(optimizer=ada_grad, loss=_loss_tensor, metrics=['accuracy'])

Answer 1:

Well, you're getting the error because result, defined as Sequential(), is just a container for the model, and you have not defined an input for it.

Given what you're trying to build, set result to take the third input x3.

first = Sequential()
first.add(Dense(1, input_shape=(2,), activation='sigmoid'))

second = Sequential()
second.add(Dense(1, input_shape=(1,), activation='sigmoid'))

third = Sequential()
# of course you must provide the input to result, which will be your x3
third.add(Dense(1, input_shape=(1,), activation='sigmoid'))

# let's say you add a few more layers to first and second.
# concatenate them
merged = Concatenate([first, second])

# then concatenate the two outputs
result = Concatenate([merged, third])

ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)

result.compile(optimizer=ada_grad, loss='binary_crossentropy',
               metrics=['accuracy'])

However, my preferred way of building a model with this type of input structure would be to use the functional API.

Here is an implementation of your requirements to get you started:

from keras.models import Sequential, Model
from keras.layers import Concatenate, Dense, LSTM, Input, concatenate
from keras.optimizers import Adagrad

first_input = Input(shape=(2, ))
first_dense = Dense(1, )(first_input)

second_input = Input(shape=(2, ))
second_dense = Dense(1, )(second_input)

merge_one = concatenate([first_dense, second_dense])

third_input = Input(shape=(1, ))
merge_two = concatenate([merge_one, third_input])

model = Model(inputs=[first_input, second_input, third_input], outputs=merge_two)
ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)  # same optimizer as in the snippet above
model.compile(optimizer=ada_grad, loss='binary_crossentropy',
              metrics=['accuracy'])
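To actually train this, you pass one array per Input. Here is a minimal sketch with made-up numpy data; the names n, x1, x2, x3 and y are just placeholders chosen to match the shapes above:

import numpy as np

n = 100  # hypothetical number of samples
x1 = np.random.random((n, 2))         # fed to first_input  (shape (2,))
x2 = np.random.random((n, 2))         # fed to second_input (shape (2,))
x3 = np.random.random((n, 1))         # fed to third_input  (shape (1,))
y = np.random.randint(0, 2, (n, 3))   # the model's output is the 3-wide concatenation

model.fit([x1, x2, x3], y, epochs=10, batch_size=32)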

To answer the question in the comments:

1) How are result and merged connected? Assuming you mean how they are concatenated.

Concatenation works like this:

  a        b            c
a b c    g h i    a b c g h i
d e f    j k l    d e f j k l

i.e. rows are just joined.
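If it helps, the same joining can be illustrated with plain numpy (this is only an analogy for the axis semantics, not Keras code):

import numpy as np

a = np.array([['a', 'b', 'c'],
              ['d', 'e', 'f']])
b = np.array([['g', 'h', 'i'],
              ['j', 'k', 'l']])

# joining along the last axis appends b's columns to a's,
# so each row of the result is a row of a followed by the matching row of b
c = np.concatenate([a, b], axis=-1)
print(c)
# [['a' 'b' 'c' 'g' 'h' 'i']
#  ['d' 'e' 'f' 'j' 'k' 'l']]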

2) Now, x1 is the input to first, x2 is the input to second, and x3 is the input to third.



Answer 2:

You can experiment with model.summary() (notice the concatenate_XX (Concatenate) layer size)

from keras.models import Model
from keras.layers import Input, Dense, concatenate

# merge samples: the two inputs must have the same shape
inp1 = Input(shape=(10, 32))
inp2 = Input(shape=(10, 32))
cc1 = concatenate([inp1, inp2], axis=0)  # merged along the sample (batch) axis
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()

# merge rows: the inputs must have the same column size
inp1 = Input(shape=(20, 10))
inp2 = Input(shape=(32, 10))
cc1 = concatenate([inp1, inp2], axis=1)
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()

# merge columns: the inputs must have the same row size
inp1 = Input(shape=(10, 20))
inp2 = Input(shape=(10, 32))
cc1 = concatenate([inp1, inp2], axis=2)  # last axis, since the column counts differ
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()
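If you only want to see the concatenated shape without building and summarizing a whole model, something like this should do (a quick sketch; K.int_shape is used here just to print the static shape of the merged tensor):

from keras import backend as K
from keras.layers import Input, concatenate

inp1 = Input(shape=(20, 10))
inp2 = Input(shape=(32, 10))

# rows are joined, so the middle dimension becomes 20 + 32 = 52
merged = concatenate([inp1, inp2], axis=1)
print(K.int_shape(merged))  # expected: (None, 52, 10)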

You can view notebook here for detail: https://nbviewer.jupyter.org/github/anhhh11/DeepLearning/blob/master/Concanate_two_layer_keras.ipynb


