How to “Merge” Sequential models in Keras 2.0?


Question:

I am trying to merge two Sequential models in Keras 2.0, using the following line:

merged_model.add(Merge([model1, model2], mode='concat')) 

This still works fine, but gives a warning:

"The `Merge` layer is deprecated and will be removed after 08/2017. Use instead layers from `keras.layers.merge`, e.g. `add`, `concatenate`, etc."  

However, studying the Keras documentation and trying `add` and `Add()` has not produced anything that works. I have read several posts from people with the same problem, but found no solution that works for my case below. Any suggestions?

model = Sequential()

model1 = Sequential()
model1.add(Dense(300, input_dim=40, activation='relu', name='layer_1'))

model2 = Sequential()
model2.add(Dense(300, input_dim=40, activation='relu', name='layer_2'))

merged_model = Sequential()
merged_model.add(Merge([model1, model2], mode='concat'))
merged_model.add(Dense(1, activation='softmax', name='output_layer'))
merged_model.compile(loss='binary_crossentropy', optimizer='adam',
                     metrics=['accuracy'])

checkpoint = ModelCheckpoint('weights.h5', monitor='val_acc',
                             save_best_only=True, verbose=2)
early_stopping = EarlyStopping(monitor="val_loss", patience=5)

merged_model.fit([x1, x2], y=y, batch_size=384, epochs=200,
                 verbose=1, validation_split=0.1, shuffle=True,
                 callbacks=[early_stopping, checkpoint])

EDIT: When I tried (as suggested below by Kent Sommer):

from keras.layers.merge import concatenate
merged_model.add(concatenate([model1, model2]))

This was the error message:

Traceback (most recent call last):
  File "/anaconda/lib/python3.6/site-packages/keras/engine/topology.py", line 425, in assert_input_compatibility
    K.is_keras_tensor(x)
  File "/anaconda/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 403, in is_keras_tensor
    raise ValueError('Unexpectedly found an instance of type `' + str(type(x)) + '`. '
ValueError: Unexpectedly found an instance of type `<class 'keras.models.Sequential'>`. Expected a symbolic tensor instance.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "quoradeeptest_simple1.py", line 78, in <module>
    merged_model.add(concatenate([model1, model2]))
  File "/anaconda/lib/python3.6/site-packages/keras/layers/merge.py", line 600, in concatenate
    return Concatenate(axis=axis, **kwargs)(inputs)
  File "/anaconda/lib/python3.6/site-packages/keras/engine/topology.py", line 558, in __call__
    self.assert_input_compatibility(inputs)
  File "/anaconda/lib/python3.6/site-packages/keras/engine/topology.py", line 431, in assert_input_compatibility
    str(inputs) + '. All inputs to the layer '
ValueError: Layer concatenate_1 was called with an input that isn't a symbolic tensor. Received type: <class 'keras.models.Sequential'>. Full input: [<keras.models.Sequential object at 0x140fa7ba8>, <keras.models.Sequential object at 0x140fabdd8>]. All inputs to the layer should be tensors.

Answer 1:

What that warning is saying is that instead of using the Merge layer with a specific mode, the different modes have now been split into their own individual layers.

So Merge(mode='concat') is now concatenate(axis=-1).
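As a minimal sketch of the new API (the tensors a and b here are placeholders, not taken from the question), the new merge functions operate directly on Keras tensors rather than on models:

from keras.layers import Input, concatenate

# two symbolic tensors, used only for illustration
a = Input(shape=(40,))
b = Input(shape=(40,))

# old: Merge([...], mode='concat')  ->  new: concatenate([...], axis=-1)
merged = concatenate([a, b], axis=-1)  # resulting shape: (None, 80)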

However, since you want to merge models rather than layers, this will not work in your case. You will need to use the functional API, because this behavior is no longer supported by the basic Sequential model type.

In your case that means the code should be changed to the following:

from keras.layers.merge import concatenate
from keras.models import Model, Sequential
from keras.layers import Dense, Input

model1_in = Input(shape=(27, 27, 1))
model1_out = Dense(300, input_dim=40, activation='relu', name='layer_1')(model1_in)
model1 = Model(model1_in, model1_out)

model2_in = Input(shape=(27, 27, 1))
model2_out = Dense(300, input_dim=40, activation='relu', name='layer_2')(model2_in)
model2 = Model(model2_in, model2_out)

concatenated = concatenate([model1_out, model2_out])
out = Dense(1, activation='softmax', name='output_layer')(concatenated)

merged_model = Model([model1_in, model2_in], out)
merged_model.compile(loss='binary_crossentropy', optimizer='adam',
                     metrics=['accuracy'])

checkpoint = ModelCheckpoint('weights.h5', monitor='val_acc',
                             save_best_only=True, verbose=2)
early_stopping = EarlyStopping(monitor="val_loss", patience=5)

merged_model.fit([x1, x2], y=y, batch_size=384, epochs=200,
                 verbose=1, validation_split=0.1, shuffle=True,
                 callbacks=[early_stopping, checkpoint])


Answer 2:

Unless you have a good reason to keep the models separated, you can (and should) have the same topology in a single model. Something like:

input1 = Input(shape=(27, 27, 1))
dense1 = Dense(300, activation='relu', name='layer_1')(input1)

input2 = Input(shape=(27, 27, 1))
dense2 = Dense(300, activation='relu', name='layer_2')(input2)

merged = concatenate([dense1, dense2])
out = Dense(1, activation='softmax', name='output_layer')(merged)

model = Model(inputs=[input1, input2], outputs=[out])
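If you then want to train this single model the same way as in the question, the compile/fit calls carry over unchanged; a minimal sketch, assuming x1, x2 and y are the same arrays used above:

model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])
# the model takes a list of two inputs, one array per Input layer
model.fit([x1, x2], y, batch_size=384, epochs=200,
          verbose=1, validation_split=0.1, shuffle=True)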

