AttributeError: 'NoneType' object has no attribute '_inbound_nodes' while trying to add multiple keras Dense layers

Submitted anonymously (unverified) on 2019-12-03 01:27:01

Question:

The input is 3 independent channels of 1000 features each. I'm trying to pass each channel through its own independent NN path, then concatenate them into a flat layer, and finally apply an FCN to the flattened layer for binary classification. I'm trying to add multiple Dense layers together, like this:

def tst_1():

    inputs = Input((3, 1000, 1))

    dense10 = Dense(224, activation='relu')(inputs[0,:,1])
    dense11 = Dense(112, activation='relu')(dense10)
    dense12 = Dense(56, activation='relu')(dense11)

    dense20 = Dense(224, activation='relu')(inputs[1,:,1])
    dense21 = Dense(112, activation='relu')(dense20)
    dense22 = Dense(56, activation='relu')(dense21)

    dense30 = Dense(224, activation='relu')(inputs[2,:,1])
    dense31 = Dense(112, activation='relu')(dense30)
    dense32 = Dense(56, activation='relu')(dense31)

    flat = keras.layers.Add()([dense12, dense22, dense32])

    dense1 = Dense(224, activation='relu')(flat)
    drop1 = Dropout(0.5)(dense1)
    dense2 = Dense(112, activation='relu')(drop1)
    drop2 = Dropout(0.5)(dense2)
    dense3 = Dense(32, activation='relu')(drop2)
    densef = Dense(1, activation='sigmoid')(dense3)

    model = Model(inputs = inputs, outputs = densef)

    model.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])

    return model

model = tst_1()

model.summary()

but I got this error:

/usr/local/lib/python2.7/dist-packages/keras/engine/network.pyc in build_map(tensor, finished_nodes, nodes_in_progress, layer, node_index, tensor_index)
   1310             ValueError: if a cycle is detected.
   1311         """
-> 1312         node = layer._inbound_nodes[node_index]
   1313
   1314         # Prevent cycles.

AttributeError: 'NoneType' object has no attribute '_inbound_nodes'

Answer 1:

The problem is that splitting the input data with inputs[0,:,1] is not done by a Keras layer, so Keras cannot trace the model graph back through that slice when the Model is built.

You need to wrap the slicing in a Lambda layer to accomplish this.
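To see the difference in isolation (a minimal sketch of the idea, using the same Input shape as your code; the name channel0 is just illustrative):

from keras.layers import Input, Dense, Lambda

inputs = Input((3, 1000, 1))

# Plain tensor indexing produces a raw backend tensor with no Keras layer
# behind it, so building Model() later fails while walking the graph:
# bad = Dense(224, activation='relu')(inputs[0, :, 1])

# Wrapping the slice in a Lambda keeps the operation inside the Keras graph.
# The leading ':' keeps the batch dimension and selects channel 0.
channel0 = Lambda(lambda x: x[:, 0, :, :])(inputs)
ok = Dense(224, activation='relu')(channel0)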

The following code:

from keras import layers
from keras.layers import Input, Add, Dense, Dropout, Lambda, Concatenate
from keras.layers import Flatten
from keras.optimizers import Adam
from keras.models import Model
import keras.backend as K


def tst_1():

    num_channels = 3
    inputs = Input(shape=(num_channels, 1000, 1))

    branch_outputs = []
    for i in range(num_channels):
        # Slicing the ith channel:
        out = Lambda(lambda x: x[:, i, :, :], name = "Lambda_" + str(i))(inputs)

        # Setting up your per-channel layers (replace with actual sub-models):
        out = Dense(224, activation='relu', name = "Dense_224_" + str(i))(out)
        out = Dense(112, activation='relu', name = "Dense_112_" + str(i))(out)
        out = Dense(56, activation='relu', name = "Dense_56_" + str(i))(out)
        branch_outputs.append(out)

    # Concatenating together the per-channel results:
    out = Concatenate()(branch_outputs)

    dense1 = Dense(224, activation='relu')(out)
    drop1 = Dropout(0.5)(dense1)
    dense2 = Dense(112, activation='relu')(drop1)
    drop2 = Dropout(0.5)(dense2)
    dense3 = Dense(32, activation='relu')(drop2)
    densef = Dense(1, activation='sigmoid')(dense3)

    model = Model(inputs = inputs, outputs = densef)

    return model

Net = tst_1()
Net.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])

Net.summary()

correctly creates the net that you want.



Answer 2:

Thanks to @CAta.RAy

I solved it in this way:

import numpy as np
from keras import layers
from keras.layers import Input, Add, Dense, Dropout, Lambda
from keras.layers import Flatten
from keras.optimizers import Adam
from keras.models import Model
import keras.backend as K


def tst_1():

    inputs = Input((3, 1000))

    x1 = Lambda(lambda x: x[:,0])(inputs)
    dense10 = Dense(224, activation='relu')(x1)
    dense11 = Dense(112, activation='relu')(dense10)
    dense12 = Dense(56, activation='relu')(dense11)

    x2 = Lambda(lambda x: x[:,1])(inputs)
    dense20 = Dense(224, activation='relu')(x2)
    dense21 = Dense(112, activation='relu')(dense20)
    dense22 = Dense(56, activation='relu')(dense21)

    x3 = Lambda(lambda x: x[:,2])(inputs)
    dense30 = Dense(224, activation='relu')(x3)
    dense31 = Dense(112, activation='relu')(dense30)
    dense32 = Dense(56, activation='relu')(dense31)

    flat = Add()([dense12, dense22, dense32])

    dense1 = Dense(224, activation='relu')(flat)
    drop1 = Dropout(0.5)(dense1)
    dense2 = Dense(112, activation='relu')(drop1)
    drop2 = Dropout(0.5)(dense2)
    dense3 = Dense(32, activation='relu')(drop2)
    densef = Dense(1, activation='sigmoid')(dense3)

    model = Model(inputs = inputs, outputs = densef)

    return model

Net = tst_1()
Net.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])

Net.summary()
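As a quick sanity check (a minimal sketch using randomly generated dummy data rather than real inputs), you can fit and predict on a small batch shaped (batch, 3, 1000) and confirm the model produces one sigmoid probability per sample:

import numpy as np

X = np.random.rand(16, 3, 1000)              # 16 dummy samples, 3 channels of 1000 features
y = np.random.randint(0, 2, size=(16, 1))    # dummy binary labels

Net.fit(X, y, epochs=1, batch_size=4)
print(Net.predict(X).shape)                  # (16, 1): one probability per sample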

