Question
Following what is written here, I was trying to get the computed gradients during training using tf.keras, and I ended up with the following callback method, which is called during the fitting phase:

def on_train_begin(self, logs={}):
    # Collect the kernel (weight) tensor of every layer that has one
    self.layerweights = []
    for lndx, l in enumerate(self.model.layers):
        if hasattr(l, 'kernel'):
            self.layerweights.append(l.kernel)
    input_tensors = [self.model.inputs[0],
                     self.model.sample_weights[0],
                     self.model.targets[0],
                     K.learning_phase()]
    # Get gradients of all the relevant layers at once
    grads = self.model.optimizer.get_gradients(self.model.total_loss, self.layerweights)
    self.get_gradients = K.function(inputs=input_tensors, outputs=grads)  # <-- Error Here

The network used is a very standard one, fully connected and sequential:

r = network.fit(x=trn.X, y=trn.Y, verbose=2, batch_size=50, epochs=50, callbacks=[reporter,])
which raises the following error message:
~\Anaconda3\envs\tensorflow\lib\site-packages\tensorflow\python\eager\lift_to_graph.py in <listcomp>(.0)
312 # Check that the initializer does not depend on any placeholders.
313 sources = set(sources or [])
--> 314 visited_ops = set([x.op for x in sources])
315 op_outputs = collections.defaultdict(set)
316
AttributeError: 'NoneType' object has no attribute 'op'
Any idea how to resolve it? I've already read this one and this one, but had no luck.
Answer 1:
AttributeError: 'NoneType' object has no attribute 'op'
means that one of the objects in sources is None, so there is no .op attribute to read.
To guard against it you can filter out the None entries:
visited_ops = set([x.op for x in sources if x])
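As a standalone illustration of that guard (using a toy stand-in class, not TensorFlow's real tensors):

class FakeTensor(object):
    # Toy object with an .op attribute, just to demonstrate the failure mode
    def __init__(self, op):
        self.op = op

sources = [FakeTensor('MatMul'), None, FakeTensor('Add')]
# set([x.op for x in sources])                   # AttributeError: 'NoneType' object has no attribute 'op'
visited_ops = set([x.op for x in sources if x])  # -> {'MatMul', 'Add'}

Note, though, that the failing comprehension sits inside TensorFlow's own lift_to_graph.py, so applying the guard literally means editing the installed package; it silences the symptom rather than fixing whichever model tensor ended up as None.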
Answer 2:
Resolved the issue by using an older version of Keras (2.2.4) and TensorFlow (1.13.1) on Python 3.6.9.
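For reference, below is a minimal, self-contained sketch of the question's approach running on those versions (standalone Keras 2.2.4 with the TensorFlow 1.13.1 backend, in graph mode). The GradientReporter class name, the toy model, and the random data are illustrative additions, not from the original post:

import numpy as np
from keras import backend as K
from keras.callbacks import Callback
from keras.layers import Dense
from keras.models import Sequential

# Environment reported to work in this answer: Python 3.6.9 with
#   pip install tensorflow==1.13.1 keras==2.2.4
class GradientReporter(Callback):
    def __init__(self, x, y):
        super(GradientReporter, self).__init__()
        self.x = x
        self.y = y

    def on_train_begin(self, logs=None):
        # Kernel tensors of all layers that have one (same idea as in the question)
        weights = [l.kernel for l in self.model.layers if hasattr(l, 'kernel')]
        grads = self.model.optimizer.get_gradients(self.model.total_loss, weights)
        inputs = [self.model.inputs[0],
                  self.model.sample_weights[0],
                  self.model.targets[0],
                  K.learning_phase()]
        self.get_gradients = K.function(inputs=inputs, outputs=grads)

    def on_epoch_end(self, epoch, logs=None):
        # Feed values in the same order as `inputs`; learning phase 0 = test mode
        grads = self.get_gradients([self.x, np.ones(len(self.x)), self.y, 0])
        print('epoch %d, mean |grad| per kernel: %s'
              % (epoch, [float(np.mean(np.abs(g))) for g in grads]))

# Toy usage mirroring the fit call in the question
x = np.random.rand(200, 10)
y = np.random.rand(200, 1)
network = Sequential([Dense(16, activation='relu', input_shape=(10,)),
                      Dense(1)])
network.compile(optimizer='adam', loss='mse')
r = network.fit(x=x, y=y, verbose=2, batch_size=50, epochs=5,
                callbacks=[GradientReporter(x, y)])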
Source: https://stackoverflow.com/questions/59313711/tf-keras-get-computed-gradient-during-training