Question
I would like to get the intermediate layer outputs of a TFLite graph, something along the lines of:
Visualize TFLite graph and get intermediate values of a particular node?
The solution above works only on frozen graphs. Since SavedModel is the preferred way of serializing a model in TF 2.0, I would like a solution that works with a SavedModel. I tried passing --output_arrays to "toco" with savedModelDir as the input, but that did not help.
From the documentation, it looks like SignatureDefs in a SavedModel are the way to achieve this, but I could not get it working:
x = test_images[0:1]
output = model.predict(x, batch_size=1)
signature_def = signature_def_utils.build_signature_def(
    inputs={name: "x:0", dtype: DT_FLOAT, tensor_shape: (1, 28, 28, 1)},
    outputs=[{name: "output:0", dtype: DT_FLOAT, tensor_shape: (1, 10)},
             {name: "Dense_1:0", dtype: DT_FLOAT, tensor_shape: (1, 10)}])
tf.saved_model.save(model, './tf-saved-model-sigdefs', signature_def)
Can you share an example of how to use SignatureDefs for this purpose? By the way, I have been experimenting with the tutorial below for this experiment. https://www.tensorflow.org/beta/tutorials/images/intro_to_cnns
Answer 1:
Below is the best solution that I have so far.
from __future__ import absolute_import, division, print_function
#!pip install -q tensorflow==2.0.0-alpha0
import tensorflow as tf
from tensorflow.keras import datasets, layers, models
(train_images, train_labels), (test_images, test_labels) = datasets.mnist.load_data()
train_images = train_images.reshape((60000, 28, 28, 1))
test_images = test_images.reshape((10000, 28, 28, 1))
# Normalize pixel values to be between 0 and 1
train_images, test_images = train_images / 255.0, test_images / 255.0
model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(layers.Dense(10, activation='softmax'))
model.summary()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=5)
test_loss, test_acc = model.evaluate(test_images, test_labels)
# Save in SavedModel format
tf.keras.experimental.export_saved_model(model, './tf-saved-model')
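# Note (not in the original answer): tf.keras.experimental.export_saved_model is
# the TF 2.0 alpha/beta API; on later TF 2.x releases the equivalent export is
# tf.saved_model.save(model, './tf-saved-model')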
# Save in TFLite FlatBuffer format
tflite_mnist_model = "mnist_model.tflite"
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
open(tflite_mnist_model, "wb").write(tflite_model)
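# Sanity check (not in the original answer): open the converted FlatBuffer with
# the standard tf.lite.Interpreter and list its output tensors. Only the final
# softmax is exposed as a graph output here; the exact tensor names depend on
# the converter version.
interpreter = tf.lite.Interpreter(model_path=tflite_mnist_model)
interpreter.allocate_tensors()
print([d["name"] for d in interpreter.get_output_details()])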
# Clone the Keras model, exposing the intermediate Dense layer outputs
outputs = [model.get_layer("dense").output, model.get_layer("dense_1").output]
model_debug = tf.keras.Model(inputs=model.inputs, outputs=outputs)
# Save in TFLite FlatBuffer format with the debug outputs
tflite_mnist_model = "mnist_model_debug.tflite"
converter = tf.lite.TFLiteConverter.from_keras_model(model_debug)
tflite_model = converter.convert()
open(tflite_mnist_model, "wb").write(tflite_model)
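With the debug model, both Dense layer outputs are graph outputs, so they can be read back directly after invoke(). Below is a minimal sketch of running the debug FlatBuffer on a single test image with the standard tf.lite.Interpreter; it is not part of the original answer, and output ordering and tensor names depend on the converter version.
import numpy as np
interpreter = tf.lite.Interpreter(model_path="mnist_model_debug.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
# The converted model expects a float32 NHWC batch of shape (1, 28, 28, 1).
x = test_images[0:1].astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
# Two outputs: the 64-unit Dense activation and the 10-unit softmax.
for detail in interpreter.get_output_details():
    print(detail["name"], interpreter.get_tensor(detail["index"]).shape)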
Source: https://stackoverflow.com/questions/57139676/savedmodel-tflite-signaturedef-tensorinfo-get-intermediate-layer-outputs