Converting .tflite to .pb

傲寒 2020-12-19 10:08

Problem: How can I convert a .tflite model (serialised FlatBuffer) back to a .pb (frozen graph)? The documentation only talks about one-way conversion.
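
For reference, the direction the documentation does describe is .pb → .tflite. Below is a minimal sketch of that one-way conversion using the TF 1.x TFLiteConverter; the file names and input/output tensor names are placeholders, not taken from the question.

    import tensorflow as tf

    # Placeholder file and tensor names -- substitute your own model's.
    converter = tf.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file="frozen_graph.pb",
        input_arrays=["input"],
        output_arrays=["MobilenetV1/Predictions/Reshape_1"])
    tflite_model = converter.convert()
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)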

3 Answers
  •  借酒劲吻你
    2020-12-19 11:01

    I don't think there is a way to restore a tflite model back to pb, as some information is lost during conversion. An indirect way to get a glimpse of what is inside a tflite model is to read back each of its tensors.

    import tensorflow as tf

    # model_path points at the .tflite file to inspect
    # (in newer TensorFlow the interpreter is tf.lite.Interpreter)
    interpreter = tf.contrib.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()

    # try some arbitrary numbers to find out the number of tensors
    num_layer = 89
    for i in range(num_layer):
        detail = interpreter._get_tensor_details(i)
        print(i, detail['name'], detail['shape'])


    and you would see something like the output below. As only a limited set of operations is currently supported, it is not too difficult to reverse engineer the network architecture. I have also put some tutorials on my GitHub.

    0 MobilenetV1/Logits/AvgPool_1a/AvgPool [   1    1    1 1024]
    1 MobilenetV1/Logits/Conv2d_1c_1x1/BiasAdd [   1    1    1 1001]
    2 MobilenetV1/Logits/Conv2d_1c_1x1/Conv2D_bias [1001]
    3 MobilenetV1/Logits/Conv2d_1c_1x1/weights_quant/FakeQuantWithMinMaxVars [1001    1    1 1024]
    4 MobilenetV1/Logits/SpatialSqueeze [   1 1001]
    5 MobilenetV1/Logits/SpatialSqueeze_shape [2]
    6 MobilenetV1/MobilenetV1/Conv2d_0/Conv2D_Fold_bias [32]
    7 MobilenetV1/MobilenetV1/Conv2d_0/Relu6 [  1 112 112  32]
    8 MobilenetV1/MobilenetV1/Conv2d_0/weights_quant/FakeQuantWithMinMaxVars [32  3  3  3]
    9 MobilenetV1/MobilenetV1/Conv2d_10_depthwise/Relu6 [  1  14  14 512]
    10 MobilenetV1/MobilenetV1/Conv2d_10_depthwise/depthwise_Fold_bias [512]
    11 MobilenetV1/MobilenetV1/Conv2d_10_depthwise/weights_quant/FakeQuantWithMinMaxVars [  1   3   3 512]
    12 MobilenetV1/MobilenetV1/Conv2d_10_pointwise/Conv2D_Fold_bias [512]
    13 MobilenetV1/MobilenetV1/Conv2d_10_pointwise/Relu6 [  1  14  14 512]
    14 MobilenetV1/MobilenetV1/Conv2d_10_pointwise/weights_quant/FakeQuantWithMinMaxVars [512   1   1 512]
    15 MobilenetV1/MobilenetV1/Conv2d_11_depthwise/Relu6 [  1  14  14 512]
    16 MobilenetV1/MobilenetV1/Conv2d_11_depthwise/depthwise_Fold_bias [512]
    17 MobilenetV1/MobilenetV1/Conv2d_11_depthwise/weights_quant/FakeQuantWithMinMaxVars [  1   3   3 512]
    18 MobilenetV1/MobilenetV1/Conv2d_11_pointwise/Conv2D_Fold_bias [512]
    19 MobilenetV1/MobilenetV1/Conv2d_11_pointwise/Relu6 [  1  14  14 512]
    20 MobilenetV1/MobilenetV1/Conv2d_11_pointwise/weights_quant/FakeQuantWithMinMaxVars [512   1   1 512]
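
    To go a step further than names and shapes, the weight values themselves can be read back from the flatbuffer. A small sketch below, assuming the same model_path as above; it uses the public get_tensor_details() / get_tensor() calls, and the 'weights'/'bias' name check is just a heuristic based on the tensor names in this dump.

    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path=model_path)  # tf.contrib.lite in older TF
    interpreter.allocate_tensors()

    # list every tensor without guessing the count
    for detail in interpreter.get_tensor_details():
        name, idx = detail['name'], detail['index']
        # weight/bias tensors are constants stored in the flatbuffer, so their
        # values can be read back without running the model
        if 'weights' in name or 'bias' in name:
            values = interpreter.get_tensor(idx)
            print(idx, name, values.shape, values.dtype)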
    
