Cannot convert between a TensorFlowLite buffer with 307200 bytes and a Java Buffer with 270000 bytes

Submitted by 旧城冷巷雨未停 on 2020-06-16 19:19:07

Question


I am trying to run a pre-trained object detection TensorFlow Lite model from the TensorFlow detection model zoo. I used the ssd_mobilenet_v3_small_coco model listed under the Mobile Models heading on that site. Following the instructions under Running our model on Android, I commented out the model download script (// apply from:'download_model.gradle' in the build.gradle file) to avoid the assets being overwritten, and replaced the detect.tflite and labelmap.txt files in the assets directory. The build succeeded without any errors and the app was installed on my Android device, but it crashed as soon as it launched, and logcat showed:

E/AndroidRuntime: FATAL EXCEPTION: inference
Process: org.tensorflow.lite.examples.detection, PID: 16960
java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite buffer with 307200 bytes and a Java Buffer with 270000 bytes.
    at org.tensorflow.lite.Tensor.throwIfShapeIsIncompatible(Tensor.java:425)
    at org.tensorflow.lite.Tensor.throwIfDataIsIncompatible(Tensor.java:392)
    at org.tensorflow.lite.Tensor.setTo(Tensor.java:188)
    at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:150)
    at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:314)
    at org.tensorflow.lite.examples.detection.tflite.TFLiteObjectDetectionAPIModel.recognizeImage(TFLiteObjectDetectionAPIModel.java:196)
    at org.tensorflow.lite.examples.detection.DetectorActivity$2.run(DetectorActivity.java:185)
    at android.os.Handler.handleCallback(Handler.java:873)
    at android.os.Handler.dispatchMessage(Handler.java:99)
    at android.os.Looper.loop(Looper.java:201)
    at android.os.HandlerThread.run(HandlerThread.java:65)

I have searched through a lot of the TensorFlow Lite documentation but found nothing related to this error. I also found some Stack Overflow questions with the same error message, but those concerned custom-trained models, so they did not help; the same error keeps appearing even with a custom-trained model. What should I do to eliminate this error?


Answer 1:


You should resize your input tensor so the model can accept input of the size (image dimensions and batch) your app actually feeds it.

The code below is from an image classification example, while yours is object detection: in the detection app, TFLiteObjectDetectionAPIModel is responsible for the input size, so try adjusting the size somewhere in TFLiteObjectDetectionAPIModel.
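As a sanity check on the numbers in this particular crash log: 307200 = 1 × 320 × 320 × 3, while 270000 = 1 × 300 × 300 × 3. So, assuming a quantized model with one byte per channel, the model's input tensor expects a 320×320 RGB image while the app is handing it a 300×300 one. A minimal sketch of that arithmetic (the class and method names here are illustrative, not from the example app):

```java
// Sanity-check the two buffer sizes from the crash log, assuming a
// quantized (uint8, one byte per channel) NHWC model input.
public class InputSizeCheck {

    // Bytes needed for a [batch, height, width, channels] uint8 tensor.
    static int expectedBytes(int batch, int height, int width, int channels) {
        return batch * height * width * channels;
    }

    public static void main(String[] args) {
        // What the model's input tensor wants: 320x320 RGB.
        System.out.println(expectedBytes(1, 320, 320, 3)); // 307200
        // What the example app feeds it by default: 300x300 RGB.
        System.out.println(expectedBytes(1, 300, 300, 3)); // 270000
    }
}
```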

The number of labels also needs to match the length of the output tensor for your trained model.

  // NHWC input shape expected by the model
  int[] dimensions = new int[4];
  dimensions[0] = 1;   // Batch size (number of frames at a time)
  dimensions[1] = 224; // Image height required by the model
  dimensions[2] = 224; // Image width required by the model
  dimensions[3] = 3;   // Number of color channels (RGB)
  Tensor tensor = c.tfLite.getInputTensor(0);  // shape before resizing
  c.tfLite.resizeInput(0, dimensions);
  Tensor tensor1 = c.tfLite.getInputTensor(0); // shape after resizing

Change input size
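In the detection example app specifically, the crop size that produces the 270000-byte buffer is driven by a constant in DetectorActivity.java; in the version of the example I have seen it is named TF_OD_API_INPUT_SIZE (verify the name against your checkout of the example source). Changing it from the default 300 to the 320 that ssd_mobilenet_v3_small_coco expects avoids the mismatch without any resize call:

```java
// In DetectorActivity.java of the TFLite object detection example
// (assumed constant name; confirm in your copy of the source).
// ssd_mobilenet_v3_small_coco takes 320x320 input, not the default 300x300.
private static final int TF_OD_API_INPUT_SIZE = 320;
```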



Source: https://stackoverflow.com/questions/61984286/cannot-convert-between-a-tensorflowlite-buffer-with-307200-bytes-and-a-java-buff
