tensorflow-lite

tflite roadmap for LSTM

[亡魂溺海] posted on 2019-12-24 01:55:56
Question: I read on https://www.tensorflow.org/lite/guide/roadmap (last updated March 6th, 2019) that "Add full support of conversion for LSTMs and RNNs" is on the roadmap, so I assume this item is still a work in progress. What is the 2019 ETA for LSTM/RNN support, or LSTM/CNN support (basically any LSTM model for that matter), even as an experimental feature? I am working on a highly critical (for the company/business) proof-of-concept model, which must use LSTM as is, since it is already established in TensorFlow. I am now…

change the input image size for mobilenet_ssd using tensorflow

风流意气都作罢 posted on 2019-12-23 05:12:52
Question: I am using TensorFlow and tflite to detect objects. The model I use is mobilenet_ssd (version 2) from https://github.com/tensorflow/models/tree/master/research/object_detection. The input image size for detection is fixed at 300×300, which is hard-coded in the model. I want to feed in a 1280×720 image for detection; how can I do this? I do not have a training image dataset at 1280×720 resolution, only the Pascal and COCO datasets. How can I modify the model to accept a 1280×720 image (without scaling the image)…
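One possible approach, offered as a sketch rather than an answer from the thread: in the TensorFlow Object Detection API the 300×300 size comes from the image_resizer block in the model's pipeline.config, not from the convolutional weights themselves, so the config can be edited and the graph re-exported. Assuming a standard ssd_mobilenet_v2 pipeline.config, the relevant fragment would look like:

    model {
      ssd {
        image_resizer {
          fixed_shape_resizer {
            # changed from the stock 300 x 300
            height: 720
            width: 1280
          }
        }
      }
    }

After editing, the graph would be re-exported with the API's export scripts; note that detection quality at the new resolution is not guaranteed without fine-tuning, since the anchor grid follows the feature-map sizes.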

How to import the tensorflow lite interpreter in Python?

回眸只為那壹抹淺笑 posted on 2019-12-21 12:00:09
Question: I'm developing an embedded TensorFlow application using TF Lite on a Raspberry Pi 3B running Raspbian Stretch. I've converted the graph to the FlatBuffer (lite) format and have built the TFLite static library natively on the Pi. So far so good. But the application is in Python, and there seems to be no Python binding available. The TensorFlow Lite development guide (https://www.tensorflow.org/mobile/tflite/devguide) states "There are plans for Python bindings and a demo app." Yet there is wrapper…
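For what it's worth, newer TensorFlow releases do ship Python bindings for the interpreter, and there is also a slim tflite-runtime wheel that avoids installing full TensorFlow on the Pi. A minimal sketch (the model filename is a placeholder):

    # Option 1: via the full TensorFlow package
    import tensorflow as tf
    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    # Option 2: via the lightweight tflite-runtime package
    from tflite_runtime.interpreter import Interpreter
    interpreter = Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()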

How to know Tensorflow Lite model's input/output feature info?

一笑奈何 posted on 2019-12-19 03:32:26
Question: I'm a mobile developer, and I want to use various TensorFlow Lite models (.tflite) with ML Kit. But there is an issue: I have no idea how to find a .tflite model's input/output feature info (these will be the parameters for setup). Is there any way to find that out? Sorry for my bad English, and thanks. Update (18.06.13): I found this site: https://lutzroeder.github.io/Netron/. It visualizes the graph of an uploaded model (.mlmodel, .tflite, etc.) so you can find the input/output form. Here is an example…
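Besides Netron, the same information can be read programmatically with the TFLite Python interpreter; a minimal sketch (the model filename is a placeholder):

    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    # Each entry lists a tensor's name, shape, dtype, and quantization parameters,
    # which is the information ML Kit needs for its input/output options.
    print(interpreter.get_input_details())
    print(interpreter.get_output_details())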

How to load a tflite model in script?

一笑奈何 posted on 2019-12-18 13:24:50
Question: I have converted the .pb file to a tflite file using Bazel. Now I want to load this tflite model in my Python script, just to test whether it gives me the correct output or not. Answer 1: You can use the TensorFlow Lite Python interpreter to load the tflite model in a Python shell and test it with your input data. The code will be like this:

    import numpy as np
    import tensorflow as tf

    # Load the TFLite model and allocate tensors.
    interpreter = tf.lite.Interpreter(model_path="converted_model…
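The documentation snippet the answer is quoting continues along these lines; a complete sketch (the model filename is a placeholder):

    import numpy as np
    import tensorflow as tf

    # Load the TFLite model and allocate tensors.
    interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
    interpreter.allocate_tensors()

    # Build a random input matching the model's expected shape and dtype.
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    input_shape = input_details[0]['shape']
    input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
    interpreter.set_tensor(input_details[0]['index'], input_data)

    # Run inference and read the result.
    interpreter.invoke()
    print(interpreter.get_tensor(output_details[0]['index']))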

How can I test a .tflite model to prove that it behaves as the original model using the same Test Data?

与世无争的帅哥 posted on 2019-12-18 03:56:25
Question: I have generated a .tflite model from a trained model, and I would like to verify that the tflite model gives the same results as the original model: give both the same test data and check that they produce the same output. Answer 1: You may use the TensorFlow Lite Python interpreter to test your tflite model. It allows you to feed input data in a Python shell and read the output directly, just as if you were using a normal TensorFlow model. I have answered this question here. And you can read this TensorFlow Lite…
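A minimal comparison along those lines might look like this; a sketch only, where the model filenames and the Keras loading step are assumptions (substitute however you actually run the original model):

    import numpy as np
    import tensorflow as tf

    # Run the original model (assumed here to be a saved Keras model).
    model = tf.keras.models.load_model("original_model.h5")
    x = np.random.random_sample((1,) + model.input_shape[1:]).astype(np.float32)
    y_original = model.predict(x)

    # Run the converted TFLite model on the exact same input.
    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp['index'], x)
    interpreter.invoke()
    y_lite = interpreter.get_tensor(out['index'])

    # Expect near-equality: conversion introduces small float differences.
    print(np.allclose(y_original, y_lite, atol=1e-5))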

How to convert .pb to TFLite format?

为君一笑 posted on 2019-12-17 09:57:58
Question: I downloaded the retrained_graph.pb and retrained_labels.txt files of a model I trained in Azure Cognitive Services. Now I want to make an Android app using that model, and to do so I have to convert it to the TFLite format. I used toco and am getting the following error: ValueError: Invalid tensors 'input' were found. I am basically following this tutorial, had a problem at step 4, and directly copy-pasted the terminal code: https://heartbeat.fritz.ai/neural-networks-on-mobile-devices-with…
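That error usually means the graph simply has no tensor named 'input': the converter flags have to name the graph's actual input and output tensors, which a viewer such as Netron can reveal. A sketch using the TF 1.x Python converter; the names 'Placeholder' and 'final_result' are assumptions to be replaced with your graph's real ones:

    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file="retrained_graph.pb",
        input_arrays=["Placeholder"],    # assumed input tensor name
        output_arrays=["final_result"],  # assumed output tensor name
    )
    tflite_model = converter.convert()
    with open("retrained_graph.tflite", "wb") as f:
        f.write(tflite_model)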

Converting a fake-quantized tensorflow model (.pb) to a tensorflow lite model (.tflite) using toco failed

北战南征 posted on 2019-12-12 18:22:29
Question: I tried to follow the instructions in the TensorFlow quantization docs to generate a quantized TensorFlow Lite model. First, I used tf.contrib.quantize.create_training_graph() and tf.contrib.quantize.create_eval_graph() in my training process to insert fake-quantization nodes into the graph, and finally generated a frozen pb file (model.pb). Second, I used the following command to convert my fake-quantized TensorFlow model to a quantized TensorFlow Lite model:

    bazel-bin/tensorflow/contrib/lite/toco/toco \
      --input…
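The same conversion can also be expressed through the TF 1.x Python API, which is sometimes easier to debug than the bazel-built toco binary. A sketch: the array names and the (mean, std) input stats are assumptions that must match how the model was actually trained:

    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file="model.pb",
        input_arrays=["input"],    # assumed name
        output_arrays=["output"],  # assumed name
    )
    # Ask for a fully quantized uint8 model from the fake-quant graph.
    converter.inference_type = tf.uint8
    # (mean, std) describe how float inputs map onto uint8; values are assumptions.
    converter.quantized_input_stats = {"input": (128, 127)}
    tflite_model = converter.convert()
    with open("model_quant.tflite", "wb") as f:
        f.write(tflite_model)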

Tensorflow Lite toco --mean_values --std_values?

女生的网名这么多〃 posted on 2019-12-12 09:19:12
Question: So I have trained a TensorFlow model with fake quantization and froze it, with a .pb file as output. Now I want to feed this .pb file to TensorFlow Lite's toco for full quantization and get the .tflite file. I am using this TensorFlow example: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/experimental/micro/examples/micro_speech. The part I have a question about:

    bazel run tensorflow/lite/toco:toco -- \
      --input_file=/tmp/tiny_conv.pb --output_file=/tmp/tiny_conv.tflite \
      -…
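As background for those two flags: toco's convention is that --mean_values and --std_values describe how quantized input bytes map back to floats, roughly float_value = (quantized_value - mean_value) / std_value, so they can be derived from the float range the model expects. A sketch with an ad hoc helper (not part of toco):

    def toco_input_stats(float_min, float_max, levels=256):
        """Derive toco's std/mean for inputs in [float_min, float_max],
        assuming float = (quantized - mean) / std over `levels` steps."""
        std = (levels - 1) / (float_max - float_min)
        mean = -float_min * std
        return mean, std

    # Inputs in [0, 1]  -> mean 0.0,   std 255.0
    # Inputs in [-1, 1] -> mean 127.5, std 127.5
    print(toco_input_stats(0.0, 1.0))
    print(toco_input_stats(-1.0, 1.0))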

Tensorflow Lite: ResNet example model gave VERY poor result during validation with ImageNet

折月煮酒 posted on 2019-12-12 01:16:27
Question: I am studying TensorFlow Lite. I downloaded the ResNet frozen graph ResNet_V2_101 from https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/g3doc/models.md#image-classification-float-models, and then I followed https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/tutorials/post_training_quant.ipynb to convert this frozen graph to both a Lite model and a quantized Lite model:

    import pathlib
    import sys
    import tensorflow as tf
    from tensorflow…
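The notebook's conversion is post-training quantization; its core, sketched with the TF 1.x converter API (the file and array names for ResNet_V2_101 are assumptions, check the actual graph):

    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file="resnet_v2_101.pb",  # assumed filename
        input_arrays=["input"],             # assumed name
        output_arrays=["output"],           # assumed name
    )
    # Plain float TFLite model.
    tflite_model = converter.convert()

    # Post-training weight quantization, as in the linked notebook.
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_quant_model = converter.convert()

As for the very poor ImageNet score, a preprocessing mismatch is a common culprit independent of quantization: ResNet V2 graphs typically expect inputs scaled to [-1, 1], so validating with raw [0, 255] pixels would look broken even for the float model.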