tensorflow-lite

How to know which tensor to choose from the list of tensor names in the Graph?

冷暖自知 submitted on 2019-12-11 04:14:07
Question: I am trying to export a LinearClassifier to the tflite format, so I need to select a tensor from the list of tensor names in the Graph. For example, when exporting a DNNClassifier model, the following input and output tensors were chosen:

    input_tensor = sess.graph.get_tensor_by_name("dnn/input_from_feature_columns/input_layer/concat:0")
    input_tensor.set_shape([1, 4])
    out_tensor = sess.graph.get_tensor_by_name("dnn/logits/BiasAdd:0")
    out_tensor.set_shape([1, 3])

but for a LinearClassifier I don…
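One way to find the right names is simply to enumerate every tensor the graph produces. The sketch below uses a tiny hypothetical stand-in graph (the real one would be loaded from the exported LinearClassifier, whose tensors typically sit under a "linear/..." scope rather than "dnn/..."); it assumes a TF 2.x install, where v1-style graph building still works inside an explicit Graph context:

```python
import tensorflow as tf

# Hypothetical stand-in graph; in practice you would load the exported
# classifier's GraphDef instead of building ops by hand.
g = tf.Graph()
with g.as_default():
    x = tf.compat.v1.placeholder(tf.float32, [1, 4], name="input")
    w = tf.constant([[0.1, 0.2, 0.3]] * 4, tf.float32)
    y = tf.identity(tf.matmul(x, w), name="logits")

# Enumerate every tensor name in the graph; for a LinearClassifier, look
# for names under a "linear/..." scope rather than the "dnn/..." ones.
names = [t.name for op in g.get_operations() for t in op.outputs]
for n in names:
    print(n)
```

Scanning this list for the first placeholder-like tensor and the final pre-softmax tensor is usually how the input/output pair above was chosen.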

Tensorflow - h5 model to tflite conversion error

泄露秘密 submitted on 2019-12-11 03:12:28
Question: I did transfer learning using a pre-trained InceptionV3 model and saved the h5 model file. After that, I am able to make predictions. Now I want to convert the h5 model to a tflite file using the TFLiteConverter.convert() method, like this:

    converter = lite.TFLiteConverter.from_keras_model_file('keras.model.h5')
    tflite_model = converter.convert()

but I get this error:

    File "from_saved_model.py", line 28, in <module>
      tflite_model = converter.convert()
    File "C:\Anaconda3\lib\site-packages…
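Note that from_keras_model_file only exists in the TF 1.x converter API; on TF 2.x the usual route is to load the .h5 file back into a Keras model and call from_keras_model on the in-memory object. A minimal sketch with a tiny stand-in model (the real file in the question is 'keras.model.h5'; the file names below are hypothetical):

```python
import tensorflow as tf

# Tiny stand-in for the InceptionV3 transfer-learning model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(2),
])
model.save("keras_model_demo.h5")  # legacy HDF5 format, needs h5py

# TF 1.x:  lite.TFLiteConverter.from_keras_model_file('keras.model.h5')
# TF 2.x:  load the model, then convert the in-memory object.
loaded = tf.keras.models.load_model("keras_model_demo.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(loaded)
tflite_model = converter.convert()
with open("model_demo.tflite", "wb") as f:
    f.write(tflite_model)
```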

Error when I convert the frozen_pb file to tflite file using toco

本秂侑毒 submitted on 2019-12-10 00:08:33
Question: I use the MobileNet pre-trained model for object detection. I have the frozen_graph file, and I used a tool to find the input_arrays and output_arrays. This is my command:

    bazel-bin/tensorflow/contrib/lite/toco/toco \
      --input_file=$(pwd)/mobilenet_v1_1.0_224/frozen_graph.pb \
      --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE \
      --output_file=/tmp/mobilenet_v1_1.0_224.tflite --inference_type=FLOAT \
      --input_type=FLOAT --input_arrays=image_tensor \
      --output_arrays=detection_boxes,detection…
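The same conversion can be driven from Python, which often surfaces clearer errors than the bazel toco binary. The sketch below builds a toy frozen graph reusing the question's array names ("image_tensor", "detection_boxes"; the toy graph itself is hypothetical, a real detection graph has many more outputs) and converts it with the TF 2.x compat converter:

```python
import tensorflow as tf

# Toy stand-in for the object-detection frozen graph, reusing the names
# from the toco flags above.
g = tf.Graph()
with g.as_default():
    inp = tf.compat.v1.placeholder(tf.float32, [1, 4], name="image_tensor")
    out = tf.identity(inp * 2.0, name="detection_boxes")

tf.io.write_graph(g.as_graph_def(), ".", "frozen_graph_demo.pb",
                  as_text=False)

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    "frozen_graph_demo.pb",
    input_arrays=["image_tensor"],
    output_arrays=["detection_boxes"],
)
tflite_model = converter.convert()
```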

Error converting Facenet model .pb file to TFLITE format

孤者浪人 submitted on 2019-12-09 23:57:54
Question: I'm trying to convert a pre-trained frozen .pb based on Inception-ResNet, which I got from David Sandberg's GitHub, with the TensorFlow Lite Converter on Ubuntu, using the following command:

    /home/nils/.local/bin/tflite_convert --output_file=/home/nils/Documents/frozen.tflite \
      --graph_def_file=/home/nils/Documents/20180402-114759/20180402-114759.pb \
      --input_arrays=input --output_arrays=embeddings --input_shapes=1,160,160,3

However, I get the following error:

    2018-12-03 15:03:16.807431: I tensorflow/core…
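When the converter chokes on a graph whose input has an unknown batch dimension (common with frozen Facenet graphs), passing input_shapes explicitly can pin it down, mirroring the CLI flag --input_shapes=1,160,160,3. A sketch on a toy graph (the "input"/"embeddings" names mirror the command above; the graph and file name are hypothetical):

```python
import tensorflow as tf

# Toy graph with an unknown batch dimension, like the Facenet .pb.
g = tf.Graph()
with g.as_default():
    inp = tf.compat.v1.placeholder(tf.float32, [None, 8], name="input")
    out = tf.identity(inp * 1.5, name="embeddings")

tf.io.write_graph(g.as_graph_def(), ".", "facenet_demo.pb", as_text=False)

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    "facenet_demo.pb",
    input_arrays=["input"],
    output_arrays=["embeddings"],
    input_shapes={"input": [1, 8]},  # resolve the None batch dimension
)
tflite_model = converter.convert()
```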

Low Accuracy with static image on TFLite demo model

♀尐吖头ヾ submitted on 2019-12-08 03:34:47
Question: I'm trying the TFLite implementation for image classification using the MobileNet transfer-learning example from TensorFlow for Poets 2. I was able to successfully complete the transfer learning with the four flower samples in the codelab and got the screen below, which classifies a continuous stream of images. I need to classify the image after taking a picture, instead of the stream, and then take some action based on the result. Below is my approach: create a basic camera app…
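The continuous per-frame loop can be replaced by a single invoke on one preprocessed picture. The Python sketch below shows the one-shot pattern with a tiny stand-in classifier (hypothetical sizes; on Android the same steps map onto a single Interpreter.run call on the decoded bitmap's buffer):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in for the Poets-2 flower classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(6,)),
    tf.keras.layers.Dense(4, activation="softmax"),
])
interpreter = tf.lite.Interpreter(
    model_content=tf.lite.TFLiteConverter.from_keras_model(model).convert())
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# One "picture": a real bitmap would first be decoded, resized to the
# model's input size, and normalized the same way as during training.
image = np.random.rand(1, 6).astype(np.float32)
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()                        # single inference, no stream
scores = interpreter.get_tensor(out["index"])
top = int(np.argmax(scores))                # act on the best class here
```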

Converting .tflite to .pb

折月煮酒 submitted on 2019-12-08 03:16:45
Question: How can I convert a .tflite file (a serialized flatbuffer) to a .pb (frozen model)? The documentation only covers the one-way conversion. The use case: I have a model that was trained and converted to .tflite, but unfortunately I do not have the details of the model, and I would like to inspect the graph. How can I do that?

Answer 1: I don't think there is a way to restore a tflite file back to pb, since some information is lost during conversion. I found an indirect way to get a glimpse of what is inside…
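One such indirect way is to load the flatbuffer into the Python Interpreter and dump its tensor metadata; this does not rebuild a .pb, but it does reveal the tensor names, shapes, and dtypes of the graph. A sketch on a throwaway model (with a real file you would pass model_path="model.tflite" instead):

```python
import tensorflow as tf

# Build a throwaway .tflite so the sketch is self-contained.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(1),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the flatbuffer and list every tensor it contains.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
details = interpreter.get_tensor_details()
for d in details:
    print(d["index"], d["name"], d["shape"], d["dtype"])
```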

Tensorflow lite model is giving wrong output

好久不见. submitted on 2019-12-07 22:19:22
Question: I am developing a deep learning model that makes regression predictions. I created a tflite model, but its predictions are different from the original model and completely wrong. Here is my process. I trained my model with Keras:

    model = Sequential()
    model.add(Dense(100, input_dim=x.shape[1], activation='relu'))  # Hidden 1
    model.add(Dense(50, activation='relu'))  # Hidden 2
    model.add(Dense(1))  # Output
    model.compile(loss='mean_squared_error', optimizer='adam')
    model.fit(x, y, verbose=0, epochs=500)

And…
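A quick sanity check for this kind of report is to run the original Keras model and the converted interpreter on the same float32 batch and compare: with an unquantized model the two should agree to within float tolerance, so a large mismatch usually points at input dtype, shape, or preprocessing rather than the conversion itself. A sketch with a small stand-in network:

```python
import numpy as np
import tensorflow as tf

# Small stand-in for the regression network in the question.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(5,)),
    tf.keras.layers.Dense(10, activation="relu"),
    tf.keras.layers.Dense(1),
])
x = np.random.rand(1, 5).astype(np.float32)  # dtype must match the model
keras_pred = model.predict(x, verbose=0)

interpreter = tf.lite.Interpreter(
    model_content=tf.lite.TFLiteConverter.from_keras_model(model).convert())
interpreter.allocate_tensors()
interpreter.set_tensor(interpreter.get_input_details()[0]["index"], x)
interpreter.invoke()
tflite_pred = interpreter.get_tensor(
    interpreter.get_output_details()[0]["index"])

# For a float model the converted output should track the original.
match = np.allclose(keras_pred, tflite_pred, atol=1e-4)
```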

tensorflow-lite - using tflite Interpreter to get an image in the output

折月煮酒 submitted on 2019-12-06 15:50:08
Question: I am trying to follow the workflow of the TensorFlow-for-Poets-2 TFLite tutorial, https://codelabs.developers.google.com/codelabs/tensorflow-for-poets-2-tflite/#6, but instead of image classification I am trying to do style transfer. That means both the input and the output of my network are images (in the original example, the input is an image and the output is a list of scores). One of my many problems is getting the processed output image from the tflite inference: after I loaded…
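Reading an image (rather than a score list) out of the interpreter works the same way as the classification case: fetch the output tensor, drop the batch dimension, and rescale to uint8 pixels. A sketch with a tiny hypothetical conv net standing in for the style-transfer network:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in network whose output is an image; sigmoid keeps values
# in [0, 1], like a typical style-transfer head.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 3)),
    tf.keras.layers.Conv2D(3, 3, padding="same", activation="sigmoid"),
])
interpreter = tf.lite.Interpreter(
    model_content=tf.lite.TFLiteConverter.from_keras_model(model).convert())
interpreter.allocate_tensors()

image = np.random.rand(1, 8, 8, 3).astype(np.float32)
interpreter.set_tensor(interpreter.get_input_details()[0]["index"], image)
interpreter.invoke()

out = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
pixels = (out[0] * 255.0).astype(np.uint8)  # HxWxC uint8, ready to save
```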

Low Accuracy with static image on TFLite demo model

烂漫一生 submitted on 2019-12-06 15:32:41
I'm trying the TFLite implementation for image classification using the MobileNet transfer-learning example from TensorFlow for Poets 2. I was able to successfully complete the transfer learning with the four flower samples in the codelab and got the screen below, which classifies a continuous stream of images. I need to classify the image after taking a picture, instead of the stream, and then take some action based on the result. Below is my approach: create a basic camera app; take a picture and save it to storage; the URI of the image is saved, and then a drawable is created…

Tensorflow lite model is giving wrong output

岁酱吖の submitted on 2019-12-06 14:32:09
I am developing a deep learning model that makes regression predictions. I created a tflite model, but its predictions are different from the original model and completely wrong. Here is my process. I trained my model with Keras:

    model = Sequential()
    model.add(Dense(100, input_dim=x.shape[1], activation='relu'))  # Hidden 1
    model.add(Dense(50, activation='relu'))  # Hidden 2
    model.add(Dense(1))  # Output
    model.compile(loss='mean_squared_error', optimizer='adam')
    model.fit(x, y, verbose=0, epochs=500)

and saved my model as an h5 file:

    model.save("keras_model.h5")

Then I converted the h5 file to tflite format by…