Question
I worked on an eye-region localisation project and trained a model on my own custom dataset using the TensorFlow library.
Training produces .ckpt files (the model) and the results are acceptable. I converted this model to a frozen inference graph (.pb), tested the frozen model's accuracy on my webcam, and it works fine.
The problem is when I convert the .pb model to a TFLite model: I get very bad results in an Android application that loads it as an ML Kit (Firebase) custom model.
I have posted this issue on GitHub (the TensorFlow repo and the Firebase repo) GitHub link but I didn't get any answer; I really need to solve this problem.
- Result using the frozen inference graph (Python script): (screenshot)
- Result using the TFLite model (Android device): (screenshot)
Here is the relevant Java code (Android):
private void useInferenceResult(float[] probabilities) throws IOException {
    // [START mlkit_use_inference_result]
    // The model output is an interleaved array of coordinates: [x0, y0, x1, y1, ...]
    ArrayList<Point> listpoint = new ArrayList<>();

    double viewWidth = canvas.getWidth();
    double viewHeight = canvas.getHeight();
    double imageWidth = mutableBitmap.getWidth();
    double imageHeight = mutableBitmap.getHeight();
    Log.i("viewWidth", "viewWidth " + viewWidth);
    Log.i("viewHeight", "viewHeight " + viewHeight);
    Log.i("imageWidth", "imageWidth " + imageWidth);
    Log.i("imageHeight", "imageHeight " + imageHeight);

    // Scale factor used to fit the bitmap into the view (same formula as drawBitmap).
    double scale = Math.min(viewWidth / imageWidth, viewHeight / imageHeight);
    Log.i("Scale", "Scale " + scale);

    try {
        // Walk the output two values at a time, pairing each x with its y.
        for (int i = 0; i + 1 < probabilities.length; i += 2) {
            Log.i("MLKit", String.format("%1.8f", probabilities[i]));
            String x = String.format("%1.8f", probabilities[i]);
            String y = String.format("%1.8f", probabilities[i + 1]);
            listpoint.add(new Point(x, y));
        }
    } catch (Exception exc) {
        Log.e("Exception", "Error: " + exc);
    }

    for (int j = 0; j < listpoint.size(); j++) {
        try {
            // String.format uses the default locale, which can emit a decimal
            // comma; normalise it to a dot before parsing back to float.
            String xx = listpoint.get(j).getX().replace(",", ".");
            String yy = listpoint.get(j).getY().replace(",", ".");
            float xx1 = Float.parseFloat(xx);
            float yy1 = Float.parseFloat(yy);
            Log.i("Float results", "point_" + j + "(" + xx1 + ", " + yy1 + ")");
            // Map the normalised model coordinates onto the scaled 293x293 image.
            drawpoint(image2, xx1 * (float) scale * 293, yy1 * (float) scale * 293, 1);
        } catch (Exception esa) {
            Log.e("Exception", "Exception: " + esa);
            Toast.makeText(this, "Exception" + esa, Toast.LENGTH_SHORT).show();
        }
    }
}
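One thing worth noting about the code above: `String.format("%1.8f", ...)` formats with the device's default locale, so on a device set to a locale that uses a decimal comma the later `Float.parseFloat` would throw without the `replace(",", ".")` workaround. A minimal standalone sketch of the pitfall (not part of the app code; the values are made up):

```java
import java.util.Locale;

public class LocaleFormatDemo {
    public static void main(String[] args) {
        float value = 0.5f;

        // Default-locale formatting can produce a decimal comma (e.g. on a
        // French-locale device), which Float.parseFloat cannot parse.
        String french = String.format(Locale.FRANCE, "%1.8f", value);
        String us = String.format(Locale.US, "%1.8f", value);

        System.out.println(french); // decimal comma
        System.out.println(us);     // decimal dot

        // Safer options: format with an explicit Locale.US, or skip the
        // string round-trip entirely and keep the model outputs as floats.
        float parsed = Float.parseFloat(french.replace(",", "."));
        System.out.println(parsed);
    }
}
```

Passing the floats straight through (without formatting and re-parsing) removes the locale dependency altogether.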
// drawBitmap function: draws the bitmap scaled to fit the canvas
// (preserving aspect ratio) and returns the scale factor used.
private double drawBitmap(Canvas canvas) {
    double viewWidth = canvas.getWidth();
    double viewHeight = canvas.getHeight();
    double imageWidth = mutableBitmap.getWidth();
    double imageHeight = mutableBitmap.getHeight();
    double scale = Math.min(viewWidth / imageWidth, viewHeight / imageHeight);
    Rect destBounds = new Rect(0, 0, (int) (imageWidth * scale), (int) (imageHeight * scale));
    canvas.drawBitmap(mutableBitmap, null, destBounds, null);
    return scale;
}
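For reference, the letterbox scale that `drawBitmap` returns can be checked in isolation. This small sketch (with hypothetical dimensions, not taken from the question) shows how taking `Math.min` of the two ratios keeps the whole bitmap inside the view while preserving its aspect ratio:

```java
public class ScaleDemo {
    // Same formula as drawBitmap: fit the image into the view while
    // preserving the aspect ratio.
    static double fitScale(double viewW, double viewH, double imgW, double imgH) {
        return Math.min(viewW / imgW, viewH / imgH);
    }

    public static void main(String[] args) {
        // Hypothetical sizes: a 640x480 bitmap drawn into a 293x293 view.
        double scale = fitScale(293, 293, 640, 480);
        System.out.println("scale = " + scale);
        // The wider dimension (640) becomes the binding constraint, so the
        // drawn width fills the view exactly and the height is letterboxed.
        System.out.println("drawn size = " + (640 * scale) + " x " + (480 * scale));
    }
}
```

Using the larger of the two ratios instead would crop the bitmap; `Math.min` is what guarantees both dimensions fit.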
How can I solve this problem?
Source: https://stackoverflow.com/questions/57442372/tflite-prediction-is-totally-different-than-frozen-inference-graph-prediction