Running a TensorFlow model on Android

Submitted by 别等时光非礼了梦想 on 2019-12-02 17:37:48

After setting up an Android NDK in your WORKSPACE file, Bazel can cross-compile a .so for Android, like this:

cc_binary(
    name = "libfoo.so",
    srcs = ["foo.cc"],
    deps = [":bar"],
    linkstatic = 1,
    linkshared = 1,
)

$ bazel build foo:libfoo.so \
    --crosstool_top=//external:android/crosstool --cpu=armeabi-v7a \
    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain
$ file bazel-bin/foo/libfoo.so
bazel-bin/foo/libfoo.so: ELF 32-bit LSB  shared object, ARM, EABI5 version 1 (SYSV), dynamically linked (uses shared libs), not stripped

Bazel requires all of the Java app code to live under the top-level directory containing the WORKSPACE file (i.e., inside the TensorFlow repo).

When 0.1.4 is released (pushing it right now) and we have pushed some fixes to TensorFlow and Protobuf, you can start using the TensorFlow repo as a remote repository. After setting it up in your WORKSPACE file, you can then refer to TensorFlow rules using @tensorflow//foo/bar labels.
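For illustration, one way to reference a local TensorFlow clone as an external repository is a `local_repository` entry in your own app's WORKSPACE file. This is only a sketch — the path is a placeholder, and the exact setup depends on your Bazel version:

```python
# WORKSPACE of your own app (sketch; the path is a placeholder)
local_repository(
    name = "tensorflow",
    path = "/path/to/tensorflow",  # local clone of the TensorFlow repo
)
```

With an entry like this in place, build targets can depend on labels such as @tensorflow//foo/bar, as described above.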

git clone --recurse-submodules https://github.com/tensorflow/tensorflow.git

Note: --recurse-submodules is important to pull submodules.

Install Bazel from here. Bazel is the primary build system for TensorFlow. Next, edit the WORKSPACE file, which is in the root directory of the TensorFlow repo we cloned earlier.

# Uncomment and update the paths in these entries to build the Android demo.
#android_sdk_repository(
#    name = "androidsdk",
#    api_level = 23,
#    build_tools_version = "25.0.1",
#    # Replace with path to Android SDK on your system
#    path = "<PATH_TO_SDK>",
#)
#
#android_ndk_repository(
#    name="androidndk",
#    path="<PATH_TO_NDK>",
#    api_level=14)

Uncomment these entries and fill in your own SDK and NDK paths, like below:

android_sdk_repository(
    name = "androidsdk",
    api_level = 23,
    build_tools_version = "25.0.1",
    # Replace with path to Android SDK on your system
    path = "/Users/amitshekhar/Library/Android/sdk/",
)
android_ndk_repository(
    name = "androidndk",
    path = "/Users/amitshekhar/Downloads/android-ndk-r13/",
    api_level = 14,
)

Then build the .so file.

bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
   --crosstool_top=//external:android/crosstool \
   --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
   --cpu=armeabi-v7a

Replace armeabi-v7a with your desired target architecture. The library will be located at:

bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so

To build the Java counterpart:

bazel build //tensorflow/contrib/android:android_tensorflow_inference_java

We can find the JAR file at:

bazel-bin/tensorflow/contrib/android/libandroid_tensorflow_inference_java.jar

Now we have both the JAR and the .so file. I have already built both, so you can use them directly from the project.

Put libandroid_tensorflow_inference_java.jar in the libs folder, then right-click it and choose "Add as Library".

compile files('libs/libandroid_tensorflow_inference_java.jar')

Create a jniLibs folder inside the main directory and put libtensorflow_inference.so in the jniLibs/armeabi-v7a/ folder.
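This layout step can be scripted. A sketch, assuming your Android module is named `app` and the library was built for armeabi-v7a as shown earlier:

```shell
# Create the ABI-specific jniLibs directory inside the app module
mkdir -p app/src/main/jniLibs/armeabi-v7a

# Copy the native library produced by the Bazel build
cp bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so \
   app/src/main/jniLibs/armeabi-v7a/
```

If you build for other architectures, add a matching subfolder (e.g. jniLibs/arm64-v8a/) for each .so variant.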

Now we can call the TensorFlow Java API.

The TensorFlow Java API exposes all the required methods through the TensorFlowInferenceInterface class.

We call the TensorFlow Java API with the model path to load the model.
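A minimal sketch of loading and running a model via TensorFlowInferenceInterface. The model filename and the input/output node names here are hypothetical — replace them with the ones from your own frozen graph:

```java
import android.content.res.AssetManager;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class Classifier {
    // Hypothetical names — replace with your model file and graph node names.
    private static final String MODEL_FILE = "file:///android_asset/tensorflow_model.pb";
    private static final String INPUT_NODE = "input";
    private static final String OUTPUT_NODE = "output";

    private final TensorFlowInferenceInterface inferenceInterface;

    public Classifier(AssetManager assetManager) {
        // Loads the frozen graph from the APK's assets folder.
        inferenceInterface = new TensorFlowInferenceInterface(assetManager, MODEL_FILE);
    }

    public float[] predict(float[] input, int inputSize, int outputSize) {
        float[] output = new float[outputSize];
        // Feed the input tensor, run the graph, and fetch the result.
        inferenceInterface.feed(INPUT_NODE, input, 1, inputSize);
        inferenceInterface.run(new String[] { OUTPUT_NODE });
        inferenceInterface.fetch(OUTPUT_NODE, output);
        return output;
    }
}
```

The model .pb file goes in the app's assets folder so the AssetManager can load it at runtime.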

I have written a complete blog here.
