Running a Tensorflow model on Android


Question


I'm trying to figure out the workflow for training and deploying a Tensorflow model on Android. I'm aware of the other questions similar to this one on StackOverflow, but none of them seem to address the problems I've run into.

After studying the Android example from the Tensorflow repository, this is what I think the workflow should be:

  1. Build and train the TensorFlow model in Python.
  2. Create a new graph, and transfer all relevant nodes (i.e. not the nodes responsible for training) to this new graph. Trained weight variables are imported as constants so that the C++ API can read them.
  3. Develop the Android GUI in Java, using the native keyword to stub out a call to the TensorFlow model (a sketch of this Java side follows this list).
  4. Run javah to generate the C/C++ stub code for the Tensorflow native call.
  5. Fill in the stub by using the Tensorflow C++ API to read in and access the trained/serialized model.
  6. Use Bazel to build both the Java app and the native TensorFlow interface (as a .so file), and to generate the APK.
  7. Use adb to deploy the APK.
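
For steps 3 and 4, the Java side could look roughly like the minimal sketch below (the package and class name mirror the org_tensorflowtest_MyActivity naming that appears in the BUILD file later in this post; the classifyImage method, its signature, and the library name are hypothetical placeholders):

    package org.tensorflowtest;

    import android.app.Activity;
    import android.os.Bundle;

    public class MyActivity extends Activity {

        static {
            // Must match the cc_binary target, e.g. "Name" for libName.so.
            System.loadLibrary("Name");
        }

        // Step 3: stub out the call into the TensorFlow model with the native keyword.
        // Step 4: running javah -jni org.tensorflowtest.MyActivity generates
        // org_tensorflowtest_MyActivity.h, whose function is then implemented in
        // C++ against the TensorFlow C++ API (step 5).
        private native float[] classifyImage(float[] pixels);

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // GUI setup and calls to classifyImage(...) go here.
        }
    }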

    Step 6 is the problem. Bazel will happily compile a native (to OS X) .dylib that I can call from Java via JNI. Android Studio, likewise, will generate all the XML code that makes the GUI I want. However, Bazel wants all of the Java app code to be inside the 'WORKSPACE' top-level directory (in the TensorFlow repo), and Android Studio immediately links in all sorts of external libraries from the SDK to make GUIs (I know because my Bazel compile fails when it can't find these resources). The only way I can find to force Bazel to cross-compile a .so file is by making it a dependent rule of an Android rule. I would rather cross-compile a native lib directly than port my Android Studio code into a Bazel project.

    How do I square this? Bazel will supposedly compile Android code, but Android Studio generates code that Bazel can't compile. All the examples from Google simply give you code from a repo without any clue as to how it was generated. As far as I know, the XML that's part of an Android Studio app is supposed to be generated, not made by hand. If it can be made by hand, how do I avoid the need for all those external libraries?

    Maybe I'm getting the workflow wrong, or there's some aspect of Bazel/Android Studio that I'm not understanding. Any help appreciated.

Thanks!

Edit:

There were several things that I ended up doing that might have contributed to the library building successfully:

  1. I upgraded to the latest Bazel.
  2. I rebuilt TensorFlow from source.
  3. I implemented the Bazel BUILD file recommended in the answer below, with a few additions (taken from the Android example):

    cc_binary(
        name = "libName.so",
        srcs = [
            "org_tensorflowtest_MyActivity.cc",
            "org_tensorflowtest_MyActivity.h",
            "jni.h",
            "jni_md.h",
            ":libpthread.so",
        ],
        deps = ["//tensorflow/core:android_tensorflow_lib"],
        copts = [
            "-std=c++11",
            "-mfpu=neon",
            "-O2",
        ],
        linkopts = ["-llog -landroid -lm"],
        linkstatic = 1,
        linkshared = 1,
    )

    cc_binary(
        name = "libpthread.so",
        srcs = [],
        linkopts = ["-shared"],
        tags = [
            "manual",
            "notap",
        ],
    )
    

I haven't verified that this library can be loaded and used in Android yet; Android Studio 1.5 seems to be very finicky about acknowledging the presence of native libs.


Answer 1:


After setting up an Android NDK in your WORKSPACE file, Bazel can cross-compile a .so for Android, like this:

cc_binary(
    name = "libfoo.so",
    srcs = ["foo.cc"],
    deps = [":bar"],
    linkstatic = 1,
    linkshared = 1,
)

$ bazel build foo:libfoo.so \
    --crosstool_top=//external:android/crosstool --cpu=armeabi-v7a \
    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain
$ file bazel-bin/foo/libfoo.so
bazel-bin/foo/libfoo.so: ELF 32-bit LSB  shared object, ARM, EABI5 version 1 (SYSV), dynamically linked (uses shared libs), not stripped

"Bazel wants all of the java app code to be inside the 'WORKSPACE' top-level directory (in the Tensorflow repo)"

When Bazel 0.1.4 is released (it is being pushed right now) and we have pushed some fixes to TensorFlow and Protobuf, you will be able to start using the TensorFlow repo as a remote repository. After setting it up in your WORKSPACE file, you can then refer to TensorFlow rules using @tensorflow//foo/bar labels.




Answer 2:


git clone --recurse-submodules https://github.com/tensorflow/tensorflow.git

Note: --recurse-submodules is important to pull submodules.

Install Bazel by following its installation instructions; Bazel is the primary build system for TensorFlow. Next, edit the WORKSPACE file, which can be found in the root directory of the TensorFlow repository cloned earlier.

# Uncomment and update the paths in these entries to build the Android demo.
#android_sdk_repository(
#    name = "androidsdk",
#    api_level = 23,
#    build_tools_version = "25.0.1",
#    # Replace with path to Android SDK on your system
#    path = "<PATH_TO_SDK>",
#)
#
#android_ndk_repository(
#    name="androidndk",
#    path="<PATH_TO_NDK>",
#    api_level=14)

Uncomment these entries and fill in your own SDK and NDK paths, for example:

android_sdk_repository(
    name = "androidsdk",
    api_level = 23,
    build_tools_version = "25.0.1",
    # Replace with path to Android SDK on your system
    path = "/Users/amitshekhar/Library/Android/sdk/",
)
android_ndk_repository(
    name="androidndk",
    path="/Users/amitshekhar/Downloads/android-ndk-r13/",
    api_level=14)

Then build the .so file.

bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
   --crosstool_top=//external:android/crosstool \
   --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
   --cpu=armeabi-v7a

Replace armeabi-v7a with your desired target architecture. The library will be located at:

bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so

To build the Java counterpart:

bazel build //tensorflow/contrib/android:android_tensorflow_inference_java

We can find the JAR file at:

bazel-bin/tensorflow/contrib/android/libandroid_tensorflow_inference_java.jar

Now we have both the JAR and the .so file. I have already built both the .so file and the JAR, so you can use them directly from my project.

Put libandroid_tensorflow_inference_java.jar in the libs folder, then right-click it and add it as a library.

compile files('libs/libandroid_tensorflow_inference_java.jar')

Create a jniLibs folder in the main directory and put libtensorflow_inference.so in the jniLibs/armeabi-v7a/ folder.

Now we will be able to call the TensorFlow Java API.

The TensorFlow Java API exposes all the required methods through the class TensorFlowInferenceInterface.

We then call the TensorFlow Java API with the model path to load the model and run inference.
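
A minimal sketch of what that might look like, assuming the frozen graph (.pb) has been bundled in the app's assets folder; the model file name and the input/output node names below are hypothetical, and the exact TensorFlowInferenceInterface method names have varied between library versions:

import android.content.res.AssetManager;

import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

public class TensorFlowClassifier {
    // Hypothetical names; replace with your actual graph file and node names.
    private static final String MODEL_FILE = "file:///android_asset/frozen_model.pb";
    private static final String INPUT_NODE = "input";
    private static final String OUTPUT_NODE = "output";

    private final TensorFlowInferenceInterface inferenceInterface;

    public TensorFlowClassifier(AssetManager assetManager) {
        // Loads the frozen graph from assets; libtensorflow_inference.so must
        // be present under jniLibs/ as described above.
        inferenceInterface = new TensorFlowInferenceInterface(assetManager, MODEL_FILE);
    }

    public float[] predict(float[] input, int inputSize, int outputSize) {
        float[] output = new float[outputSize];
        // Feed the input tensor (shape [1, inputSize]), run the graph, and
        // fetch the output tensor into the result array.
        inferenceInterface.feed(INPUT_NODE, input, 1, inputSize);
        inferenceInterface.run(new String[] {OUTPUT_NODE});
        inferenceInterface.fetch(OUTPUT_NODE, output);
        return output;
    }
}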

I have written a complete blog post about this.



Source: https://stackoverflow.com/questions/34889605/running-a-tensorflow-model-on-android
