Problems with using the TensorFlow Lite C++ API in an Android Studio project

轮回少年 2020-12-09 00:35

I am currently working on a project involving neural networks. For this, I want to build an Android application which should use TensorFlow Lite to solve some object detection tasks.

1 Answer
  • 2020-12-09 01:12

    I just remembered that I asked this question a few weeks ago. In the meantime, I found a solution to the problem, and TensorflowLite is now nicely embedded into my Android project, where I do all the programming using the C++ API!

    The problem was that the TensorFlow shared library I built did not contain a soname. Because no name was found during the build, the full path to the library was recorded as its "name" instead. I noticed this while further investigating my native-lib.so (the NDK C++ library which is then loaded by the app) with the Linux `strings` tool: the path "/home/User/tensorflowtest/app/src/main/cpp/../../../../distribution/tensorflow/lib/x86/libtensorflowLite.so" had been embedded as the library to load from. Adding "-Wl,-soname=libtensorflowLite.so" to the link options in the BUILD file fixed this issue! You can find the whole rule I used below.
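    The effect of a missing soname can be reproduced with a small experiment, independent of TensorFlow. This is a sketch assuming `cc` and `readelf` (binutils) are available; `libdummy.so` is a made-up name used only for illustration:

```shell
# Build a tiny shared library WITHOUT an explicit soname.
echo 'int answer(void) { return 42; }' > dummy.c
cc -shared -fPIC dummy.c -o libdummy.so

# No SONAME entry in the dynamic section, so a consumer that links this
# library by path records the full path as the dependency name.
readelf -d libdummy.so | grep SONAME || echo "no SONAME entry"

# Rebuild WITH -Wl,-soname=..., as in the BUILD-file fix above.
cc -shared -fPIC -Wl,-soname=libdummy.so dummy.c -o libdummy.so
readelf -d libdummy.so | grep SONAME
```

    After the second build, `readelf -d` shows a `SONAME` entry of `libdummy.so`, which is what consumers will then record instead of the path.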

    As it was a pain to get everything set up due to the lack of explanations (it seems TensorflowLite is mostly used via the Java API on Android?), I want to give a short guide on how to use the C++ API of TensorflowLite in Android Studio (from within an Android NDK project).

    1. Build the library for your architecture

    To use the C++ API, you first need to build the TensorflowLite library. For this, add the following rule to the BUILD file in tensorflow/contrib/lite:

    cc_binary(
        name = "libtensorflowLite.so",
        linkopts = [
            "-shared",
            "-Wl,-soname=libtensorflowLite.so",
        ],
        linkshared = 1,
        copts = tflite_copts(),
        deps = [
            ":framework",
            "//tensorflow/contrib/lite/kernels:builtin_ops",
        ],
    )

    Note: With this, a shared library can be built! A static one might also work.

    Now you can build the library using

    bazel build //tensorflow/contrib/lite:libtensorflowLite.so --crosstool_top=//external:android/crosstool --cpu=arm64-v8a --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --cxxopt="-std=c++11"
    

    If you want to support multiple architectures, you will have to build the library several times and change the --cpu flag correspondingly.

    NOTE: This works fine at least for arm64-v8a and armeabi-v7a (I haven't tested it with MIPS, so that might work as well). However, on an x86 device I get the "atomic_store_8" error already addressed in this topic: https://github.com/tensorflow/tensorflow/issues/16589
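    Repeating the build per ABI can be scripted with a small loop. This is a sketch that only *prints* one bazel invocation per ABI so you can inspect them first; pipe the output to `sh` (or drop the helper and run the commands directly) to actually build:

```shell
# Emit the bazel build command for a given Android ABI.
build_cmd() {
    printf 'bazel build //tensorflow/contrib/lite:libtensorflowLite.so --crosstool_top=//external:android/crosstool --cpu=%s --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --cxxopt="-std=c++11"\n' "$1"
}

# One build per ABI you want to ship.
for CPU in arm64-v8a armeabi-v7a; do
    build_cmd "$CPU"
done
```

    Each resulting libtensorflowLite.so then goes into the matching `lib/<ABI>` directory described in step 2.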

    2. Add the library and the needed headers to be included in your Android Studio project

    Having built the library, you now need to make sure it is also linked into your application (more specifically: into your Android NDK library, which in my case is named "native-lib"). I will give a short overview of how to do this; if you need a more detailed explanation, you may refer to the GitHub link I provided in my initial question: https://github.com/googlesamples/android-ndk/tree/840858984e1bb8a7fab37c1b7c571efbe7d6eb75/hello-libs

    2.1. In your Android Studio Project, open the CMakeLists.txt

    2.2. Add the following:

        # This will create a new "variable" holding the path to a directory
        # where we will put our library and header files.
        # Change this to your needs
        set(distribution_DIR ${CMAKE_SOURCE_DIR}/distribution)
    
        # This states that there exists a shared library called libtensorflowLite
        # which will be imported (means it is not built with the rest of the project!)
        add_library(libtensorflowLite SHARED IMPORTED)
    
        # This indicates where the libtensorflowLite.so for each architecture is found relative to our distribution directory
        set_target_properties(libtensorflowLite PROPERTIES IMPORTED_LOCATION
            ${distribution_DIR}/lib/${ANDROID_ABI}/libtensorflowLite.so)
    
        # This indicates where the header files are found relative to our distribution dir
        target_include_directories(native-lib PRIVATE
                           ${distribution_DIR}/include)
    
        # Finally, we make sure our libtensorflowLite.so is linked to our native-lib and loaded during runtime 
        target_link_libraries( # Specifies the target library.
                       native-lib
                       libtensorflowLite
                       # Links the target library to the log library
                       # included in the NDK.
                       ${log-lib} )
    

    2.3. Open the build.gradle for your Module: App (not the project one!)

    2.4. Make sure the library will be packed into your APK

    Add this inside the Android section:

        sourceSets {
            main {
                // let gradle pack the shared library into apk
                jni.srcDirs = []
                jniLibs.srcDirs = ['distribution/lib']
            }
        }
    

    You may have to edit the path according to your needs: the files here will be packed into your .apk inside the lib directory.

    3. Include flatbuffers

    TensorflowLite uses the flatbuffers serialization library. I guess this is added automatically if you build your project using bazel, but this is not the case when using Android Studio. You could, of course, add flatbuffers as a static or shared library instead. However, for me it was easiest to let flatbuffers compile each time with the rest of my app (it is not that big): I copied all of the flatbuffers *.cpp source files to my project and added them to the CMakeLists.
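    One way to wire this up in the CMakeLists from step 2.2 — a sketch assuming the flatbuffers *.cpp files were copied into a `flatbuffers-src` directory next to native-lib.cpp (the directory name is made up; adjust it to wherever you put the sources):

```cmake
# Hypothetical layout: flatbuffers sources copied to src/main/cpp/flatbuffers-src
file(GLOB FLATBUFFERS_SRCS ${CMAKE_SOURCE_DIR}/flatbuffers-src/*.cpp)

# Compile them as part of native-lib, so no separate flatbuffers library is needed
add_library(native-lib SHARED
            native-lib.cpp
            ${FLATBUFFERS_SRCS})
```

    If your CMakeLists already has an add_library(native-lib ...) line, append the globbed sources to it rather than declaring the target twice.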

    4. Copy the needed headers for TensorflowLite and flatbuffers

    In 3. I just copied the cpp files to my project. However, the header files need to be located in the directory we set in target_include_directories in step 2.2.

    So go ahead and copy all of the flatbuffers *.h files (from the flatbuffers repository) to this directory. Next, from the TensorflowLite repository, you need all header files inside the tensorflow/contrib/lite directory. However, you should keep the folder structure intact.

    For me it looks like this:

    • distribution
      • lib
        • arm64-v8a
          • libtensorflowLite
        • armeabi-v7a
          • libtensorflowLite
      • include
        • flatbuffers
        • tensorflow
          • contrib
            • lite
              • kernels
              • nnapi
              • schema
              • tools

    So, if I haven't forgotten anything, everything should be set up correctly by now! Hopefully this helped and it works for you as it did for me ;)

    Best regards,

    Martin
