This question is about how to run a trained model on an Android device without converting it to TF Lite and without relying on an external service.
I don't own the model and cannot modify it; I only have the trained SavedModel files.
The device is offline and must embed the trained model; no connection to an external server is possible.
TensorFlow Lite is not an option, since TF Lite doesn't support 5D tensors: https://github.com/tensorflow/tensorflow/issues/56946
For testing, I will use the basic model I provided in the TensorFlow issue above.
I found this blog article, but haven't managed to make it work yet: https://medium.com/@vladislavsd/undocumented-tensorflow-c-api-b527c0b4ef6
Do you know of any up-to-date solution for loading the model from a Java or C++ library on Android?
TensorFlow provides no example on their GitHub: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android
I succeeded in deploying my trained model with 5D tensors on the Android emulator.
To do that, I converted my model to ONNX using the tf2onnx converter: https://github.com/onnx/tensorflow-onnx
python -m tf2onnx.convert --saved-model tensorflow-model-path --output model.onnx
Then I created a C++ library that loads the converted ONNX model and runs inference on it with ONNX Runtime.
To copy the model asset to the phone's storage, I followed this answer: https://stackoverflow.com/a/69941051/12851157
You can find ONNX Runtime C++ samples here: https://github.com/microsoft/onnxruntime-inference-examples/tree/main/c_cxx
And finally, I integrated the C++ library into the Android app following this sample: https://github.com/android/ndk-samples/tree/master/hello-libs
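Following the hello-libs pattern, the CMake side of the integration mostly amounts to declaring the prebuilt ONNX Runtime shared library as an imported target and linking the JNI wrapper against it. A sketch, with hypothetical paths and target names that depend on how the ONNX Runtime Android package is unpacked:

```cmake
cmake_minimum_required(VERSION 3.22.1)
project(onnxdemo)

# Native lib wrapping ONNX Runtime, loaded from Java via JNI
add_library(onnx_infer SHARED onnx_infer.cpp)

# Prebuilt ONNX Runtime for Android -- ORT_DIR is a hypothetical path to
# the unpacked package (headers + one libonnxruntime.so per ABI)
set(ORT_DIR ${CMAKE_SOURCE_DIR}/../onnxruntime-android)
add_library(onnxruntime SHARED IMPORTED)
set_target_properties(onnxruntime PROPERTIES
    IMPORTED_LOCATION ${ORT_DIR}/jni/${ANDROID_ABI}/libonnxruntime.so)

target_include_directories(onnx_infer PRIVATE ${ORT_DIR}/headers)
target_link_libraries(onnx_infer onnxruntime log)
```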
If I have enough time, I will try to use the TF API.
Answered By – Thibault