How to deploy a TensorFlow model on Android without TF Lite and without using a server


This question is about how to run a trained model on an Android device without converting it to TF Lite and without relying on an external service.

I don’t own the model and cannot modify it; I only have the trained SavedModel files.

The device is offline and must embed the trained model; no connection to an external server is possible.

TensorFlow Lite is not an option, since TF Lite doesn’t support 5D tensors:

For my tests, I will use the basic model I provided in the TensorFlow issue linked above.

I found this blog article, but haven’t managed to make it work yet:

Do you know of any up-to-date solution for loading the model from a Java or C++ library on Android?

TensorFlow doesn’t provide any example of this on their GitHub:


I succeeded in deploying my trained model (which uses 5D tensors) on the Android emulator.

To do that, I converted my model from TensorFlow to ONNX using tf2onnx:

python -m tf2onnx.convert --saved-model tensorflow-model-path --output model.onnx

Then I created a C++ library that loads the converted ONNX model and runs inference on it.
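The answer doesn’t show the loading code itself, so here is a minimal sketch of what such a library could look like with the ONNX Runtime C++ API. The model path, the 5D input shape, and the tensor names "input"/"output" are assumptions for illustration; real names must be read from the converted model.

```cpp
#include <onnxruntime_cxx_api.h>
#include <array>
#include <cstdio>
#include <vector>

int main() {
  // One environment and session per process; the path would point at the
  // model file copied onto the device (assumed location).
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "demo");
  Ort::SessionOptions opts;
  Ort::Session session(env, "/data/local/tmp/model.onnx", opts);

  // Assumed 5D input shape (batch, depth, height, width, channels).
  std::array<int64_t, 5> shape{1, 4, 32, 32, 3};
  std::vector<float> input(1 * 4 * 32 * 32 * 3, 0.0f);

  Ort::MemoryInfo mem =
      Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  Ort::Value tensor = Ort::Value::CreateTensor<float>(
      mem, input.data(), input.size(), shape.data(), shape.size());

  // Placeholder tensor names; query the session for the real ones.
  const char* in_names[] = {"input"};
  const char* out_names[] = {"output"};
  auto outputs = session.Run(Ort::RunOptions{nullptr},
                             in_names, &tensor, 1, out_names, 1);
  std::printf("got %zu output(s)\n", outputs.size());
  return 0;
}
```

ONNX Runtime supports 5D tensors out of the box, which is what makes this route viable where TF Lite fails.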

To copy the model asset to the phone’s storage, I followed this topic:

You can find ONNX samples here:

Finally, I integrated the C++ library into the Android app like this:
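The integration link isn’t reproduced here; a common pattern is to expose the C++ inference code through a small JNI bridge built with the NDK. A sketch, where the Java class `com.example.app.Inference`, its native method `run`, and the helper `run_model` are all hypothetical names:

```cpp
#include <jni.h>
#include <cstddef>
#include <vector>

// Hypothetical wrapper around the ONNX Runtime session shown earlier;
// fills *out_len and returns a pointer to the output values.
extern float* run_model(const float* data, std::size_t len, std::size_t* out_len);

// Matches a Java declaration: public native float[] run(float[] input);
extern "C" JNIEXPORT jfloatArray JNICALL
Java_com_example_app_Inference_run(JNIEnv* env, jobject /*thiz*/,
                                   jfloatArray input) {
  // Copy the Java float[] into a native buffer.
  jsize len = env->GetArrayLength(input);
  std::vector<float> buf(static_cast<std::size_t>(len));
  env->GetFloatArrayRegion(input, 0, len, buf.data());

  std::size_t out_len = 0;
  float* out = run_model(buf.data(), buf.size(), &out_len);

  // Copy the native result back into a new Java float[].
  jfloatArray result = env->NewFloatArray(static_cast<jsize>(out_len));
  env->SetFloatArrayRegion(result, 0, static_cast<jsize>(out_len), out);
  return result;
}
```

On the Java side, the app would load the library with `System.loadLibrary` and call `run` with the flattened input tensor.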

If I have enough time, I will try to use the TF API.

Answered By – Thibault

This answer, collected from Stack Overflow, is licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
