
Android Nuggets 5: Machine Learning on Android

Android supports on-device ML through ML Kit and TensorFlow Lite (TFLite). Note that Android supports inference, not training: you cannot train a model on an Android device. Instead, you take a pre-trained model and deploy it on the device to make inferences. The inference could serve a computer vision use case or a natural language use case.

Android developers can create and train their own custom model using TensorFlow and then optimise it to run with TensorFlow Lite on Android. The resulting model is generally only a few MB in size. The advantage of deploying a model on device is that inference is much faster than calling a cloud API for the same task.
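As a sketch of what on-device inference looks like with the TensorFlow Lite Interpreter API (the asset name "model.tflite" and the tensor shapes below are assumptions; substitute your own model's):

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-map the model bundled in the app's assets folder.
// "model.tflite" is a placeholder name for your own converted model.
fun loadModel(context: Context, assetName: String = "model.tflite"): MappedByteBuffer {
    val fd = context.assets.openFd(assetName)
    FileInputStream(fd.fileDescriptor).channel.use { channel ->
        return channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
    }
}

fun classify(context: Context): FloatArray {
    val interpreter = Interpreter(loadModel(context))
    // Shapes assumed here: a 1x224x224x3 float image in, 1000 class scores out.
    val input = Array(1) { Array(224) { Array(224) { FloatArray(3) } } }
    val output = Array(1) { FloatArray(1000) }
    interpreter.run(input, output)  // single-input, single-output inference
    interpreter.close()
    return output[0]
}
```

Because the model is memory-mapped from assets and runs in-process, there is no network round trip at all, which is where the latency advantage over a cloud API comes from.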

The other way to use ML models is through ML Kit. ML Kit ships with its own pre-trained models that are optimised for Android devices, for example the object detection and tracking model. These models have very high accuracy because they are deep networks trained on huge datasets, and they can be deployed and used by an application directly. ML Kit also allows you to supply your own customised and optimised model for the features it supports, and you can take an existing ML Kit model, apply transfer learning, and customise it for your needs. Under the hood, ML Kit uses TFLite. It supports vision and language processing use cases such as barcode scanning, image labelling, text recognition, language detection, smart replies, and on-device translation.
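For instance, labelling an image with ML Kit's default bundled model takes only a few lines. This is a sketch assuming the `com.google.mlkit:image-labeling` dependency is on the classpath:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Label a bitmap with ML Kit's pre-trained on-device model.
fun labelImage(bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            // Each label carries the detected text and a confidence score.
            labels.forEach { println("${it.text}: ${it.confidence}") }
        }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```

Note that the call is asynchronous: ML Kit returns a Task, so results arrive in the success listener rather than as a return value.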

Android also provides the Neural Networks API (NNAPI), which can help choose the right hardware for running your model. For example, it may decide to use a GPU or a dedicated accelerator instead of the usual CPU, and hence accelerate your model's performance. TFLite ships with an NNAPI delegate that hands this hardware selection over to the system.
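Enabling the delegate is a one-line option when creating the interpreter. In this sketch, `modelBuffer` stands for a model you have already loaded:

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.nio.MappedByteBuffer

// Hand tensor execution to NNAPI, which picks the best available
// chip (GPU, DSP, NPU) and falls back to the CPU otherwise.
fun interpreterWithNnApi(modelBuffer: MappedByteBuffer): Interpreter {
    val delegate = NnApiDelegate()
    val options = Interpreter.Options().addDelegate(delegate)
    return Interpreter(modelBuffer, options)
}
```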

Since Android Studio 4.1, the Android team has made deploying custom TFLite models much easier using ML Model Binding. You simply import your TFLite model via File > New > Other > TensorFlow Lite Model and select the .tflite file you want to use in your application. This imports the model into your src/main/ml folder, generates the bindings needed to consume it, and exposes high-level APIs for performing inference. While importing the model, Android Studio shows an overview page highlighting its inputs and outputs, along with sample Java and Kotlin code for consuming the generated high-level APIs. All of this is possible thanks to the metadata stored in TFLite models.
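Consuming the generated wrapper then looks roughly like this. This is a sketch: the class name `MobilenetV1` and its `TensorImage`-based `process()` signature depend entirely on the model you import and its metadata:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage

// "MobilenetV1" stands for whatever class Android Studio generated
// from the .tflite file imported under src/main/ml.
fun topLabel(context: Context, bitmap: Bitmap): String {
    val model = MobilenetV1.newInstance(context)
    val outputs = model.process(TensorImage.fromBitmap(bitmap))
    // With metadata present, outputs expose labelled probabilities directly.
    val best = outputs.probabilityAsCategoryList.maxByOrNull { it.score }
    model.close()
    return best?.label ?: "unknown"
}
```

Compared with the raw Interpreter API, the generated bindings handle image conversion and label mapping for you, which is the main convenience ML Model Binding provides.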

(Screenshot: the right-click menu used to access the TensorFlow Lite model import option.)

Android also supports federated learning. Federated learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on the device. Google now uses federated learning on Android to reduce "Hey Google" mis-activations and misses, allowing it to improve its machine learning models without sending any raw data to Google's servers.
