1. Overview
Welcome to the Text Classification with TensorFlow Lite and Firebase codelab. In this codelab you'll learn how to use TensorFlow Lite and Firebase to train and deploy a text classification model to your app. This codelab is based on this TensorFlow Lite example.
Text classification is the process of assigning tags or categories to text according to its content. It's one of the fundamental tasks in Natural Language Processing (NLP) with broad applications such as sentiment analysis, topic labeling, spam detection, and intent detection.
Sentiment analysis is the interpretation and classification of emotions (positive, negative and neutral) within text data using text analysis techniques. Sentiment analysis allows businesses to identify customer sentiment toward products, brands or services in online conversations and feedback.
This tutorial shows how to build a machine learning model for sentiment analysis, in particular classifying text as positive or negative. This is an example of binary—or two-class—classification, an important and widely applicable kind of machine learning problem.
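To make the binary setup concrete, here is a minimal, illustrative sketch (not part of the codelab's code): a binary classifier ultimately assigns one of two labels, here by thresholding a hypothetical positive-sentiment score.

```java
public class BinaryDemo {
    // Illustrative only: a binary classifier assigns one of two labels.
    // Here a hypothetical positive-sentiment score in [0, 1] is thresholded at 0.5.
    static String classify(double positiveScore) {
        return positiveScore >= 0.5 ? "positive" : "negative";
    }

    public static void main(String[] args) {
        System.out.println(classify(0.91)); // prints "positive"
        System.out.println(classify(0.12)); // prints "negative"
    }
}
```

The real model you train later produces a score per class rather than a single hard-coded threshold, but the decision it supports has this shape.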
What you'll learn
- Train a TF Lite sentiment analysis model with TF Lite Model Maker
- Deploy TF Lite models to Firebase ML and access them from your app
- Integrate TF Lite sentiment analysis models into your app using the TF Lite Task Library
What you'll need
- Latest Android Studio version.
- Sample code.
- A test device with Android 5.0+ and Google Play services 9.8 or later, or an emulator with Google Play services 9.8 or later
- If using a device, a connection cable.
2. Get the sample code
Clone the GitHub repository from the command line.
$ git clone https://github.com/FirebaseExtended/codelab-textclassification-android.git
If you don't have git installed, you can also download the sample project from its GitHub page or by clicking this link.
3. Import the starter app
From Android Studio, select the codelab-textclassification-android-master directory from the sample code download (File > Open > .../codelab-textclassification-android-master/start).
You should now have the start project open in Android Studio.
4. Run the starter app
Now that you have imported the project into Android Studio, you are ready to run the app for the first time. Connect your Android device, and click Run in the Android Studio toolbar.
The app should launch on your device. It only contains a simple UI that makes it easy to integrate and test text classification models in the next steps. At this point, if you try predicting sentiments, the app will only return some dummy results.
5. Create Firebase console project
Add Firebase to the project
- Go to the Firebase console.
- Select Add project.
- Select or enter a Project name.
- Follow the remaining setup steps in the Firebase console, then click Create project (or Add Firebase, if you're using an existing Google project).
6. Add Firebase to the app
- From the overview screen of your new project, click the Android icon to launch the setup workflow.
- Enter the codelab's package name:
org.tensorflow.lite.codelabs.textclassification
Add google-services.json file to your app
After adding the package name and selecting Register, click Download google-services.json to obtain your Firebase Android config file, then copy the google-services.json file into the app directory in your project.
Add google-services plugin to your app
Follow the instructions in the Firebase console to update the build.gradle.kts files and add Firebase to your app.
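For reference, a typical Kotlin DSL setup looks like the following sketch. The plugin version shown is an assumption for illustration; use whatever version the Firebase console suggests for your project.

```kotlin
// Project-level build.gradle.kts
plugins {
    // The version number here is illustrative; use the one the console shows.
    id("com.google.gms.google-services") version "4.3.15" apply false
}

// app/build.gradle.kts
plugins {
    id("com.android.application")
    id("com.google.gms.google-services")
}
```

Applying the plugin in the app module is what makes the google-services.json file take effect at build time.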
The google-services plugin uses the google-services.json file to configure your application to use Firebase.
Sync your project with Gradle files
To be sure that all dependencies are available to your app, you should sync your project with Gradle files at this point. Select File > Sync Project with Gradle Files from the Android Studio menu bar.
7. Run the app with Firebase
Now that you have configured the google-services plugin with your JSON file, you are ready to run the app with Firebase. Connect your Android device, and click Run in the Android Studio toolbar.
The app should launch on your device. At this point, your app should still build successfully.
8. Train a sentiment analysis model
We will use TensorFlow Lite Model Maker to train a text classification model that predicts the sentiment of a given text.
This step is presented as a Python notebook that you can open in Google Colab. You can choose Runtime > Run all to execute all of the notebook at once.
Open in Colab
After finishing this step, you will have a TensorFlow Lite sentiment analysis model that is ready for deployment to a mobile app.
9. Deploy a model to Firebase ML
Deploying a model to Firebase ML is useful for two main reasons:
- We can keep the app install size small and only download the model if needed
- The model can be updated regularly and with a different release cycle than the entire app
The model can be deployed either via the console, or programmatically, using the Firebase Admin SDK. In this step we will deploy via the console.
First, open the Firebase console and click Machine Learning in the left navigation panel. Click Get started if you are opening this page for the first time. Then navigate to the Custom tab and click the Add model button.
When prompted, name the model sentiment_analysis and upload the file that you downloaded from Colab in the previous step.
10. Download model from Firebase ML
Choosing when to download the remote model from Firebase into your app can be tricky, since TFLite models can grow relatively large. Ideally we want to avoid loading the model immediately when the app launches: if the model is used for only one feature and the user never uses that feature, we will have downloaded a significant amount of data for no reason. We can also set download options, such as only fetching models when connected to Wi-Fi. If you want to ensure that the model is available even without a network connection, you should also bundle it with the app as a backup.
For the sake of simplicity, we'll remove the default bundled model and always download a model from Firebase when the app starts for the first time. This way, when running sentiment analysis, you can be sure that the inference runs with the model provided by Firebase.
In the app/build.gradle.kts file, add the Firebase Machine Learning dependency.
app/build.gradle.kts
Find this comment:
// TODO 1: Add Firebase ML dependency
Then add:
implementation(platform("com.google.firebase:firebase-bom:32.0.0"))
implementation("com.google.firebase:firebase-ml-modeldownloader:24.1.2")
When asked by Android Studio to sync your project, choose Sync Now.
Then let's add some code to download the model from Firebase.
MainActivity.java
Find this comment:
// TODO 2: Implement a method to download TFLite model from Firebase
Then add:
/** Download model from Firebase ML. */
private synchronized void downloadModel(String modelName) {
    CustomModelDownloadConditions conditions = new CustomModelDownloadConditions.Builder()
            .requireWifi()
            .build();
    FirebaseModelDownloader.getInstance()
            .getModel(modelName, DownloadType.LOCAL_MODEL, conditions)
            .addOnSuccessListener(model -> {
                try {
                    // TODO 6: Initialize a TextClassifier with the downloaded model
                    predictButton.setEnabled(true);
                } catch (IOException e) {
                    Log.e(TAG, "Failed to initialize the model.", e);
                    Toast.makeText(
                            MainActivity.this,
                            "Model initialization failed.",
                            Toast.LENGTH_LONG)
                            .show();
                    predictButton.setEnabled(false);
                }
            })
            .addOnFailureListener(e -> {
                Log.e(TAG, "Failed to download the model.", e);
                Toast.makeText(
                        MainActivity.this,
                        "Model download failed, please check your connection.",
                        Toast.LENGTH_LONG)
                        .show();
            });
}
Next, call the downloadModel method in the activity's onCreate method.
MainActivity.java
Find this comment:
// TODO 3: Call the method to download TFLite model
Then add:
downloadModel("sentiment_analysis");
11. Integrate the model in your app
TensorFlow Lite Task Library helps you integrate TensorFlow Lite models into your app with just a few lines of code. We will initialize an NLClassifier instance using the TensorFlow Lite model downloaded from Firebase, then use it to classify the text input from the app users and show the result on the UI.
Add the dependency
Go to the app's Gradle file and add TensorFlow Lite Task Library (Text) in the app's dependencies.
app/build.gradle.kts
Find this comment:
// TODO 4: Add TFLite Task API (Text) dependency
Then add:
implementation("org.tensorflow:tensorflow-lite-task-text:0.3.0")
When asked by Android Studio to sync your project, choose Sync Now.
Initialize a text classifier
Then we will load the sentiment analysis model downloaded from Firebase using the Task Library's NLClassifier.
MainActivity.java
Let's declare an NLClassifier instance variable. Find this comment:
// TODO 5: Define a NLClassifier variable
Then add:
private NLClassifier textClassifier;
Initialize the textClassifier variable with the sentiment analysis model downloaded from Firebase. Find this comment:
// TODO 6: Initialize a TextClassifier with the downloaded model
Then add:
textClassifier = NLClassifier.createFromFile(model.getFile());
Classify text
Once the textClassifier instance has been set up, you can run sentiment analysis with a single method call.
MainActivity.java
Find this comment:
// TODO 7: Run sentiment analysis on the input text
Then add:
List<Category> results = textClassifier.classify(text);
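The classify call returns one Category (a label plus a confidence score) per class. If your app only needs the single most likely sentiment, you can pick the highest-scoring entry. The sketch below runs on the plain JVM; its Category class is a hypothetical stand-in for the Task Library type, not codelab code.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class TopCategoryDemo {
    // Hypothetical stand-in for the Task Library's Category (label + score).
    static class Category {
        final String label;
        final float score;
        Category(String label, float score) { this.label = label; this.score = score; }
    }

    // Picks the highest-scoring label, as an app might display it.
    static String topLabel(List<Category> results) {
        return results.stream()
                .max(Comparator.comparingDouble(c -> c.score))
                .map(c -> c.label)
                .orElse("unknown");
    }

    public static void main(String[] args) {
        List<Category> results = Arrays.asList(
                new Category("negative", 0.13f),
                new Category("positive", 0.87f));
        System.out.println(topLabel(results)); // prints "positive"
    }
}
```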
Implement post-processing
Finally, we will convert the output of the model to a descriptive text to show on the screen.
MainActivity.java
Find this comment:
// TODO 8: Convert the result to a human-readable text
Remove the code that generates the dummy result text:
String textToShow = "Dummy classification result.\n";
Then add:
String textToShow = "Input: " + text + "\nOutput:\n";
for (int i = 0; i < results.size(); i++) {
    Category result = results.get(i);
    textToShow += String.format(" %s: %s\n", result.getLabel(),
            result.getScore());
}
textToShow += "---------\n";
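As a sanity check of the formatting above, here is a standalone version you can run on the plain JVM. The Result class is a hypothetical stand-in for the Task Library's Category; the loop mirrors the codelab's post-processing.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

public class FormatDemo {
    // Hypothetical stand-in for the Task Library's Category (label + score).
    static class Result {
        final String label;
        final float score;
        Result(String label, float score) { this.label = label; this.score = score; }
    }

    // Mirrors the codelab's post-processing loop.
    static String format(String input, List<Result> results) {
        StringBuilder textToShow = new StringBuilder("Input: " + input + "\nOutput:\n");
        for (Result r : results) {
            textToShow.append(String.format(Locale.US, " %s: %s\n", r.label, r.score));
        }
        textToShow.append("---------\n");
        return textToShow.toString();
    }

    public static void main(String[] args) {
        System.out.print(format("Great movie!", Arrays.asList(
                new Result("negative", 0.02f),
                new Result("positive", 0.98f))));
    }
}
```

A StringBuilder is used here instead of string concatenation in a loop; either works, but StringBuilder is the more idiomatic Java choice.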
12. Run the final app
You have integrated the sentiment analysis model into the app, so let's test it. Connect your Android device, and click Run in the Android Studio toolbar.
The app should be able to correctly predict the sentiment of the movie review that you enter.
13. Power up the app with more Firebase features
Besides hosting your TFLite models, Firebase provides several other features to power up your machine learning use cases:
- Firebase Performance Monitoring to measure your model inference speed running on users' device.
- Firebase Analytics to measure how well your model performs in production by measuring user reactions.
- Firebase A/B Testing to test multiple versions of your model. Remember that we trained two versions of our TFLite model earlier? A/B testing is a good way to find out which version performs better in production!
To learn more about how to leverage these features in your app, check out the codelabs below:
14. Congratulations!
In this codelab, you learned how to train a sentiment analysis TFLite model and deploy it to your mobile app using Firebase. To learn more about TFLite and Firebase, take a look at other TFLite samples and the Firebase getting started guides.
What we've covered
- TensorFlow Lite
- Firebase ML
Next Steps
- Measure your model inference speed with Firebase Performance Monitoring.
- Deploy the model from Colab directly to Firebase via the Firebase ML Model Management API.
- Add a mechanism that lets users give feedback on the prediction results, and use Firebase Analytics to track that feedback.
- A/B test the Average Word Vector model and the MobileBERT model with Firebase A/B testing.
Learn More
- Firebase Machine Learning documentation
- TensorFlow Lite documentation
- Measure app performance with Firebase
- A/B Testing models with Firebase