This guide shows you how to get started making calls to the Vertex AI Gemini API directly from your app using the Vertex AI in Firebase SDK for your chosen platform.
Optionally experiment with an alternative "Google AI" version of the Gemini API
Get free-of-charge access (within limits and where available) using Google AI Studio and the Google AI client SDKs. In mobile and web apps, these SDKs should be used for prototyping only. After you're familiar with how the Gemini API works, migrate to the Vertex AI in Firebase SDKs (this documentation), which offer many additional features important for mobile and web apps, like protecting the API from abuse using Firebase App Check and support for large media files in requests.
Optionally call the Vertex AI Gemini API server-side (like with Python, Node.js, or Go)
Use the server-side Vertex AI SDKs, Firebase Genkit, or Firebase Extensions for the Gemini API.
Prerequisites
This guide assumes that you're familiar with using Android Studio to develop apps for Android.
Make sure that your development environment and Android app meet the following requirements:
- Android Studio (latest version)
- Your Android app must target API level 21 or higher.
(Optional) Check out the sample app.
You can use the sample app to try out the SDK quickly, see a complete implementation of various use cases, or get started if you don't have your own Android app. To use the sample app, you'll need to connect it to a Firebase project.
Step 1: Set up a Firebase project and connect your app to Firebase
If you already have a Firebase project and an app connected to Firebase
In the Firebase console, go to the Build with Gemini page.
Click the Vertex AI in Firebase card to launch a workflow that helps you complete the following tasks:
Upgrade your project to use the pay-as-you-go Blaze pricing plan.
Enable the required APIs in your project (Vertex AI API and Vertex AI in Firebase API).
Continue to the next step in this guide to add the SDK to your app.
If you do not already have a Firebase project and an app connected to Firebase
Sign in to the Firebase console.
Click Create project, and then use either of the following options:
Option 1: Create a new Firebase project (its underlying Google Cloud project is created automatically) by entering a new project name in the first step of the "Create project" workflow.
Option 2: "Add Firebase" to an existing Google Cloud project by selecting your Google Cloud project name from the drop-down menu in the first step of the "Create project" workflow.
Note that when prompted, you do not need to set up Google Analytics to use the Vertex AI in Firebase SDKs.
In the Firebase console, go to the Build with Gemini page.
Click the Vertex AI in Firebase card to launch a workflow that helps you complete the following tasks:
Upgrade your project to use the pay-as-you-go Blaze pricing plan.
Enable the required APIs in your project (Vertex AI API and Vertex AI in Firebase API).
Continue in the console's generative AI workflow to connect your app to Firebase, which includes these tasks:
Registering your app with your Firebase project.
Adding your Firebase configuration file (google-services.json) and the google-services Gradle plugin to your app (a sketch of this configuration follows this list).
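If you're wiring up the configuration manually instead of following the console workflow, the Gradle side typically looks something like the following sketch (the plugin version shown is a placeholder; use the one the console gives you), with the downloaded google-services.json file placed in your module (app-level) directory:
// Project-level build.gradle.kts (<project>/build.gradle.kts)
plugins {
    // ...
    // Make the Google services Gradle plugin available to modules (placeholder version)
    id("com.google.gms.google-services") version "4.4.2" apply false
}

// Module (app-level) build.gradle.kts (<project>/<app-module>/build.gradle.kts)
plugins {
    id("com.android.application")
    // Apply the plugin so your google-services.json configuration is processed
    id("com.google.gms.google-services")
}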
In the next steps of this guide, you'll add the Vertex AI in Firebase SDK to your app and complete the required initialization specific to using the SDK and the Gemini API.
Step 2: Add the SDK
With your Firebase project set up and your app connected to Firebase (see previous step), you can now add the Vertex AI in Firebase SDK to your app.
The Vertex AI in Firebase SDK for Android (firebase-vertexai) provides access to the Vertex AI Gemini API.
In your module (app-level) Gradle file (like <project>/<app-module>/build.gradle.kts), add the dependency for the Vertex AI in Firebase library for Android. We recommend using the Firebase Android BoM to control library versioning.
Kotlin+KTX
dependencies {
    // ... other androidx dependencies

    // Import the BoM for the Firebase platform
    implementation(platform("com.google.firebase:firebase-bom:33.7.0"))

    // Add the dependency for the Vertex AI in Firebase library
    // When using the BoM, you don't specify versions in Firebase library dependencies
    implementation("com.google.firebase:firebase-vertexai")
}
Java
For Java, you need to add two additional libraries.
dependencies {
    // ... other androidx dependencies

    // Import the BoM for the Firebase platform
    implementation(platform("com.google.firebase:firebase-bom:33.7.0"))

    // Add the dependency for the Vertex AI in Firebase library
    // When using the BoM, you don't specify versions in Firebase library dependencies
    implementation("com.google.firebase:firebase-vertexai")

    // Required for one-shot operations (to use `ListenableFuture` from Guava Android)
    implementation("com.google.guava:guava:31.0.1-android")

    // Required for streaming operations (to use `Publisher` from Reactive Streams)
    implementation("org.reactivestreams:reactive-streams:1.0.4")
}
By using the Firebase Android BoM, your app will always use compatible versions of Firebase Android libraries.
If you choose not to use the Firebase BoM, you must specify each Firebase library version in its dependency line.
Note that if you use multiple Firebase libraries in your app, we strongly recommend using the BoM to manage library versions, which ensures that all versions are compatible.
dependencies {
    // Add the dependency for the Vertex AI in Firebase library
    // When NOT using the BoM, you must specify versions in Firebase library dependencies
    implementation("com.google.firebase:firebase-vertexai:16.0.2")
}
Step 3: Initialize the Vertex AI service and the generative model
Before you can make any API calls, you need to initialize the Vertex AI service and the generative model.
Kotlin+KTX
For Kotlin, the methods in this SDK are suspend functions and need to be called from a coroutine scope.
// Initialize the Vertex AI service and the generative model
// Specify a model that supports your use case
// Gemini 1.5 models are versatile and can be used with all API capabilities
val generativeModel = Firebase.vertexAI.generativeModel("gemini-1.5-flash")
Java
For Java, the streaming methods in this SDK return a Publisher type from the Reactive Streams library.
// Initialize the Vertex AI service and the generative model
// Specify a model that supports your use case
// Gemini 1.5 models are versatile and can be used with all API capabilities
GenerativeModel gm = FirebaseVertexAI.getInstance()
.generativeModel("gemini-1.5-flash");
// Use the GenerativeModelFutures Java compatibility layer which offers
// support for ListenableFuture and Publisher APIs
GenerativeModelFutures model = GenerativeModelFutures.from(gm);
When you've finished the getting started guide, learn how to choose a Gemini model and (optionally) a location appropriate for your use case and app.
Step 4: Call the Vertex AI Gemini API
Now that you've connected your app to Firebase, added the SDK, and initialized the Vertex AI service and the generative model, you're ready to call the Vertex AI Gemini API.
You can use generateContent() to generate text from a text-only prompt request:
Kotlin+KTX
For Kotlin, the methods in this SDK are suspend functions and need to be called from a coroutine scope.
// Initialize the Vertex AI service and the generative model
// Specify a model that supports your use case
// Gemini 1.5 models are versatile and can be used with all API capabilities
val generativeModel = Firebase.vertexAI.generativeModel("gemini-1.5-flash")
// Provide a prompt that contains text
val prompt = "Write a story about a magic backpack."
// To generate text output, call generateContent with the text input
val response = generativeModel.generateContent(prompt)
print(response.text)
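Because generateContent() is a suspend function, it has to be called from a coroutine scope. The following is a minimal sketch, assuming you're calling from an Activity and using lifecycleScope from androidx.lifecycle (those surrounding details are illustrative, not part of the SDK):
// Requires import androidx.lifecycle.lifecycleScope and kotlinx.coroutines.launch
lifecycleScope.launch {
    // Call the suspend function from a lifecycle-aware coroutine scope
    val response = generativeModel.generateContent("Write a story about a magic backpack.")
    print(response.text)
}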
Java
For Java, the methods in this SDK return a ListenableFuture.
// Initialize the Vertex AI service and the generative model
// Specify a model that supports your use case
// Gemini 1.5 models are versatile and can be used with all API capabilities
GenerativeModel gm = FirebaseVertexAI.getInstance()
.generativeModel("gemini-1.5-flash");
GenerativeModelFutures model = GenerativeModelFutures.from(gm);
// Provide a prompt that contains text
Content prompt = new Content.Builder()
.addText("Write a story about a magic backpack.")
.build();
// To generate text output, call generateContent with the text input
ListenableFuture<GenerateContentResponse> response = model.generateContent(prompt);
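// Note: `executor` passed to addCallback below is an Executor you define elsewhere in your app,
// for example Executors.newSingleThreadExecutor() or your app's main-thread executor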
Futures.addCallback(response, new FutureCallback<GenerateContentResponse>() {
@Override
public void onSuccess(GenerateContentResponse result) {
String resultText = result.getText();
System.out.println(resultText);
}
@Override
public void onFailure(Throwable t) {
t.printStackTrace();
}
}, executor);
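You can also stream the response as it's generated instead of waiting for the entire result. Here's a minimal Kotlin sketch, assuming the generativeModel and prompt from the example above (like generateContent(), this needs to run in a coroutine scope):
// generateContentStream returns a flow of partial responses; print each chunk as it arrives
generativeModel.generateContentStream(prompt).collect { chunk ->
    print(chunk.text)
}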
What else can you do?
Learn more about the Gemini models
Learn about the models available for various use cases and their quotas and pricing.
Try out other capabilities of the Gemini API
- Learn more about generating text from text-only prompts, including how to stream the response.
- Generate text from multimodal prompts (including text, images, PDFs, video, and audio).
- Build multi-turn conversations (chat).
- Generate structured output (like JSON) from both text and multimodal prompts.
- Use function calling to connect generative models to external systems and information.
Learn how to control content generation
- Understand prompt design, including best practices, strategies, and example prompts.
- Configure model parameters like temperature and maximum output tokens.
- Use safety settings to adjust the likelihood of getting responses that may be considered harmful.
Give feedback about your experience with Vertex AI in Firebase