Production checklist for using Firebase AI Logic
When you're ready to launch your app and have real end users interact with your generative AI features, make sure to review this checklist of best practices and important considerations.

You can complete many of these checklist items as soon as you start to seriously develop your app, well before launch. Most importantly, you should enable Firebase App Check to help secure your app, and configure Firebase Remote Config to allow on-demand changes to AI parameters (like the model name) without an app update.
General
Review the general launch checklist for apps that use Firebase
This Firebase launch checklist describes important best practices to follow before launching any Firebase app to production.
Make sure your Firebase projects follow best practices
For example, make sure that you use different Firebase projects for development, testing, and production. Review more best practices for managing your projects.
Access and security
Review the general security checklist for apps that use Firebase
This security checklist describes important best practices for access and security for Firebase apps and services.
Start enforcing Firebase App Check
Firebase App Check helps protect the APIs that give you access to the Gemini and Imagen models. App Check verifies that requests come from your actual app and an authentic, untampered device. It supports attestation providers for Apple platforms (DeviceCheck or App Attest), Android (Play Integrity), and the web (reCAPTCHA Enterprise), and it supports all of these providers for Flutter and Unity apps as well.
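For illustration, here is a minimal Kotlin (Android) sketch of installing the Play Integrity provider at app startup, assuming the firebase-appcheck-playintegrity dependency is in place; enforcement itself is turned on per API in the Firebase console.

```kotlin
import android.content.Context
import com.google.firebase.FirebaseApp
import com.google.firebase.appcheck.FirebaseAppCheck
import com.google.firebase.appcheck.playintegrity.PlayIntegrityAppCheckProviderFactory

// Call once at app startup (for example, in Application.onCreate), before any
// Firebase AI Logic requests, so every request carries an App Check token.
fun initAppCheck(context: Context) {
    FirebaseApp.initializeApp(context)
    FirebaseAppCheck.getInstance()
        .installAppCheckProviderFactory(PlayIntegrityAppCheckProviderFactory.getInstance())
}
```

The other platforms follow the same pattern with their respective providers (DeviceCheck or App Attest on Apple platforms, reCAPTCHA Enterprise on the web).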
Set up restrictions for your Firebase API keys

- Review each Firebase API key's "API restrictions" allowlist:
  - Make sure that the Firebase AI Logic API is in the allowlist.
  - Make sure that the only other APIs in the key's allowlist are for Firebase services that you use in your app.
- Set "Application restrictions" to restrict usage of each Firebase API key to only requests from your app (for example, a matching bundle ID for the Apple app). Note that even if you restrict your key, Firebase App Check is still strongly recommended.

Note that Firebase-related APIs use API keys only to identify the Firebase project or app, not for authorization to call the API.
Billing, monitoring, and quota
Avoid surprise bills
If your Firebase project is on the pay-as-you-go Blaze pricing plan, monitor your usage and set up budget alerts.
Set up AI monitoring in the Firebase console
Only available when using the Vertex AI Gemini API as your API provider.
Set up AI monitoring to observe various metrics and dashboards in the Firebase console and gain comprehensive visibility into your requests from the Firebase AI Logic SDKs.
Review your quotas for the required underlying APIs

- Make sure that you understand the quotas for each required API.
- Set rate limits per user (the default is 100 RPM).
- Edit a quota or request a quota increase as needed.
Management of configurations
Use a stable model version in your production app
In your production app, use only stable model versions (like gemini-2.0-flash-001), not preview or experimental versions, and not auto-updated aliases.

Even though an auto-updated stable alias points to a stable version, the actual model version it points to changes automatically whenever a new stable version is released, which can lead to unexpected behavior or responses. Also, preview and experimental versions are only recommended during prototyping. We strongly recommend using Firebase Remote Config to control and update the model name used in your app (see the next section).
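For illustration, here is a minimal sketch of pinning an explicit stable version when creating the model, assuming the Firebase AI Logic Kotlin SDK's Firebase.ai and GenerativeBackend surface (check the SDK reference for the exact package names).

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend

// Pin the fully qualified stable version ("gemini-2.0-flash-001") rather than
// an auto-updated alias ("gemini-2.0-flash") or a preview/experimental build.
val model = Firebase.ai(backend = GenerativeBackend.googleAI())
    .generativeModel("gemini-2.0-flash-001")
```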
Set up and use Firebase Remote Config
With Remote Config, you can control important configurations for your generative AI feature in the cloud rather than hard-coding values in your code. This means you can update your configuration without releasing a new version of your app. You can do a lot with Remote Config, but here are the top values that we recommend you control remotely for your generative AI feature:

- Keep your app up to date.
  - Model name: update the model your app uses as new models are released or others are discontinued.
- Adjust values and inputs based on client attributes, or to accommodate feedback from testing or users.
  - Model configuration: adjust the temperature, maximum output tokens, and more.
  - Safety settings: adjust safety settings if too many responses are getting blocked or if users report harmful responses.
  - System instructions and any prompts that you provide: adjust the additional context you send to the model to steer its responses and behavior. For example, you might tailor prompts for specific client types, or personalize prompts for new users differently from those used for existing users.
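As a hedged sketch of reading those values at startup, assuming the Firebase AI Logic Kotlin surface and hypothetical parameter keys (model_name, temperature) that you would define in the Remote Config console:

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend
import com.google.firebase.remoteconfig.remoteConfig
import kotlinx.coroutines.tasks.await

// "model_name" and "temperature" are hypothetical parameter names; define
// whatever keys suit your app in the Remote Config console.
suspend fun buildModelFromRemoteConfig() =
    Firebase.remoteConfig.run {
        setDefaultsAsync(mapOf("model_name" to "gemini-2.0-flash-001"))
        fetchAndActivate().await()               // requires kotlinx-coroutines-play-services
        val modelName = getString("model_name")  // falls back to the default above
        // Values like getDouble("temperature") can likewise be wired into the
        // model's generation config or your prompt templates.
        Firebase.ai(backend = GenerativeBackend.googleAI()).generativeModel(modelName)
    }
```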
You can also optionally set a minimum_version parameter in Remote Config to compare the app's current version with the latest version defined in Remote Config, and then either show an upgrade notification to users or force them to upgrade.
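A minimal sketch of that check, assuming a hypothetical minimum_version Remote Config parameter that stores the lowest supported version code:

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.remoteconfig.remoteConfig

// "minimum_version" is a hypothetical Remote Config parameter holding the lowest
// supported version code; compare it against the app's own BuildConfig.VERSION_CODE.
fun isUpgradeRequired(currentVersionCode: Long): Boolean =
    currentVersionCode < Firebase.remoteConfig.getLong("minimum_version")
```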
Set the location for accessing the model
Only available when using the Vertex AI Gemini API as your API provider.
Setting a location for accessing the model can help with costs and help prevent latency for your users.
If you don't specify a location, the default is us-central1. You can set the location during initialization, or you can optionally use Firebase Remote Config to dynamically change it based on each user's location.
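As a sketch of both options combined, assuming GenerativeBackend.vertexAI accepts a location argument as documented and using a hypothetical model_location Remote Config key:

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend
import com.google.firebase.remoteconfig.remoteConfig

// Vertex AI Gemini API backend only. "model_location" is a hypothetical
// Remote Config key; an empty value falls back to the us-central1 default.
fun modelForUserRegion() =
    Firebase.ai(
        backend = GenerativeBackend.vertexAI(
            location = Firebase.remoteConfig.getString("model_location")
                .ifEmpty { "us-central1" }
        )
    ).generativeModel("gemini-2.0-flash-001")
```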