Production checklist for using Firebase AI Logic
When you're ready to launch your app and have real end users interact with your generative AI features, make sure to review this checklist of best practices and important considerations.

You can complete many of these checklist items as soon as you start seriously developing your app, well before launch. Most importantly, enable Firebase App Check to help secure your app, and configure Firebase Remote Config so you can change AI parameters (like the model name) on demand without an app update.
General
Review the general launch checklist for apps that use Firebase
This Firebase launch checklist describes important best practices to follow before launching any Firebase app to production.
Make sure your Firebase projects follow best practices
For example, make sure that you use separate Firebase projects for development, testing, and production. Review more best practices for managing your projects.
Access and security
Review the general security checklist for apps that use Firebase
This security checklist describes important best practices for access and security in Firebase apps and services.
Start enforcing Firebase App Check
App Check helps protect the APIs that access the Gemini and Imagen models by verifying that requests come from your actual app. It supports attestation providers for Apple platforms (DeviceCheck or App Attest), Android (Play Integrity), and the web (reCAPTCHA Enterprise).
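As an illustration, here is a minimal Kotlin sketch for Android, assuming the Play Integrity provider and an Application subclass named MyApp (the class name is illustrative). Installing the provider happens once at app startup; enforcement itself is turned on in the Firebase console.

```kotlin
import android.app.Application
import com.google.firebase.FirebaseApp
import com.google.firebase.appcheck.FirebaseAppCheck
import com.google.firebase.appcheck.playintegrity.PlayIntegrityAppCheckProviderFactory

class MyApp : Application() {
    override fun onCreate() {
        super.onCreate()
        // Initialize Firebase, then register the Play Integrity App Check provider
        // so requests from this app carry an App Check token.
        FirebaseApp.initializeApp(this)
        FirebaseAppCheck.getInstance().installAppCheckProviderFactory(
            PlayIntegrityAppCheckProviderFactory.getInstance()
        )
    }
}
```

Register the provider early (before the first model call) so that every request to the Gemini or Imagen APIs can be attested.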
Set up restrictions for your Firebase API keys
- Review each Firebase API key's "API restrictions" allowlist: make sure the Firebase AI Logic API is on the allowlist, and that the only other APIs on the allowlist are for Firebase services that you actually use in your app.
- Set "Application restrictions" to limit usage of each Firebase API key to requests from your app only (for example, a matching bundle ID for an Apple app). Even if you restrict your key, enforcing Firebase App Check is still strongly recommended.

Note that Firebase-related APIs use API keys only to identify the Firebase project or app, not to authorize calls to the API.
Billing, monitoring, and quota
Avoid surprise bills
If your Firebase project is on the pay-as-you-go Blaze pricing plan, monitor your usage and set up budget alerts.
Set up AI monitoring in the Firebase console
Only available when using the Vertex AI Gemini API as your API provider.

Set up AI monitoring so you can view various metrics and dashboards in the Firebase console and gain comprehensive visibility into the requests made by the Firebase AI Logic SDKs.
Review your quotas for the required underlying APIs

- Make sure that you understand the quotas for each required API.
- Set rate limits per user (the default is 100 RPM).
- Edit your quota or request a quota increase as needed.
Configuration management
Use a stable model version in your production app
In your production app, use only stable model versions (for example, gemini-2.0-flash-001), not preview or experimental versions or auto-updated aliases.

Even though an auto-updated stable alias points to a stable version, the actual model version it points to changes automatically whenever a new stable version is released, which could lead to unexpected behavior or responses. Preview and experimental versions are recommended only during prototyping.

Important: We strongly recommend using Firebase Remote Config to control and update the model name used in your app (see the next section).
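For illustration, a minimal Kotlin sketch follows, assuming the Firebase AI Logic Kotlin SDK (com.google.firebase:firebase-ai) and its Firebase.ai(...) entry point; the point is simply that the model name pins an explicit stable version rather than an auto-updated alias.

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend

// Pin an explicit stable version ("gemini-2.0-flash-001") instead of an
// auto-updated alias such as "gemini-2.0-flash".
val model = Firebase.ai(backend = GenerativeBackend.googleAI())
    .generativeModel("gemini-2.0-flash-001")
```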
Set up and use Firebase Remote Config
With Remote Config, you can control important configurations for your generative AI feature in the cloud rather than hard-coding values in your code. This means you can update your configuration without releasing a new version of your app. You can do a lot with Remote Config, but these are the top values we recommend controlling remotely for a generative AI feature (a sketch of wiring them into the SDK follows the list):

- Model name: update the model your app uses as new models are released or older ones are discontinued, keeping your app up to date.
- Model configuration: adjust the temperature, maximum output tokens, and more.
- Safety settings: adjust safety settings if too many responses are getting blocked or if users report harmful responses.
- System instructions and any prompts that you provide: adjust the additional context you send to the model to steer its responses and behavior, for example tailoring prompts to specific client types or personalizing prompts for new users.
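As a sketch of the idea, the Kotlin snippet below reads the model name, temperature, and system prompt from Remote Config before constructing the model. The parameter keys (model_name, temperature, system_prompt) are illustrative names you would define yourself, and the Firebase AI Logic calls rest on the same SDK assumptions as the earlier sketch.

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend
import com.google.firebase.ai.type.content
import com.google.firebase.ai.type.generationConfig
import com.google.firebase.remoteconfig.remoteConfig
import kotlinx.coroutines.tasks.await

suspend fun buildModelFromRemoteConfig() =
    Firebase.remoteConfig.run {
        // Pull the latest remotely controlled values (subject to the fetch interval).
        fetchAndActivate().await()

        Firebase.ai(backend = GenerativeBackend.googleAI()).generativeModel(
            modelName = getString("model_name"),              // e.g. "gemini-2.0-flash-001"
            generationConfig = generationConfig {
                temperature = getDouble("temperature").toFloat()
            },
            systemInstruction = content { text(getString("system_prompt")) }
        )
    }
```

Because the values live in Remote Config, rolling out a new model name or a tweaked temperature is a console change rather than an app release.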
You can also optionally set a minimum_version parameter in Remote Config and compare the app's current version against it, either to show an upgrade notification to users or to force users to upgrade.
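A minimal sketch of that gate, assuming you define a Remote Config parameter named minimum_version that holds the lowest acceptable Android version code:

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.remoteconfig.remoteConfig

// Returns true when the running app is older than the remotely defined minimum,
// in which case you might show an upgrade prompt or block usage.
fun isUpgradeRequired(currentVersionCode: Long): Boolean =
    currentVersionCode < Firebase.remoteConfig.getLong("minimum_version")

// Example call site: isUpgradeRequired(BuildConfig.VERSION_CODE.toLong())
```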
Set the location for accessing the model
Only available when using the Vertex AI Gemini API as your API provider.

Setting the location used to access the model can help control costs and help prevent latency for your users.
If you don't specify a location, the default is us-central1. You can set the location during initialization, or optionally use Firebase Remote Config to dynamically change the location based on each user's location.
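For illustration, a minimal Kotlin sketch follows, assuming the Vertex AI Gemini API backend and a location parameter on GenerativeBackend.vertexAI(...) in the Firebase AI Logic Kotlin SDK; europe-west1 is just an example, and the string could itself be served from Remote Config.

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.ai.ai
import com.google.firebase.ai.type.GenerativeBackend

// Access the model through the Vertex AI Gemini API in an explicit location
// instead of the default us-central1.
val model = Firebase.ai(backend = GenerativeBackend.vertexAI(location = "europe-west1"))
    .generativeModel("gemini-2.0-flash-001")
```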
[null,null,["最后更新时间 (UTC):2025-08-19。"],[],[],null,["\u003cbr /\u003e\n\nWhen you're ready to launch your app and have real end users interact with your\ngenerative AI features, make sure to review this checklist of best practices and\nimportant considerations.\n| You can complete many of these checklist items as soon as you start to seriously develop your app and well before launch. \n| **Most importantly, you should enable\n| [Firebase App Check](/docs/ai-logic/app-check)\n| to help secure your app and configure\n| [Firebase Remote Config](/docs/ai-logic/solutions/remote-config)\n| to allow on-demand changes to AI parameters (like model name) without an app\n| update.**\n\nGeneral\n\nReview the general launch checklist for apps that use Firebase\n\nThis [Firebase launch checklist](/support/guides/launch-checklist) describes\nimportant best practices before launching any Firebase app to production.\n\nMake sure your Firebase projects follow best practices\n\nFor example, make sure that you use different Firebase projects for development,\ntesting, and production. Review more best practices for\n[managing your projects](/support/guides/launch-checklist#projects-follow-best-practices).\n\nAccess and security\n\nReview the general security checklist for apps that use Firebase\n\nThis [security checklist](/support/guides/security-checklist) describes\nimportant best practices for access and security for Firebase apps and services.\n\nStart *enforcing* Firebase App Check\n\n[App Check](/docs/ai-logic/app-check) helps protect the APIs that access\nthe Gemini and Imagen models by verifying that requests are\nfrom your actual app. It supports attestation providers for\nApple platforms (DeviceCheck or App Attest), Android (Play Integrity), and\nWeb (reCAPTCHA Enterprise).\n\nSet up restrictions for your Firebase API keys\n\n- Review each Firebase API key's\n [\"API restrictions\"](https://cloud.google.com/docs/authentication/api-keys#adding_api_restrictions)\n allowlist:\n\n - Make sure that the Firebase AI Logic API is in the\n allowlist.\n\n - Make sure that the only other APIs in the key's allowlist are for Firebase\n services that you use in your app. See the\n [list of which APIs are required to be on the allowlist for each product](/docs/projects/api-keys#faq-required-apis-for-restricted-firebase-api-key).\n\n- Set\n [\"Application restrictions\"](https://cloud.google.com/docs/authentication/api-keys#adding_application_restrictions)\n to help restrict usage of each Firebase API key to only requests from your app\n (for example, a matching bundle ID for the Apple app). 
Note that even if you\n restrict your key, Firebase App Check is still strongly recommended.\n\nNote that Firebase-related APIs use API keys only to *identify* the Firebase\nproject or app, *not for authorization* to call the API.\n\nBilling, monitoring, and quota\n\nAvoid surprise bills\n\nIf your Firebase project is on the pay-as-you-go Blaze pricing plan, then\n[monitor your usage](/docs/ai-logic/monitoring) and\n[set up budget alerts](/docs/projects/billing/avoid-surprise-bills#set-up-budget-alert-emails).\n\nSet up AI monitoring in the Firebase console\n\n\n|----------------------------------------------------------------------------|\n| *Only available when using the Vertex AI Gemini API as your API provider.* |\n\n\u003cbr /\u003e\n\n[Set up AI monitoring](/docs/ai-logic/monitoring#ai-monitoring-in-console) to\nobserve various metrics and dashboards in the Firebase console to gain\ncomprehensive visibility into your requests from the\nFirebase AI Logic SDKs.\n\nReview your quotas for the required underlying APIs\n\n- Make sure that you\n [understand the quotas for each required API](/docs/ai-logic/quotas#understand-quotas).\n\n- [Set rate limits per user](/docs/ai-logic/quotas#understand-quotas-vertexai-in-firebase)\n (the default is 100 RPM).\n\n- [Edit quota or request a quota increase](/docs/ai-logic/quotas#edit-quota-or-request-quota-increase),\n as needed.\n\nManagement of configurations\n\nUse a stable model version in your production app\n\nIn your production app, only use\n[*stable* model versions](/docs/ai-logic/models#versions) (like\n`gemini-2.0-flash-001`), not a *preview* or *experimental* version or\nan *auto-updated* alias.\n\nEven though an *auto-updated* stable alias points to a stable version, the\nactual model version it points to will automatically change whenever a new\nstable version is released, which could mean unexpected behavior or responses.\nAlso, *preview* and *experimental* versions are only recommended during\nprototyping.\n| **Important:** We strongly recommend using [Firebase Remote Config](/docs/ai-logic/solutions/remote-config) to control and update the model name used in your app (see the next section).\n\nSet up and use Firebase Remote Config\n\nWith [Remote Config](/docs/ai-logic/solutions/remote-config),\nyou can control important configurations for your generative AI feature\n*in the cloud* rather than hard-coding values in your\ncode. This means that you can update your configuration without releasing\na new version of your app. You can do a lot with Remote Config, but here\nare the top values that we recommend you control remotely for your generative\nAI feature:\n\n- Keep your app up-to-date.\n\n - **Model name**: Update the model your app uses as new models are released or others are discontinued.\n- Adjust values and inputs based on client attributes, or to accommodate\n feedback from testing or users.\n\n - **Model configuration**: Adjust the temperature, max output tokens, and\n more.\n\n - **Safety settings**: Adjust safety settings if too many responses are\n getting blocked or if users report harmful responses.\n\n - **System instructions** and **any prompts that you provide**: Adjust the\n additional context that you're sending to the model to steer its\n responses and behavior. 
For example, you might want to tailor prompts for\n specific client types, or personalize prompts for new users that differ from\n those used to generate responses for existing users.\n\nYou could also optionally set a `minimum_version` parameter in Remote Config\nto compare the app's current version with the Remote Config-defined latest\nversion, to either show an upgrade notification to users or force users to\nupgrade.\n\nSet the location for accessing the model\n\n\n|----------------------------------------------------------------------------|\n| *Only available when using the Vertex AI Gemini API as your API provider.* |\n\n\u003cbr /\u003e\n\n[Setting a location for accessing the model](/docs/ai-logic/locations) can help\nwith costs as well as help prevent latency for your users.\n\nIf you don't specify a location, the default is `us-central1`. You can set this\nlocation during initialization, or you can optionally\n[use Firebase Remote Config to dynamically change the location based on each user's location](/docs/ai-logic/solutions/remote-config)."]]