If you use custom TensorFlow Lite models, Firebase ML can help you ensure your users are always using the best-available version of your custom model. When you deploy your model with Firebase, Firebase ML only downloads the model when it's needed and automatically updates your users with the latest version.
Deploy your TensorFlow Lite model to Firebase

Optional: When you deploy your TensorFlow Lite model to Firebase and include the Firebase ML SDK in your app, Firebase ML keeps your users up to date with the latest version of your model. You can configure it to automatically download model updates when the user's device is idle, is charging, or has a Wi-Fi connection.
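On Android, these download conditions and the model fetch can be expressed with the Firebase ML Model Downloader API. The snippet below is a minimal sketch, assuming the firebase-ml-modeldownloader dependency is already added and a model has been deployed in the Firebase console under the hypothetical name "example_model"; adjust the name, download type, and conditions to your own app.

```kotlin
import com.google.firebase.ml.modeldownloader.CustomModel
import com.google.firebase.ml.modeldownloader.CustomModelDownloadConditions
import com.google.firebase.ml.modeldownloader.DownloadType
import com.google.firebase.ml.modeldownloader.FirebaseModelDownloader

// Only download or update the model over Wi-Fi while the device is charging.
val conditions = CustomModelDownloadConditions.Builder()
    .requireWifi()
    .requireCharging()
    .build()

// LOCAL_MODEL_UPDATE_IN_BACKGROUND returns a locally cached copy right away
// (if one exists) and fetches newer versions in the background when available.
FirebaseModelDownloader.getInstance()
    .getModel("example_model", DownloadType.LOCAL_MODEL_UPDATE_IN_BACKGROUND, conditions)
    .addOnSuccessListener { model: CustomModel? ->
        val modelFile = model?.file  // null until a download has completed at least once
        if (modelFile != null) {
            // Hand the file to the TensorFlow Lite interpreter (see the next section).
        }
    }
    .addOnFailureListener {
        // Fall back to a model bundled with the app, or report the error.
    }
```

Using LOCAL_MODEL_UPDATE_IN_BACKGROUND keeps inference available offline from the cached copy while updates are fetched opportunistically; use DownloadType.LATEST_MODEL instead if you always want to wait for the newest version.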
Use the TensorFlow Lite model for inference
Use the TensorFlow Lite interpreter in your Apple or Android app to perform inference with models deployed using Firebase.
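As a rough illustration of that step on Android, the Kotlin sketch below feeds the file obtained from the model downloader to the TensorFlow Lite Interpreter. The input and output shapes (a 224x224 RGB float image and 1,000 class scores) are hypothetical placeholders; substitute the shapes your own model expects.

```kotlin
import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder
import org.tensorflow.lite.Interpreter

// Runs one inference with a (hypothetical) image classifier that takes a
// 224x224x3 float32 input and produces 1000 class scores.
fun classify(modelFile: File, pixels: FloatArray): FloatArray {
    val interpreter = Interpreter(modelFile)

    // Copy the preprocessed pixel values into a direct buffer in native byte order.
    val input = ByteBuffer.allocateDirect(4 * 224 * 224 * 3).order(ByteOrder.nativeOrder())
    for (value in pixels) input.putFloat(value)
    input.rewind()

    val output = Array(1) { FloatArray(1000) }
    interpreter.run(input, output)  // single-input, single-output inference

    interpreter.close()
    return output[0]
}
```

Creating an Interpreter is comparatively expensive, so a real app would normally keep one instance alive and reuse it across calls rather than constructing and closing it per inference.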
Codelabs
Try some codelabs to learn hands-on how Firebase can help you use TensorFlow Lite models more easily and effectively.