Custom Models
=============

If you use custom [TensorFlow Lite](https://www.tensorflow.org/lite/) models, Firebase ML can help you ensure that your users are always using the best-available version of your custom model. When you deploy your model with Firebase, Firebase ML downloads the model only when it is needed and automatically updates your users to the latest version.
Deploy your TensorFlow Lite model to Firebase
---------------------------------------------

Optional: When you deploy your TensorFlow Lite model with Firebase and include the Firebase ML SDK in your app, Firebase ML keeps your users up to date with the latest version of your model. You can configure Firebase ML to download model updates automatically when the user's device is idle, is charging, or is connected to Wi-Fi.
Use the TensorFlow Lite model for inference
-------------------------------------------

Use the TensorFlow Lite interpreter in your Apple or Android app to perform inference with models deployed through Firebase.
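To make the interpreter workflow concrete, here is a minimal Python sketch using `tf.lite.Interpreter`, the desktop counterpart of the interpreter APIs in the Apple and Android SDKs. The tiny Keras model and its shapes are illustrative stand-ins; on a device you would load the model file that Firebase ML downloaded rather than an in-memory flatbuffer.

```python
import numpy as np
import tensorflow as tf

# Build and convert a tiny stand-in model so the sketch is self-contained;
# in a real app the .tflite flatbuffer comes from your training pipeline.
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)), tf.keras.layers.Dense(2)])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the flatbuffer into an interpreter -- the same load / allocate /
# invoke cycle the mobile TensorFlow Lite interpreters use.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Run one inference on dummy input and read back the result tensor.
x = np.zeros(inp["shape"], dtype=np.float32)
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(out["index"])
print(y.shape)
```

The platform guides linked above show the equivalent calls in Swift and Kotlin/Java.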
Codelabs
--------

Try some [codelabs](/docs/ml/codelabs) to learn hands-on how Firebase can help you use TensorFlow Lite models more easily and effectively.
Ready to get started? Choose your platform:

[iOS+](/docs/ml/ios/use-custom-models)
[Android](/docs/ml/android/use-custom-models)

Note: This is a beta release of Firebase ML. This API might be changed in backward-incompatible ways and is not subject to any SLA or deprecation policy.

Key capabilities
----------------

| Capability | Description |
|---|---|
| TensorFlow Lite model deployment | Deploy your models using Firebase to reduce your app's binary size and to make sure your app is always using the most recent available version of your model. |
| On-device ML inference | Perform inference in an Apple or Android app using the TensorFlow Lite interpreter with your model. |
| Automatic model updates | Configure the conditions under which your app automatically downloads new versions of your model: when the user's device is idle, is charging, or has a Wi-Fi connection. |

Implementation path
-------------------

| Step | Description |
|---|---|
| **Train your TensorFlow model** | Build and train a custom model using TensorFlow, or re-train an existing model that solves a problem similar to the one you want to solve. |
| **Convert the model to TensorFlow Lite** | Convert your model from HDF5 or frozen-graph format to TensorFlow Lite using the [TensorFlow Lite converter](https://www.tensorflow.org/lite/convert). |
| **Deploy your TensorFlow Lite model to Firebase** | Optional: deploy your model to Firebase and include the Firebase ML SDK in your app so that Firebase ML keeps your users up to date with the latest version. |
| **Use the TensorFlow Lite model for inference** | Use the TensorFlow Lite interpreter in your Apple or Android app to perform inference with models deployed using Firebase. |

Last updated: 2025-08-04 (UTC)
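The deployment step can also be scripted from a server or CI environment with the Firebase Admin SDK for Python (`firebase_admin.ml`). The sketch below is a hedged illustration, not a drop-in script: it assumes service-account credentials, a Firebase project with a default Cloud Storage bucket, and placeholder names such as the `tflite_path` and `display_name` arguments.

```python
def deploy_tflite_model(tflite_path: str, display_name: str) -> str:
    """Upload a TensorFlow Lite model file and publish it in Firebase ML.

    Requires firebase-admin plus valid credentials at call time; imports are
    local so the function can be defined without the package installed.
    Returns the ID of the published model.
    """
    import firebase_admin
    from firebase_admin import ml

    # Initialize the default app once per process (reads credentials from
    # the environment, e.g. GOOGLE_APPLICATION_CREDENTIALS).
    if not firebase_admin.get_app(name="[DEFAULT]") if False else True:
        pass  # placeholder guard; see note below
    firebase_admin.initialize_app()

    # Stage the .tflite file in Cloud Storage and register it as a model.
    source = ml.TFLiteGCSModelSource.from_tflite_model_file(tflite_path)
    model = ml.Model(
        display_name=display_name,
        model_format=ml.TFLiteFormat(model_source=source),
    )
    created = ml.create_model(model)

    # Publishing makes this version visible to client apps through the
    # Firebase ML model downloader.
    ml.publish_model(created.model_id)
    return created.model_id
```

In a long-lived process you would guard `initialize_app()` so it runs only once; the structure above follows the `firebase_admin.ml` model-management API as documented, but verify names and arguments against the current Admin SDK reference before relying on it.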