# Use a custom TensorFlow Lite model with Flutter
If your app uses custom
[TensorFlow Lite](https://www.tensorflow.org/lite/) models, you can
use Firebase ML to deploy your models. By deploying models with Firebase, you
can reduce the initial download size of your app and update your app's ML
models without releasing a new version of your app. And, with Remote Config and
A/B Testing, you can dynamically serve different models to different sets of
users.
TensorFlow Lite models
----------------------

TensorFlow Lite models are ML models that are optimized to run on mobile
devices. To get a TensorFlow Lite model:

- Use a pre-built model, such as one of the [official TensorFlow Lite models](https://www.tensorflow.org/lite/models)
- [Convert a TensorFlow model, Keras model, or concrete function to TensorFlow Lite.](https://www.tensorflow.org/lite/convert)

Note that in the absence of a maintained TensorFlow Lite library for Dart, you
will need to integrate with the native TensorFlow Lite library for your
platforms. This integration is not documented here.
Before you begin
----------------

1. [Install and initialize the Firebase SDKs for Flutter](/docs/flutter/setup)
   if you haven't already done so.

2. From the root directory of your Flutter project, run the following
   command to install the ML model downloader plugin:

       flutter pub add firebase_ml_model_downloader

3. Rebuild your project:

       flutter run
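Step 1 above assumes Firebase is initialized before any plugin is used. If you set up your project with the FlutterFire CLI, that typically amounts to a one-time call in `main()`. Here is a minimal sketch, assuming a `firebase_options.dart` file generated by `flutterfire configure` (the file name comes from a standard FlutterFire setup, not from this guide):

    import 'package:firebase_core/firebase_core.dart';
    import 'package:flutter/widgets.dart';

    import 'firebase_options.dart'; // generated by `flutterfire configure` (assumed)

    Future<void> main() async {
      // Needed so plugins can use platform channels before runApp().
      WidgetsFlutterBinding.ensureInitialized();

      // Initialize Firebase once; every Firebase plugin, including
      // firebase_ml_model_downloader, depends on this call.
      await Firebase.initializeApp(
        options: DefaultFirebaseOptions.currentPlatform,
      );

      runApp(const Placeholder()); // replace with your app's root widget
    }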
1. Deploy your model
--------------------

Deploy your custom TensorFlow models using either the Firebase console or
the Firebase Admin Python and Node.js SDKs. See
[Deploy and manage custom models](/docs/ml/manage-hosted-models).

After you add a custom model to your Firebase project, you can reference the
model in your apps using the name you specified. At any time, you can deploy a
new TensorFlow Lite model and download the new model onto users' devices by
calling `getModel()` (see below).
2. Download the model to the device and initialize a TensorFlow Lite interpreter
--------------------------------------------------------------------------------

To use your TensorFlow Lite model in your app, first use the model downloader
to download the latest version of the model to the device. Then, instantiate a
TensorFlow Lite interpreter with the model.

To start the model download, call the model downloader's `getModel()` method,
specifying the name you assigned the model when you uploaded it, whether you
want to always download the latest model, and the conditions under which you
want to allow downloading.

You can choose from three download behaviors:
| Download type | Description |
|---|---|
| `localModel` | Get the local model from the device. If there is no local model available, this behaves like `latestModel`. Use this download type if you are not interested in checking for model updates. For example, you're using Remote Config to retrieve model names and you always upload models under new names (recommended). |
| `localModelUpdateInBackground` | Get the local model from the device and start updating the model in the background. If there is no local model available, this behaves like `latestModel`. |
| `latestModel` | Get the latest model. If the local model is the latest version, returns the local model. Otherwise, download the latest model. This behavior will block until the latest version is downloaded (not recommended). Use this behavior only in cases where you explicitly need the latest version. |
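As the `localModel` row above suggests, a common pattern is to distribute model names through Remote Config and upload each new model version under a new name. A minimal sketch of that pattern, assuming the `firebase_remote_config` plugin and a hypothetical parameter named `current_model_name`:

    import 'package:firebase_ml_model_downloader/firebase_ml_model_downloader.dart';
    import 'package:firebase_remote_config/firebase_remote_config.dart';

    Future<CustomModel> fetchConfiguredModel() async {
      final remoteConfig = FirebaseRemoteConfig.instance;

      // Pull the latest parameter values and make them active.
      await remoteConfig.fetchAndActivate();

      // "current_model_name" is a hypothetical parameter you point at the
      // newest model name each time you upload a model under a new name.
      final modelName = remoteConfig.getString('current_model_name');

      // Because every model version ships under a fresh name, localModel is
      // enough: an unknown name has no local copy, so it gets downloaded.
      return FirebaseModelDownloader.instance.getModel(
        modelName,
        FirebaseModelDownloadType.localModel,
        FirebaseModelDownloadConditions(androidWifiRequired: true),
      );
    }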
You should disable model-related functionality (for example, grey out or
hide part of your UI) until you confirm the model has been downloaded.
    FirebaseModelDownloader.instance
        .getModel(
          "yourModelName",
          FirebaseModelDownloadType.localModel,
          FirebaseModelDownloadConditions(
            iosAllowsCellularAccess: true,
            iosAllowsBackgroundDownloading: false,
            androidChargingRequired: false,
            androidWifiRequired: false,
            androidDeviceIdleRequired: false,
          )
        )
        .then((customModel) {
          // Download complete. Depending on your app, you could enable the ML
          // feature, or switch from the local model to the remote model, etc.

          // The CustomModel object contains the local path of the model file,
          // which you can use to instantiate a TensorFlow Lite interpreter.
          final localModelPath = customModel.file;

          // ...
        });
Many apps start the download task in their initialization code, but you can do
so at any point before you need to use the model.
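For instance, you could start the download once during startup and cache the resulting Future so later code awaits the same download. A minimal sketch; the `ModelHolder` class and the model name are hypothetical:

    import 'package:firebase_ml_model_downloader/firebase_ml_model_downloader.dart';

    /// Hypothetical holder that starts the download once at startup and lets
    /// the rest of the app await the same Future later on.
    class ModelHolder {
      ModelHolder._();
      static final ModelHolder instance = ModelHolder._();

      Future<CustomModel>? _model;

      /// Call from your initialization code (for example, right after
      /// Firebase.initializeApp()); safe to call again before inference.
      Future<CustomModel> warmUp() {
        return _model ??= FirebaseModelDownloader.instance.getModel(
          'yourModelName', // the name you used when deploying the model
          FirebaseModelDownloadType.localModelUpdateInBackground,
          FirebaseModelDownloadConditions(),
        );
      }
    }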
3. Perform inference on input data
----------------------------------

Now that you have your model file on the device, you can use it with the
TensorFlow Lite interpreter to perform inference. In the absence of a
maintained TensorFlow Lite library for Dart, you will need to integrate with
the [native TensorFlow Lite libraries](https://www.tensorflow.org/lite)
for iOS and Android.
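One community-maintained option for that integration is the `tflite_flutter` plugin, which binds the native TensorFlow Lite libraries over FFI; it is not covered by this guide. A minimal sketch of wiring the downloaded model file into its interpreter, where the model name and the input/output shapes are hypothetical and depend entirely on your model:

    import 'package:firebase_ml_model_downloader/firebase_ml_model_downloader.dart';
    import 'package:tflite_flutter/tflite_flutter.dart';

    Future<List<double>> runInference(List<double> inputFeatures) async {
      final customModel = await FirebaseModelDownloader.instance.getModel(
        'yourModelName',
        FirebaseModelDownloadType.localModel,
        FirebaseModelDownloadConditions(),
      );

      // customModel.file is the downloaded .tflite file in local storage.
      final interpreter = Interpreter.fromFile(customModel.file);

      // Hypothetical shapes: a single 1 x N float input and a 1 x 10 float
      // output. Check your own model's signature for the real shapes/types.
      final input = [inputFeatures];
      final output = [List<double>.filled(10, 0)];

      interpreter.run(input, output);
      interpreter.close();

      return output.first;
    }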
Appendix: Model security
------------------------

Regardless of how you make your TensorFlow Lite models available to
Firebase ML, Firebase ML stores them in the standard serialized protobuf format
in local storage.

In theory, this means that anybody can copy your model. However, in practice,
most models are so application-specific and obfuscated by optimizations that
the risk is similar to that of competitors disassembling and reusing your code.
Nevertheless, you should be aware of this risk before you use a custom model in
your app.