You can deploy and manage custom models and AutoML-trained models using either the Firebase console or the Firebase Admin Python and Node.js SDKs. If you just want to deploy a model and occasionally update it, the Firebase console is usually simplest. The Admin SDK can be helpful when integrating with build pipelines, working with Colab or Jupyter notebooks, and in other workflows.
Deploy and manage models in the Firebase console
TensorFlow Lite models
To deploy a TensorFlow Lite model using the Firebase console:
- Open the Firebase ML Custom model page in the Firebase console.
- Click Add custom model (or Add another model).
- Specify a name that will be used to identify your model in your Firebase project, then upload the TensorFlow Lite model file (usually ending in .tflite or .lite).
After you deploy your model, you can find it on the Custom page. From there, you can complete tasks such as updating the model with a new file, downloading the model, and deleting the model from your project.
Deploy and manage models with the Firebase Admin SDK
This section shows how to complete common model deployment and management tasks with the Admin SDK. For additional help, see the SDK reference for Python or Node.js.
For examples of the SDK in use, see the Python quickstart sample and the Node.js quickstart sample.
Before you begin
If you don't already have a Firebase project, create a new project in the Firebase console. Then, open your project and do the following:
- On the Settings page, create a service account and download the service account key file. Keep this file safe, since it grants administrator access to your project.
- On the Storage page, enable Cloud Storage. Take note of your bucket name. You need a Cloud Storage bucket to temporarily store model files while adding them to your Firebase project. If you are on the Blaze plan, you can create and use a bucket other than the default bucket for this purpose.
- On the Firebase ML page, click Get started if you haven't yet enabled Firebase ML.
- In the Google APIs console, open your Firebase project and enable the Firebase ML API.
When you initialize the SDK, specify your service account credentials and the Cloud Storage bucket you want to use to store models:
Python
import firebase_admin
from firebase_admin import ml
from firebase_admin import credentials

firebase_admin.initialize_app(
    credentials.Certificate('/path/to/your/service_account_key.json'),
    options={
        'storageBucket': 'your-storage-bucket',
    })
Node.js
const admin = require('firebase-admin');
const serviceAccount = require('/path/to/your/service_account_key.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: 'your-storage-bucket',
});

const ml = admin.machineLearning();
Deploy models
TensorFlow Lite files
To deploy a TensorFlow Lite model from a model file, upload it to your project and then publish it:
Python
# First, import and initialize the SDK as shown above.
# Load a tflite file and upload it to Cloud Storage
source = ml.TFLiteGCSModelSource.from_tflite_model_file('example.tflite')
# Create the model object
tflite_format = ml.TFLiteFormat(model_source=source)
model = ml.Model(
display_name="example_model", # This is the name you use from your app to load the model.
tags=["examples"], # Optional tags for easier management.
model_format=tflite_format)
# Add the model to your Firebase project and publish it
new_model = ml.create_model(model)
ml.publish_model(new_model.model_id)
Node.js
// First, import and initialize the SDK as shown above.
(async () => {
// Upload the tflite file to Cloud Storage
const storageBucket = admin.storage().bucket('your-storage-bucket');
const files = await storageBucket.upload('./example.tflite');
// Create the model object and add the model to your Firebase project.
const bucket = files[0].metadata.bucket;
const name = files[0].metadata.name;
const gcsUri = `gs://${bucket}/${name}`;
const model = await ml.createModel({
displayName: 'example_model', // This is the name you use from your app to load the model.
tags: ['examples'], // Optional tags for easier management.
tfliteModel: { gcsTfliteUri: gcsUri },
});
// Publish the model.
await ml.publishModel(model.modelId);
process.exit();
})().catch(console.error);
TensorFlow and Keras models
With the Python SDK, you can convert a model from TensorFlow saved-model format to TensorFlow Lite and upload it to your Cloud Storage bucket in a single step. Then, deploy it the same way you deploy a TensorFlow Lite file.
Python
# First, import and initialize the SDK as shown above.
# Convert the model to TensorFlow Lite and upload it to Cloud Storage
source = ml.TFLiteGCSModelSource.from_saved_model('./model_directory')
# Create the model object
tflite_format = ml.TFLiteFormat(model_source=source)
model = ml.Model(
display_name="example_model", # This is the name you use from your app to load the model.
tags=["examples"], # Optional tags for easier management.
model_format=tflite_format)
# Add the model to your Firebase project and publish it
new_model = ml.create_model(model)
ml.publish_model(new_model.model_id)
If you have a Keras model, you can also convert it to TensorFlow Lite and upload it in a single step. You can use a Keras model saved to an HDF5 file:
Python
import tensorflow as tf
# Load a Keras model, convert it to TensorFlow Lite, and upload it to Cloud
# Storage
model = tf.keras.models.load_model('your_model.h5')
source = ml.TFLiteGCSModelSource.from_keras_model(model)
# Create the model object, add the model to your project, and publish it. (See
# above.)
# ...
Or, you can convert and upload a Keras model straight from your training script:
Python
import tensorflow as tf
# Create a simple Keras model.
x = [-1, 0, 1, 2, 3, 4]
y = [-3, -1, 1, 3, 5, 7]
model = tf.keras.models.Sequential(
[tf.keras.layers.Dense(units=1, input_shape=[1])])
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(x, y, epochs=3)
# Convert the model to TensorFlow Lite and upload it to Cloud Storage
source = ml.TFLiteGCSModelSource.from_keras_model(model)
# Create the model object, add the model to your project, and publish it. (See
# above.)
# ...
AutoML TensorFlow Lite models
If you trained an Edge model using either the AutoML Cloud API or the Google Cloud console UI, you can deploy the model to Firebase using the Admin SDK.
You will need to specify the model's resource identifier, which is a string that looks like the following example:
projects/PROJECT_NUMBER/locations/STORAGE_LOCATION/models/MODEL_ID

Placeholder | Description
---|---
PROJECT_NUMBER | The project number of the Cloud Storage bucket that contains the model. This can be your Firebase project or another Google Cloud project. You can find this value on the Settings page of the Firebase console or the Google Cloud console dashboard.
STORAGE_LOCATION | The resource location of the Cloud Storage bucket that contains the model. This value is always us-central1.
MODEL_ID | The model's ID, which you got from the AutoML Cloud API.

Python
# First, import and initialize the SDK as shown above.
# Get a reference to the AutoML model
source = ml.TFLiteAutoMlSource('projects/{}/locations/{}/models/{}'.format(
# See above for information on these values.
project_number,
storage_location,
model_id
))
# Create the model object
tflite_format = ml.TFLiteFormat(model_source=source)
model = ml.Model(
display_name="example_model", # This is the name you will use from your app to load the model.
tags=["examples"], # Optional tags for easier management.
model_format=tflite_format)
# Add the model to your Firebase project and publish it
new_model = ml.create_model(model)
new_model.wait_for_unlocked()
ml.publish_model(new_model.model_id)
Node.js
// First, import and initialize the SDK as shown above.
(async () => {
// Get a reference to the AutoML model. See above for information on these
// values.
const automlModel = `projects/${projectNumber}/locations/${storageLocation}/models/${modelId}`;
// Create the model object and add the model to your Firebase project.
const model = await ml.createModel({
displayName: 'example_model', // This is the name you use from your app to load the model.
tags: ['examples'], // Optional tags for easier management.
tfliteModel: { automlModel: automlModel },
});
// Wait for the model to be ready.
await model.waitForUnlocked();
// Publish the model.
await ml.publishModel(model.modelId);
process.exit();
})().catch(console.error);
List your project's models
You can list your project's models, optionally filtering the results:
Python
# First, import and initialize the SDK as shown above.
face_detectors = ml.list_models(list_filter="tags: face_detector").iterate_all()
print("Face detection models:")
for model in face_detectors:
print('{} (ID: {})'.format(model.display_name, model.model_id))
Node.js
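The Node.js example appears to be missing from this section; the following is a minimal sketch modeled on the listModels pagination loop used in the deletion example later on this page. The helper name listFaceDetectors is ours, and mlClient stands in for the initialized admin.machineLearning() client.

```javascript
// Sketch: list models tagged "face_detector", following pages of results.
// `mlClient` is assumed to be the client returned by admin.machineLearning().
const listOptions = { filter: 'tags: face_detector' };

async function listFaceDetectors(mlClient) {
  const results = [];
  let models;
  let pageToken = null;
  do {
    if (pageToken) listOptions.pageToken = pageToken;
    // listModels returns one page of models plus a token for the next page.
    ({ models, pageToken } = await mlClient.listModels(listOptions));
    for (const model of models) {
      results.push(`${model.displayName} (ID: ${model.modelId})`);
    }
  } while (pageToken != null);
  return results;
}
```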
You can filter by the following fields:

Field | Examples
---|---
display_name | display_name = example_model, display_name != example_model. All display names with the experimental_ prefix: display_name : experimental_* (note that only prefix matching is supported).
tags | tags: face_detector, tags: face_detector AND tags: experimental
state.published | state.published = true, state.published = false

Combine filters with the AND, OR, and NOT operators and parentheses ( ( and ) ).
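For instance, a combined filter expression might look like the following (a sketch only; the tag and display-name values are illustrative):

```python
# Illustrative combined filter expression; pass it to ml.list_models()
# as the list_filter argument, as in the Python example above.
combined_filter = (
    "tags: face_detector AND "
    "(state.published = true OR display_name: experimental_*)"
)
```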
Update models
After you add a model to your project, you can update its display name, tags, and tflite model file:
Python
# First, import and initialize the SDK as shown above.
model = ... # Model object from create_model(), get_model(), or list_models()
# Update the model with a new tflite model. (You could also update with a
# `TFLiteAutoMlSource`)
source = ml.TFLiteGCSModelSource.from_tflite_model_file('example_v2.tflite')
model.model_format = ml.TFLiteFormat(model_source=source)
# Update the model's display name.
model.display_name = "example_model"
# Update the model's tags.
model.tags = ["examples", "new_models"]
# Add a new tag.
model.tags += ["experimental"]
# After you change the fields you want to update, save the model changes to
# Firebase and publish it.
updated_model = ml.update_model(model)
ml.publish_model(updated_model.model_id)
Node.js
// First, import and initialize the SDK as shown above.
(async () => {
const model = ... // Model object from createModel(), getModel(), or listModels()
// Upload a new tflite file to Cloud Storage.
const files = await storageBucket.upload('./example_v2.tflite');
const bucket = files[0].metadata.bucket;
const name = files[0].metadata.name;
// Update the model. Any fields you omit will be unchanged.
await ml.updateModel(model.modelId, {
displayName: 'example_model', // Update the model's display name.
tags: model.tags.concat(['new']), // Add a tag.
tfliteModel: {gcsTfliteUri: `gs://${bucket}/${name}`},
});
process.exit();
})().catch(console.error);
Unpublish or delete models
To unpublish or delete a model, pass the model ID to the unpublish or delete methods. When you unpublish a model, it remains in your project, but is not available for your apps to download. When you delete a model, it is completely removed from your project. (Unpublishing a model is not expected in the standard workflow, but you can use it to immediately unpublish a new model you accidentally published and that isn't being used anywhere yet, or in cases where it would be worse for users to download a "bad" model than to get model-not-found errors.)
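The unpublish call follows the same listing pattern as the delete example below. Here is a minimal Python sketch; the helper name unpublish_models_by_tag is ours, not part of the SDK, and ml_module is assumed to be the firebase_admin.ml module imported and initialized as shown above.

```python
def unpublish_models_by_tag(ml_module, tag):
    # Unpublish every model carrying the given tag. The models stay in
    # the project, but apps can no longer download them.
    models = ml_module.list_models(
        list_filter="tags: {}".format(tag)).iterate_all()
    for model in models:
        ml_module.unpublish_model(model.model_id)
```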
If you no longer have a reference to the Model object, you will probably need to get the model's ID by listing your project's models with a filter. For example, to delete all models tagged "face_detector":
Python
# First, import and initialize the SDK as shown above.
face_detectors = ml.list_models(list_filter="tags: 'face_detector'").iterate_all()
for model in face_detectors:
ml.delete_model(model.model_id)
Node.js
// First, import and initialize the SDK as shown above.
(async () => {
let listOptions = {filter: 'tags: face_detector'}
let models;
let pageToken = null;
do {
if (pageToken) listOptions.pageToken = pageToken;
({models, pageToken} = await ml.listModels(listOptions));
for (const model of models) {
await ml.deleteModel(model.modelId);
}
} while (pageToken != null);
process.exit();
})().catch(console.error);