Deploying a locally trained model is a supported use case; the instructions are essentially the same regardless of where you trained it:
To deploy a model version you will need:

A TensorFlow SavedModel saved on Google Cloud Storage. You can get one by:
- Following Cloud ML Engine's training steps to train in the cloud.
- Training elsewhere and exporting a SavedModel.
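Once you have a SavedModel directory, deployment is a copy to Cloud Storage plus creating a model and version. A minimal sketch, assuming a local SavedModel in `my_model/` and placeholder names for the bucket (`gs://my-bucket`), model (`poets_model`), and version (`v1`):

```shell
# Copy the exported SavedModel directory to Google Cloud Storage.
gsutil cp -r my_model gs://my-bucket/my_model

# Create the model resource, then a version pointing at the SavedModel.
gcloud ml-engine models create poets_model --regions us-central1
gcloud ml-engine versions create v1 \
    --model poets_model \
    --origin gs://my-bucket/my_model
```

The `--origin` flag must point at the directory containing `saved_model.pb`, not at the file itself.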
Unfortunately, TensorFlow for Poets does not show how to export a SavedModel (I've filed a feature request to address that). In the meantime, you can write a "converter" script like the following (alternatively, you could do this at the end of training instead of saving out graph.pb and reading it back in):
import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder

input_graph = 'graph.pb'
saved_model_dir = 'my_model'

with tf.Graph().as_default() as graph:
    # Read in the exported graph.
    with tf.gfile.FastGFile(input_graph, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name='')

    # CloudML Engine and early versions of TensorFlow Serving do
    # not currently support graphs without variables. Add a
    # prosthetic variable.
    dummy_var = tf.Variable(0)

    # Define the SavedModel signature (inputs and outputs).
    in_image = graph.get_tensor_by_name('DecodeJpeg/contents:0')
    inputs = {'image_bytes':
              tf.saved_model.utils.build_tensor_info(in_image)}

    out_classes = graph.get_tensor_by_name('final_result:0')
    outputs = {'prediction':
               tf.saved_model.utils.build_tensor_info(out_classes)}

    signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs=inputs,
        outputs=outputs,
        method_name='tensorflow/serving/predict')

    # Save out the SavedModel.
    with tf.Session(graph=graph) as sess:
        # The prosthetic variable must be initialized before saving.
        sess.run(tf.global_variables_initializer())
        b = saved_model_builder.SavedModelBuilder(saved_model_dir)
        b.add_meta_graph_and_variables(
            sess,
            [tf.saved_model.tag_constants.SERVING],
            signature_def_map={'predict_images': signature})
        b.save()
(Untested code, based on this codelab and this SO post.)
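After the version is deployed, online prediction requests are plain JSON. Because the signature above names its input `image_bytes`, CloudML Engine expects the raw JPEG bytes wrapped in a `{"b64": ...}` object (inputs whose names end in `_bytes` are base64-decoded server-side). A minimal sketch of building such a request body with the standard library; `build_instance` is a hypothetical helper name:

```python
import base64
import json

def build_instance(jpeg_bytes):
    # Wrap raw image bytes the way CloudML Engine expects for an
    # input tensor named 'image_bytes': a base64 string under "b64".
    return {'image_bytes': {'b64': base64.b64encode(jpeg_bytes).decode('utf-8')}}

# Stand-in bytes; in practice, read an actual JPEG file.
body = json.dumps({'instances': [build_instance(b'\xff\xd8fake')]})
print(body)
```

The resulting JSON can be sent with `gcloud ml-engine predict --json-instances` or a direct REST call to the predict endpoint.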
If you'd like the output to use string labels instead of integer indices, make the following change:
# Load the label file, stripping off trailing newlines.
label_lines = [line.rstrip() for line
               in tf.gfile.GFile("retrained_labels.txt")]
out_classes = graph.get_tensor_by_name('final_result:0')
out_labels = tf.gather(label_lines, out_classes)
outputs = {'prediction':
           tf.saved_model.utils.build_tensor_info(out_labels)}
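The `tf.gather` call simply uses the model's integer class index to pick the corresponding line of retrained_labels.txt. A pure-Python sketch of the same lookup, with hypothetical label-file contents:

```python
# What tf.gather does here, in plain Python: index i in the model's
# output selects line i of the label file.
label_lines = ['daisy', 'dandelion', 'roses']  # hypothetical labels
predicted_index = 2
predicted_label = label_lines[predicted_index]
print(predicted_label)  # → roses
```

Note the order of `label_lines` must match the order used during retraining, since the lookup is purely positional.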