2017-10-07 95 views

TensorFlow: getting predictions out of a graph file (.pb file)

I have a TensorFlow model stored as a graph file (a .pb file); the purpose of the model is to serve predictions on certain images.

I have written code that loads the graph file, but I am not able to run the session. The files provided are:

  • training_model_saved_model.pb
  • variables
    • training_model_variables_variables.data-00000-of-00001
    • training_model_variables_variables.index

The error output contains a long list of the model's layers. What can I do in this case? Any help is appreciated.

The code I use to load/run the model:

import tensorflow as tf 
import sys 
import os 



import matplotlib.image as mpimg 
import matplotlib.pyplot as plt 


from tensorflow.core.protobuf import saved_model_pb2 
from tensorflow.python.util import compat 
from tensorflow.python.platform import gfile 

export_dir = os.path.join("./", "variables/") 
filename = "imgpsh_fullsize.jpeg" 
raw_image_data = mpimg.imread(filename) 

g = tf.Graph() 
with tf.Session(graph=g) as sess:
    model_filename = 'training_model_saved_model.pb'
    with gfile.FastGFile(model_filename, 'rb') as f:
        data = compat.as_bytes(f.read())
        sm = saved_model_pb2.SavedModel()
        sm.ParseFromString(data)
        # print(sm)
        if len(sm.meta_graphs) != 1:
            print('More than one graph found. Not sure which to write')
            sys.exit(1)

        image_input = tf.import_graph_def(sm.meta_graphs[0].graph_def, name='', return_elements=["input"])
        # print(image_input)
        # saver = tf.train.Saver()
        saver = tf.train.import_meta_graph(sm.meta_graphs[0].graph_def)
        '''
        print(image_input)

        x = g.get_tensor_by_name("input:0")

        print(x)
        '''
        saver.restore(sess, model_filename)

        predictions = sess.run(feed_dict={image: raw_image_data})
        print('###################################################')
        print(predictions)

The error is:

Traceback (most recent call last): 
    File "model_Input-get.py", line 35, in <module> 
    saver = tf.train.import_meta_graph(sm.meta_graphs[0].graph_def) 
    File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/training/saver.py", line 1691, in import_meta_graph 
    meta_graph_def = meta_graph.read_meta_graph_file(meta_graph_or_file) 
    File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/meta_graph.py", line 553, in read_meta_graph_file 
    if not file_io.file_exists(filename): 
    File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/lib/io/file_io.py", line 252, in file_exists 
    pywrap_tensorflow.FileExists(compat.as_bytes(filename), status) 
    File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/util/compat.py", line 65, in as_bytes 
    (bytes_or_text,)) 
TypeError: Expected binary or unicode string, got node { 
    name: "input" 
    op: "Placeholder" 
    attr { 
    key: "_output_shapes" 
    value { 
     list { 
     shape { 
      dim { 
      size: -1 
      } 
     } 
     } 
    } 
    } 
    attr { 
    key: "dtype" 
    value { 
     type: DT_STRING 
    } 
    } 

Answer


You seem to be mixing the TensorFlow Serving SavedModel format with the regular TensorFlow export/restore functionality.

This is a particularly confusing corner of the TensorFlow codebase: the SavedModel format was not well documented when it first appeared, and there are few examples showing when to use it versus the original format.

My suggestion is to either 1) switch to TF Serving and keep using the SavedModel format, or 2) stick with the original export/restore model format.
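For option 1, the directory the asker already has (a saved_model.pb next to a variables/ subdirectory) is exactly what the SavedModel loader consumes, with no manual proto parsing, import_meta_graph, or saver.restore needed. A minimal sketch, assuming TF 1.x-style APIs (the shim falls back to tf.compat.v1 on TF 2.x); the toy model and the tensor names input:0/output:0 are illustrative stand-ins, not the asker's real graph:

```python
import os
import tempfile

import numpy as np
import tensorflow as tf

if hasattr(tf, "compat") and hasattr(tf.compat, "v1"):
    tf = tf.compat.v1          # running under TF 2.x: use the 1.x API
    tf.disable_eager_execution()

export_dir = os.path.join(tempfile.mkdtemp(), "model")

# Build and export a toy graph in the SavedModel format, standing in for
# the asker's real export (produces saved_model.pb + variables/).
with tf.Graph().as_default():
    x = tf.placeholder(tf.float32, shape=[None, 3], name="input")
    y = tf.identity(x * 2.0, name="output")
    with tf.Session() as sess:
        tf.saved_model.simple_save(sess, export_dir,
                                   inputs={"input": x}, outputs={"output": y})

# Load it back: loader.load restores both the graph structure and the
# variable values (from the variables/ subdirectory) in one call.
graph = tf.Graph()
with tf.Session(graph=graph) as sess:
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    x_t = graph.get_tensor_by_name("input:0")
    y_t = graph.get_tensor_by_name("output:0")
    preds = sess.run(y_t, feed_dict={x_t: np.ones((1, 3), dtype=np.float32)})
    print(preds)  # [[2. 2. 2.]]
```

The tag passed to loader.load must match the tag the model was exported with; simple_save uses the `serve` tag, but a model exported differently may need a different tag constant.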


I feel lost restoring the graph file. I used SavedModel because when I try 'graph_def = tf.GraphDef(); graph_def.ParseFromString(f.read()); g_in = tf.import_graph_def(graph_def)' I get a protobuf.message.DecodeError –
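The DecodeError in this comment is expected: saved_model.pb is a serialized SavedModel protocol buffer, which wraps one or more MetaGraphDefs, each of which contains a GraphDef, so its bytes cannot be parsed as a bare GraphDef. A sketch of the wrapping, building a throwaway proto in memory rather than reading the asker's actual file:

```python
from google.protobuf.message import DecodeError
from tensorflow.core.framework import graph_pb2
from tensorflow.core.protobuf import saved_model_pb2

# Build a minimal SavedModel proto in memory, standing in for saved_model.pb.
sm = saved_model_pb2.SavedModel()
sm.saved_model_schema_version = 1
node = sm.meta_graphs.add().graph_def.node.add()
node.name = "input"
node.op = "Placeholder"
data = sm.SerializeToString()

# Parsing the SavedModel bytes as a bare GraphDef either raises DecodeError
# or silently yields a meaningless GraphDef -- the wire formats differ.
try:
    graph_pb2.GraphDef().ParseFromString(data)
except DecodeError:
    pass

# The correct unwrapping: SavedModel -> MetaGraphDef -> GraphDef.
sm2 = saved_model_pb2.SavedModel()
sm2.ParseFromString(data)
graph_def = sm2.meta_graphs[0].graph_def
print(graph_def.node[0].op)  # Placeholder
```

Note that a GraphDef recovered this way holds only the graph structure; the trained weights live in the variables/ files, which is why the SavedModel loader (or TF Serving) is the right tool here rather than tf.train.import_meta_graph.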


What edits do you suggest in the code? Could you please provide more details? What do you mean by switching to TF Serving? I have heard about bazel; how does it apply here? –