
Is there an elegant way to write a SparseTensor to a tfrecord file and read it back?

Right now the only approach I can think of is to store the SparseTensor's indices (tf.int64), values (tf.float32), and shape (tf.int64) as three separate features (the first two as VarLenFeature, the last one as FixedLenFeature). That feels cumbersome.
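For reference, here is roughly what that would look like (a quick sketch against the TF 1.x API; sparse_to_example and parse_sparse are just names I made up, and it assumes the rank of the SparseTensor is known in advance, e.g. 2):

import tensorflow as tf

# sp is an evaluated SparseTensorValue (e.g. obtained from sess.run(sparse_tensor))
def sparse_to_example(sp):
    # flatten the [nnz, ndims] indices so they fit into a single Int64List
    example = tf.train.Example(features=tf.train.Features(feature={
        'indices': tf.train.Feature(
            int64_list=tf.train.Int64List(value=sp.indices.reshape(-1).tolist())),
        'values': tf.train.Feature(
            float_list=tf.train.FloatList(value=sp.values.tolist())),
        'shape': tf.train.Feature(
            int64_list=tf.train.Int64List(value=list(sp.dense_shape))),
        }))
    return example.SerializeToString()

def parse_sparse(example_proto, ndims=2):
    parsed = tf.parse_single_example(example_proto, features={
        'indices': tf.VarLenFeature(tf.int64),
        'values': tf.VarLenFeature(tf.float32),
        'shape': tf.FixedLenFeature([ndims], dtype=tf.int64),
        })
    # un-flatten the indices and rebuild the SparseTensor inside the graph
    indices = tf.reshape(tf.sparse_tensor_to_dense(parsed['indices']), [-1, ndims])
    values = tf.sparse_tensor_to_dense(parsed['values'])
    return tf.SparseTensor(indices, values, parsed['shape'])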

Any advice is appreciated!

Update 1

My answer below is not well suited for building the computation graph (because the contents of the sparse tensor have to be fetched via sess.run(), which costs a lot of time if it is called repeatedly).

Inspired by mrry's answer, I thought maybe I could obtain the bytes produced by tf.serialize_sparse so that I could later restore the SparseTensor with tf.deserialize_many_sparse. But tf.serialize_sparse is not implemented in pure Python (it calls the external function SerializeSparse), which means I still need sess.run() to get the bytes. How can I get a pure-Python version of SerializeSparse? Thanks.
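For reference, here is a rough sketch of that route (TF 1.x API; the tensor values and the _parse name are just illustrative). The serialized bytes still have to be materialized with sess.run() at write time, which is exactly the step I would like to avoid:

import tensorflow as tf

# a SparseTensor in the graph and its serialized form: a [3] string tensor
# (indices, values, shape packed into bytes by the SerializeSparse op)
st = tf.SparseTensor(indices=[[0, 0], [1, 2]], values=[1.0, 2.0], dense_shape=[3, 4])
serialized = tf.serialize_sparse(st)

with tf.Session() as sess:
    ser_bytes = sess.run(serialized)  # 3 byte strings, ready for a BytesList feature

# on the read side, the bytes can be turned back into a SparseTensor in-graph
def _parse(example_proto):
    parsed = tf.parse_single_example(example_proto, features={
        'sparse': tf.FixedLenFeature([3], dtype=tf.string)})
    return parsed['sparse']

# tf.deserialize_many_sparse expects a leading batch dimension, so batch first:
# dataset = tf.data.TFRecordDataset(filename).map(_parse).batch(batch_size)
# serialized_batch = dataset.make_one_shot_iterator().get_next()   # shape [batch, 3]
# sp = tf.deserialize_many_sparse(serialized_batch, dtype=tf.float32)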

Answer

Since TensorFlow currently supports only three feature types in tfrecord files (Float, Int64, and Bytes), and a SparseTensor usually involves more than one dtype, my solution is to convert the SparseTensor to Bytes with pickle.

Here is some example code:

import tensorflow as tf 
import pickle 
import numpy as np 
from scipy.sparse import csr_matrix 

#---------------------------------# 
# Write to a tfrecord file 

# create two sparse matrices (simulate the values from .eval() of SparseTensor) 
a = csr_matrix(np.arange(12).reshape((4,3))) 
b = csr_matrix(np.random.rand(20).reshape((5,4))) 

# convert them to pickle bytes 
p_a = pickle.dumps(a) 
p_b = pickle.dumps(b) 

# put the bytes in context_list and feature_list 
## save p_a in context_lists 
context_lists = tf.train.Features(feature={ 
    'context_a': tf.train.Feature(bytes_list=tf.train.BytesList(value=[p_a])) 
    }) 
## save p_b as a one element sequence in feature_lists 
p_b_features = [tf.train.Feature(bytes_list=tf.train.BytesList(value=[p_b]))] 
feature_lists = tf.train.FeatureLists(feature_list={ 
    'features_b': tf.train.FeatureList(feature=p_b_features) 
    }) 

# create the SequenceExample 
SeqEx = tf.train.SequenceExample(
    context = context_lists, 
    feature_lists = feature_lists 
    ) 
SeqEx_serialized = SeqEx.SerializeToString() 

# write to a tfrecord file 
tf_FWN = 'test_pickle1.tfrecord' 
tf_writer1 = tf.python_io.TFRecordWriter(tf_FWN) 
tf_writer1.write(SeqEx_serialized) 
tf_writer1.close() 

#---------------------------------# 
# Read from the tfrecord file 

# first, define the parse function 
def _parse_SE_test_pickle1(in_example_proto): 
    context_features = { 
     'context_a': tf.FixedLenFeature([], dtype=tf.string) 
     } 
    sequence_features = { 
     'features_b': tf.FixedLenSequenceFeature([1], dtype=tf.string) 
     } 
    context, sequence = tf.parse_single_sequence_example(
     in_example_proto, 
     context_features=context_features, 
     sequence_features=sequence_features 
    ) 
    p_a_tf = context['context_a'] 
    p_b_tf = sequence['features_b'] 

    return tf.tuple([p_a_tf, p_b_tf]) 

# use the Dataset API to read 
dataset = tf.data.TFRecordDataset(tf_FWN) 
dataset = dataset.map(_parse_SE_test_pickle1) 
dataset = dataset.batch(1) 
iterator = dataset.make_initializable_iterator() 
next_element = iterator.get_next() 

sess = tf.InteractiveSession() 
sess.run(tf.global_variables_initializer()) 
sess.run(iterator.initializer) 

[p_a_bat, p_b_bat] = sess.run(next_element) 

# 1st index refers to the batch, 2nd and 3rd indices refer to the sequence position (only for b) 
rec_a = pickle.loads(p_a_bat[0]) 
rec_b = pickle.loads(p_b_bat[0][0][0]) 

# check whether the recovered the same as the original ones. 
assert((rec_a - a).nnz == 0) 
assert((rec_b - b).nnz == 0) 

# print the contents 
print("\n------ a -------") 
print(a.todense()) 
print("\n------ rec_a -------") 
print(rec_a.todense()) 
print("\n------ b -------") 
print(b.todense()) 
print("\n------ rec_b -------") 
print(rec_b.todense()) 

Here is what I got:

------ a ------- 
[[ 0 1 2] 
[ 3 4 5] 
[ 6 7 8] 
[ 9 10 11]] 

------ rec_a ------- 
[[ 0 1 2] 
[ 3 4 5] 
[ 6 7 8] 
[ 9 10 11]] 

------ b ------- 
[[ 0.88612402 0.51438017 0.20077887 0.20969243] 
[ 0.41762425 0.47394715 0.35596051 0.96074408] 
[ 0.35491739 0.0761953 0.86217511 0.45796474] 
[ 0.81253723 0.57032448 0.94959189 0.10139615] 
[ 0.92177499 0.83519464 0.96679833 0.41397829]] 

------ rec_b ------- 
[[ 0.88612402 0.51438017 0.20077887 0.20969243] 
[ 0.41762425 0.47394715 0.35596051 0.96074408] 
[ 0.35491739 0.0761953 0.86217511 0.45796474] 
[ 0.81253723 0.57032448 0.94959189 0.10139615] 
[ 0.92177499 0.83519464 0.96679833 0.41397829]]