2017-08-02

I want to use a backwards cumulative-sum function: TensorFlow - using a tensor as an index

def _backwards_cumsum(x, length, batch_size):
    upper_triangular_ones = np.float32(np.triu(np.ones((length, length))))
    repeated_tri = np.float32(np.kron(np.eye(batch_size), upper_triangular_ones))
    return tf.matmul(repeated_tri,
                     tf.reshape(x, [length, 1]))
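For reference, the construction above can be checked in plain NumPy with concrete sizes (the sizes below are made up for illustration): multiplying by a block-diagonal upper-triangular matrix of ones sums each entry with everything after it, i.e. a backwards cumulative sum per batch element.

```python
import numpy as np

length, batch_size = 4, 2
x = np.arange(1.0, length * batch_size + 1)  # two batch elements of length 4

# same construction as _backwards_cumsum, with concrete ints
upper_triangular_ones = np.float32(np.triu(np.ones((length, length))))
repeated_tri = np.float32(np.kron(np.eye(batch_size), upper_triangular_ones))
result = repeated_tri @ x.reshape(-1, 1)

# each block is the backwards cumsum of its batch element
expected = np.concatenate([
    np.cumsum(x[i * length:(i + 1) * length][::-1])[::-1]
    for i in range(batch_size)
])
print(result.ravel())  # [10, 9, 7, 4, 26, 21, 15, 8]
```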

But length is a placeholder:

length = tf.placeholder("int32", name='xx')

so every time it receives a new value, _backwards_cumsum should be computed again.

As soon as I try to run the function, I get an error:

TypeError: 'Tensor' object cannot be interpreted as an index 

Full traceback:

TypeError         Traceback (most recent call last) 
<ipython-input-561-970ae9e96aa1> in <module>() 
----> 1 rewards = _backwards_cumsum(tf.reshape(tf.reshape(decays,[-1,1]) * tf.sigmoid(disc_pred_gen_ph), [-1]), _maxx, batch_size) 

<ipython-input-546-5c6928fac357> in _backwards_cumsum(x, length, batch_size) 
     1 def _backwards_cumsum(x, length, batch_size): 
     2 
----> 3  upper_triangular_ones = np.float32(np.triu(np.ones((length, length)))) 
     4  repeated_tri = np.float32(np.kron(np.eye(batch_size), upper_triangular_ones)) 
     5  return tf.matmul(repeated_tri, 

/Users/onivron/anaconda/envs/tensorflow/lib/python2.7/site-packages/numpy/core/numeric.pyc in ones(shape, dtype, order) 
    190 
    191  """ 
--> 192  a = empty(shape, dtype, order) 
    193  multiarray.copyto(a, 1, casting='unsafe') 
    194  return a 

where _maxx is a placeholder for the same length as above.

Is there any workaround?


Hard to tell without the full traceback. Is it a Python interpreter error? A TensorFlow runtime error? etc. –

Answer


The error comes from a Tensor object, length, that you are unknowingly passing to numpy functions. The best way to use numpy functionality inside TensorFlow is tf.py_func:

# Define a new function that depends only on numpy / non-TensorFlow-graph objects
def get_repeated_tri(length, batch_size):
    upper_triangular_ones = np.float32(np.triu(np.ones((length, length))))
    repeated_tri = np.float32(np.kron(np.eye(batch_size), upper_triangular_ones))
    return repeated_tri

# length and batch_size are fed in as tensors here; note that the declared
# output dtype must match what get_repeated_tri returns (float32, not int32)
repeated_tri = tf.py_func(get_repeated_tri, [length, batch_size], tf.float32)

# there are also some size mismatches in your `tf.matmul`
def _backwards_cumsum(repeated_tri, x, length_, batch_size):
    return tf.matmul(repeated_tri, tf.reshape(x, [length_ * batch_size, -1]))

length_ = tf.placeholder(tf.int32, name='length')
# x is a TensorFlow tensor
some_tensor_out = _backwards_cumsum(repeated_tri, x, length_, batch_size)

some_tensor_out_ = sess.run(some_tensor_out, {length_: length})
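As a side note (not part of the original answer): TensorFlow also provides tf.cumsum with a reverse=True argument, which computes a backwards cumulative sum along an axis directly and copes with a dynamic length without building any matrix. Its behaviour matches this plain-NumPy reference:

```python
import numpy as np

def backwards_cumsum_ref(x):
    # NumPy reference for tf.cumsum(x, reverse=True):
    # reverse, accumulate, reverse back
    return np.cumsum(x[::-1])[::-1]

print(backwards_cumsum_ref(np.array([1.0, 2.0, 3.0, 4.0])))  # [10, 9, 7, 4]
```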

The main problem is that I want different lengths to affect repeated_tri. – moreo


Then what is the problem? This code does support different lengths. –
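To illustrate the point in that last comment: get_repeated_tri is a plain Python function, so when it is wrapped in tf.py_func it is re-evaluated on every session run, rebuilding the matrix for whatever length is fed in. A small sketch with arbitrary sizes:

```python
import numpy as np

def get_repeated_tri(length, batch_size):
    # same helper as in the answer above
    upper_triangular_ones = np.float32(np.triu(np.ones((length, length))))
    return np.float32(np.kron(np.eye(batch_size), upper_triangular_ones))

# each call builds a matrix sized for the current length
print(get_repeated_tri(3, 2).shape)  # (6, 6)
print(get_repeated_tri(5, 2).shape)  # (10, 10)
```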