
I read the advice here to always use tf.get_variable(...), though this seems a bit cumbersome when I try to implement a network. Am I using tf.get_variable() correctly?

For example:

import tensorflow as tf

def create_weights(shape, name='weights',
                   initializer=tf.random_normal_initializer(0, 0.1)):
    weights = tf.get_variable(name, shape, initializer=initializer)
    print("weights created named: {}".format(weights.name))
    return weights

# conv, maxpool, fc, dropout and create_bias are helper functions
# I defined elsewhere (omitted here).
def LeNet(in_units, keep_prob):

    # define the network
    with tf.variable_scope("conv1"):
        conv1 = conv(in_units, create_weights([5, 5, 3, 32]), create_bias([32]))
        pool1 = maxpool(conv1)

    with tf.variable_scope("conv2"):
        conv2 = conv(pool1, create_weights([5, 5, 32, 64]), create_bias([64]))
        pool2 = maxpool(conv2)

    # reshape the network to feed it into the fully connected layers
    with tf.variable_scope("flatten"):
        flatten = tf.reshape(pool2, [-1, 1600])
        flatten = dropout(flatten, keep_prob)

    with tf.variable_scope("fc1"):
        fc1 = fc(flatten, create_weights([1600, 120]), biases=create_bias([120]))
        fc1 = dropout(fc1, keep_prob)

    with tf.variable_scope("fc2"):
        fc2 = fc(fc1, create_weights([120, 84]), biases=create_bias([84]))

    with tf.variable_scope("logits"):
        logits = fc(fc2, create_weights([84, 43]), biases=create_bias([43]))

    return logits

I have to use with tf.variable_scope(...) every single time I call create_weights. Moreover, if I want to change the conv1 weights to [7, 7, 3, 32] instead of [5, 5, 3, 32], I have to restart the kernel because the variable already exists. On the other hand, if I use tf.Variable(...) I don't have any of these problems.
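For instance, here is a minimal sketch of what I run into (assuming the TensorFlow 1.x API); tf.reset_default_graph() seems to be a way to rebuild the variable with a new shape without restarting the kernel:

import tensorflow as tf

with tf.variable_scope("conv1"):
    weights = tf.get_variable("weights", [5, 5, 3, 32],
                              initializer=tf.random_normal_initializer(0, 0.1))

# Running this again with a different shape fails with
# "ValueError: Variable conv1/weights already exists":
#
# with tf.variable_scope("conv1"):
#     weights = tf.get_variable("weights", [7, 7, 3, 32])

# Resetting the default graph discards the old variable, so the
# network can be rebuilt with the new shape in the same kernel:
tf.reset_default_graph()
with tf.variable_scope("conv1"):
    weights = tf.get_variable("weights", [7, 7, 3, 32],
                              initializer=tf.random_normal_initializer(0, 0.1))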

Am I using tf.variable_scope(...) incorrectly?

Answer


It seems you cannot change a variable that already exists in a scope, so only by restarting the kernel can you change a variable you defined earlier. (In reality you create a new one, because the previous one has been deleted.)
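If I understand it correctly, the intended use of tf.get_variable is to share an existing variable via reuse=True rather than to redefine it. A minimal sketch (TensorFlow 1.x API assumed; dense is a hypothetical helper, not from the question):

import tensorflow as tf

def dense(x, in_units, out_units):
    # hypothetical helper: creates or fetches "weights" in the current scope
    weights = tf.get_variable("weights", [in_units, out_units],
                              initializer=tf.random_normal_initializer(0, 0.1))
    return tf.matmul(x, weights)

x1 = tf.placeholder(tf.float32, [None, 120])
x2 = tf.placeholder(tf.float32, [None, 120])

with tf.variable_scope("fc"):
    y1 = dense(x1, 120, 84)              # creates fc/weights
with tf.variable_scope("fc", reuse=True):
    y2 = dense(x2, 120, 84)              # fetches the same fc/weights

# tf.Variable(...) would silently create a second, independent variable
# here, which is why tf.get_variable is recommended for weight sharing.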

...

...but this is just my guess. I would be grateful if someone could give a detailed answer.