2017-06-21 1896 views

I need to record the loss history over time and plot it in a graph. Here is the skeleton of my code: how can I get the loss history while using tf.contrib.opt.ScipyOptimizerInterface?

optimizer = tf.contrib.opt.ScipyOptimizerInterface(loss, method='L-BFGS-B', 
options={'maxiter': args.max_iterations, 'disp': print_iterations}) 
optimizer.minimize(sess, loss_callback=append_loss_history) 

with append_loss_history defined as:

def append_loss_history(**kwargs): 
    global step 
    if step % 50 == 0: 
        loss_history.append(loss.eval()) 
    step += 1 

When I look at the verbose output of ScipyOptimizerInterface, the loss actually does decrease over time. But when I print loss_history, the loss stays almost the same over time.

See the documentation: "Variables subject to optimization are updated in-place at the end of optimization" https://www.tensorflow.org/api_docs/python/tf/contrib/opt/ScipyOptimizerInterface. Is this the reason the loss appears unchanged?

Answer


I think that is exactly the problem: the variables themselves are not modified until the end of the optimization (instead they are fed to the session.run calls internally), so evaluating the Tensor "out of band" with loss.eval() reads the unmodified variables. Instead, use the fetches argument of optimizer.minimize to piggyback on the session.run calls that have the updated values fed in:

import tensorflow as tf 

def print_loss(loss_evaled, vector_evaled): 
    print(loss_evaled, vector_evaled) 

vector = tf.Variable([7., 7.], name='vector') 
loss = tf.reduce_sum(tf.square(vector)) 

optimizer = tf.contrib.opt.ScipyOptimizerInterface(
    loss, method='L-BFGS-B', 
    options={'maxiter': 100}) 

with tf.Session() as session: 
    tf.global_variables_initializer().run() 
    optimizer.minimize(session, 
        loss_callback=print_loss, 
        fetches=[loss, vector]) 
    print(vector.eval()) 

(Modified from the example in the documentation.) This prints the Tensors with their updated values:

98.0 [ 7. 7.] 
79.201 [ 6.29289341 6.29289341] 
7.14396e-12 [ -1.88996808e-06 -1.88996808e-06] 
[ -1.88996808e-06 -1.88996808e-06] 
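If you only need the loss curve, the same pattern can be sketched without TensorFlow at all: ScipyOptimizerInterface delegates to scipy.optimize.minimize under the hood, and SciPy's own callback parameter hands you the current parameter vector once per iteration. A minimal sketch on the same quadratic loss (the names record and loss_history here are illustrative, not part of any API):

```python
import numpy as np
from scipy.optimize import minimize

loss_history = []

def loss(x):
    # Same objective as the TensorFlow example: sum of squares
    return float(np.sum(x ** 2))

def record(xk):
    # SciPy calls this once per L-BFGS-B iteration with the
    # current parameter vector, so the values are up to date
    loss_history.append(loss(xk))

result = minimize(loss, x0=[7.0, 7.0], method='L-BFGS-B', callback=record)
print(loss_history)
print(result.x)
```

To plot, you can then pass loss_history straight to matplotlib; the same idea carries over to the fetches approach above, where the callback receives the already-evaluated loss value instead of having to re-evaluate it.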