2017-01-03

I'm running a dense feed-forward neural network in Keras. Two of the outputs use class_weights, and the third output uses sample_weights. For some reason, with verbose output enabled it prints the progress for every batch on a new line instead of updating the same line in place... Has anyone run into this? How can it be fixed? (Original title: Keras verbose training progress bar writes a new line for every batch)
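A minimal sketch of the kind of setup described above; all layer sizes, output names, batch size, and weight values are illustrative guesses, not taken from the original post:

import numpy as np
from keras.models import Model
from keras.layers import Input, Dense

# Dummy data standing in for the real dataset (hypothetical shapes).
n = 2048
X = np.random.rand(n, 100)
y1 = np.eye(10)[np.random.randint(10, size=n)]
y2 = np.eye(10)[np.random.randint(10, size=n)]
y3 = np.random.randint(2, size=(n, 1))

# Dense feed-forward network with three named outputs.
inp = Input(shape=(100,))
h = Dense(64, activation='relu')(inp)
out1 = Dense(10, activation='softmax', name='x1')(h)
out2 = Dense(10, activation='softmax', name='x2')(h)
out3 = Dense(1, activation='sigmoid', name='x3')(h)

model = Model(inputs=inp, outputs=[out1, out2, out3])
model.compile(optimizer='adam',
              loss={'x1': 'categorical_crossentropy',
                    'x2': 'categorical_crossentropy',
                    'x3': 'binary_crossentropy'},
              metrics=['accuracy'])

# class_weights for two outputs, sample_weights for the third,
# as in the question (the weight values here are placeholders).
model.fit(X, {'x1': y1, 'x2': y2, 'x3': y3},
          class_weight={'x1': {i: 1.0 for i in range(10)},
                        'x2': {i: 1.0 for i in range(10)}},
          sample_weight={'x3': np.ones(n)},
          batch_size=1024, epochs=1, verbose=1)

With verbose output enabled, the progress then prints like this, one new line per batch: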

42336/747322 [====>.........................] - ETA: 79s - loss: 20.7154 - x1_loss: 9.5913 - x2_loss: 10.0536 - x3_loss: 1.0705 - x1_acc: 0.6930 - x2_acc: 0.4433 - x3_acc: 0.6821 
143360/747322 [====>.........................] - ETA: 78s - loss: 20.7387 - x1_loss: 9.6131 - x2_loss: 10.0555 - x3_loss: 1.0702 - x1_acc: 0.6930 - x2_acc: 0.4432 - x3_acc: 0.6820 
144384/747322 [====>.........................] - ETA: 78s - loss: 20.7362 - x1_loss: 9.6067 - x2_loss: 10.0608 - x3_loss: 1.0687 - x1_acc: 0.6930 - x2_acc: 0.4429 - x3_acc: 0.6817 
145408/747322 [====>.........................] - ETA: 78s - loss: 20.7257 - x1_loss: 9.5985 - x2_loss: 10.0571 - x3_loss: 1.0702 - x1_acc: 0.6929 - x2_acc: 0.4428 - x3_acc: 0.6815 
146432/747322 [====>.........................] - ETA: 78s - loss: 20.7145 - x1_loss: 9.5849 - x2_loss: 10.0605 - x3_loss: 1.0691 - x1_acc: 0.6932 - x2_acc: 0.4429 - x3_acc: 0.6816 
147456/747322 [====>.........................] - ETA: 78s - loss: 20.7208 - x1_loss: 9.5859 - x2_loss: 10.0662 - x3_loss: 1.0688 - x1_acc: 0.6931 - x2_acc: 0.4429 - x3_acc: 0.6815 
148480/747322 [====>.........................] - ETA: 78s - loss: 20.7078 - x1_loss: 9.5762 - x2_loss: 10.0636 - x3_loss: 1.0680 - x1_acc: 0.6932 - x2_acc: 0.4430 - x3_acc: 0.6815 
149504/747322 [=====>........................] - ETA: 77s - loss: 20.6987 - x1_loss: 9.5749 - x2_loss: 10.0555 - x3_loss: 1.0683 - x1_acc: 0.6931 - x2_acc: 0.4430 - x3_acc: 0.6817 
150528/747322 [=====>........................] - ETA: 77s - loss: 20.9883 - x1_loss: 9.5688 - x2_loss: 10.3509 - x3_loss: 1.0686 - x1_acc: 0.6928 - x2_acc: 0.4428 - x3_acc: 0.6819 
151552/747322 [=====>........................] - ETA: 77s - loss: 20.9721 - x1_loss: 9.5606 - x2_loss: 10.3435 - x3_loss: 1.0679 - x1_acc: 0.6927 - x2_acc: 0.4426 - x3_acc: 0.6821 
152576/747322 [=====>........................] - ETA: 77s - loss: 20.9585 - x1_loss: 9.5558 - x2_loss: 10.3355 - x3_loss: 1.0672 - x1_acc: 0.6926 - x2_acc: 0.4425 - x3_acc: 0.6822 
153600/747322 [=====>........................] - ETA: 77s - loss: 20.9409 - x1_loss: 9.5447 - x2_loss: 10.3300 - x3_loss: 1.0662 - x1_acc: 0.6925 - x2_acc: 0.4426 - x3_acc: 0.6822 
154624/747322 [=====>........................] - ETA: 77s - loss: 20.9254 - x1_loss: 9.5341 - x2_loss: 10.3250 - x3_loss: 1.0663 - x1_acc: 0.6924 - x2_acc: 0.4425 - x3_acc: 0.6825 
155648/747322 [=====>........................] - ETA: 77s - loss: 20.9189 - x1_loss: 9.5270 - x2_loss: 10.3249 - x3_loss: 1.0670 - x1_acc: 0.6925 - x2_acc: 0.4425 - x3_acc: 0.6825 
156672/747322 [=====>........................] - ETA: 76s - loss: 20.9069 - x1_loss: 9.5155 - x2_loss: 10.3256 - x3_loss: 1.0658 - x1_acc: 0.6927 - x2_acc: 0.4423 - x3_acc: 0.6827 
157696/747322 [=====>........................] - ETA: 76s - loss: 20.9275 - x1_loss: 9.5461 - x2_loss: 10.3163 - x3_loss: 1.0651 - x1_acc: 0.6927 - x2_acc: 0.4422 - x3_acc: 0.6828 
158720/747322 [=====>........................] - ETA: 76s - loss: 21.4809 - x1_loss: 10.1018 - x2_loss: 10.3133 - x3_loss: 1.0659 - x1_acc: 0.6928 - x2_acc: 0.4422 - x3_acc: 0.6829 
159744/747322 [=====>........................] - ETA: 76s - loss: 21.4617 - x1_loss: 10.0871 - x2_loss: 10.3093 - x3_loss: 1.0653 - x1_acc: 0.6928 - x2_acc: 0.4421 - x3_acc: 0.6830 
160768/747322 [=====>........................] - ETA: 76s - loss: 21.5462 - x1_loss: 10.1705 - x2_loss: 10.3105 - x3_loss: 1.0652 - x1_acc: 0.6928 - x2_acc: 0.4420 - x3_acc: 0.6832 
161792/747322 [=====>........................] - ETA: 76s - loss: 21.5642 - x1_loss: 10.1849 - x2_loss: 10.3138 - x3_loss: 1.0655 - x1_acc: 0.6928 - x2_acc: 0.4418 - x3_acc: 0.6832 
162816/747322 [=====>........................] - ETA: 76s - loss: 21.5508 - x1_loss: 10.1739 - x2_loss: 10.3118 - x3_loss: 1.0651 - x1_acc: 0.6928 - x2_acc: 0.4418 - x3_acc: 0.6833 
163840/747322 [=====>........................] - ETA: 76s - loss: 21.5323 - x1_loss: 10.1606 - x2_loss: 10.3057 - x3_loss: 1.0659 - x1_acc: 0.6927 - x2_acc: 0.4419 - x3_acc: 0.6833 
164864/747322 [=====>........................] - ETA: 75s - loss: 21.5282 - x1_loss: 10.1607 - x2_loss: 10.3016 - x3_loss: 1.0659 - x1_acc: 0.6926 - x2_acc: 0.4418 - x3_acc: 0.6834 
165888/747322 [=====>........................] - ETA: 75s - loss: 21.5321 - x1_loss: 10.1696 - x2_loss: 10.2963 - x3_loss: 1.0662 - x1_acc: 0.6927 - x2_acc: 0.4417 - x3_acc: 0.6834 
166912/747322 [=====>........................] - ETA: 75s - loss: 21.5131 - x1_loss: 10.1554 - x2_loss: 10.2912 - x3_loss: 1.0664 - x1_acc: 0.6927 - x2_acc: 0.4416 - x3_acc: 0.6833 
167936/747322 [=====>........................] - ETA: 75s - loss: 21.5211 - x1_loss: 10.1649 - x2_loss: 10.2886 - x3_loss: 1.0676 - x1_acc: 0.6929 - x2_acc: 0.4415 - x3_acc: 0.6835 
168960/747322 [=====>........................] - ETA: 75s - loss: 21.5049 - x1_loss: 10.1504 - x2_loss: 10.2870 - x3_loss: 1.0676 - x1_acc: 0.6930 - x2_acc: 0.4414 - x3_acc: 0.6835 
169984/747322 [=====>........................] - ETA: 75s - loss: 21.5171 - x1_loss: 10.1684 - x2_loss: 10.2818 - x3_loss: 1.0670 - x1_acc: 0.6931 - x2_acc: 0.4414 - x3_acc: 0.6832 
171008/747322 [=====>........................] - ETA: 75s - loss: 21.5036 - x1_loss: 10.1541 - x2_loss: 10.2816 - x3_loss: 1.0678 - x1_acc: 0.6931 - x2_acc: 0.4413 - x3_acc: 0.6828 
172032/747322 [=====>........................] - ETA: 75s - loss: 21.4870 - x1_loss: 10.1377 - x2_loss: 10.2816 - x3_loss: 1.0677 - x1_acc: 0.6931 - x2_acc: 0.4413 - x3_acc: 0.6827 
173056/747322 [=====>........................] - ETA: 75s - loss: 21.4729 - x1_loss: 10.1210 - x2_loss: 10.2836 - x3_loss: 1.0683 - x1_acc: 0.6931 - x2_acc: 0.4413 - x3_acc: 0.6824 
174080/747322 [=====>........................] - ETA: 74s - loss: 21.4512 - x1_loss: 10.1085 - x2_loss: 10.2742 - x3_loss: 1.0685 - x1_acc: 0.6931 - x2_acc: 0.4414 - x3_acc: 0.6821 
175104/747322 [======>.......................] - ETA: 74s - loss: 21.4315 - x1_loss: 10.0977 - x2_loss: 10.2647 - x3_loss: 1.0690 - x1_acc: 0.6931 - x2_acc: 0.4414 - x3_acc: 0.6817 
176128/747322 [======>.......................] - ETA: 74s - loss: 21.4231 - x1_loss: 10.0880 - x2_loss: 10.2656 - x3_loss: 1.0695 - x1_acc: 0.6932 - x2_acc: 0.4412 - x3_acc: 0.6813 
177152/747322 [======>.......................] - ETA: 74s - loss: 21.4059 - x1_loss: 10.0732 - x2_loss: 10.2639 - x3_loss: 1.0688 - x1_acc: 0.6931 - x2_acc: 0.4412 - x3_acc: 0.6809 
178176/747322 [======>.......................] - ETA: 74s - loss: 21.4289 - x1_loss: 10.0967 - x2_loss: 10.2634 - x3_loss: 1.0688 - x1_acc: 0.6930 - x2_acc: 0.4413 - x3_acc: 0.6807 
179200/747322 [======>.......................] - ETA: 74s - loss: 21.4329 - x1_loss: 10.1092 - x2_loss: 10.2557 - x3_loss: 1.0681 - x1_acc: 0.6930 - x2_acc: 0.4414 - x3_acc: 0.6807 
180224/747322 [======>.......................] - ETA: 74s - loss: 21.4277 - x1_loss: 10.1099 - x2_loss: 10.2503 - x3_loss: 1.0675 - x1_acc: 0.6930 - x2_acc: 0.4415 - x3_acc: 0.6807 
181248/747322 [======>.......................] - ETA: 73s - loss: 21.4088 - x1_loss: 10.0975 - x2_loss: 10.2441 - x3_loss: 1.0671 - x1_acc: 0.6929 - x2_acc: 0.4416 - x3_acc: 0.6808 
182272/747322 [======>.......................] - ETA: 73s - loss: 21.3909 - x1_loss: 10.0841 - x2_loss: 10.2405 - x3_loss: 1.0663 - x1_acc: 0.6929 - x2_acc: 0.4415 - x3_acc: 0.6811 
183296/747322 [======>.......................] - ETA: 73s - loss: 21.3775 - x1_loss: 10.0699 - x2_loss: 10.2416 - x3_loss: 1.0660 - x1_acc: 0.6927 - x2_acc: 0.4415 - x3_acc: 0.6813 
184320/747322 [======>.......................] - ETA: 73s - loss: 21.3682 - x1_loss: 10.0664 - x2_loss: 10.2355 - x3_loss: 1.0662 - x1_acc: 0.6928 - x2_acc: 0.4417 - x3_acc: 0.6818 
185344/747322 [======>.......................] - ETA: 73s - loss: 21.4162 - x1_loss: 10.1213 - x2_loss: 10.2291 - x3_loss: 1.0658 - x1_acc: 0.6927 - x2_acc: 0.4417 - x3_acc: 0.6821 
186368/747322 [======>.......................] - ETA: 73s - loss: 21.3981 - x1_loss: 10.1050 - x2_loss: 10.2259 - x3_loss: 1.0672 - x1_acc: 0.6928 - x2_acc: 0.4418 - x3_acc: 0.6825 
187392/747322 [======>.......................] - ETA: 73s - loss: 21.3793 - x1_loss: 10.0909 - x2_loss: 10.2212 - x3_loss: 1.0673 - x1_acc: 0.6928 - x2_acc: 0.4417 - x3_acc: 0.6827 
188416/747322 [======>.......................] - ETA: 73s - loss: 21.3614 - x1_loss: 10.0784 - x2_loss: 10.2163 - x3_loss: 1.0668 - x1_acc: 0.6930 - x2_acc: 0.4418 - x3_acc: 0.6830 
189440/747322 [======>.......................] - ETA: 72s - loss: 21.3736 - x1_loss: 10.0909 - x2_loss: 10.2169 - x3_loss: 1.0659 - x1_acc: 0.6930 - x2_acc: 0.4417 - x3_acc: 0.6833 
190464/747322 [======>.......................] - ETA: 72s - loss: 21.4615 - x1_loss: 10.0802 - x2_loss: 10.3165 - x3_loss: 1.0648 - x1_acc: 0.6930 - x2_acc: 0.4418 - x3_acc: 0.6836 
191488/747322 [======>.......................] - ETA: 72s - loss: 21.4493 - x1_loss: 10.0653 - x2_loss: 10.3194 - x3_loss: 1.0646 - x1_acc: 0.6930 - x2_acc: 0.4417 - x3_acc: 0.6837 
192512/747322 [======>.......................] - ETA: 72s - loss: 21.4863 - x1_loss: 10.0997 - x2_loss: 10.3207 - x3_loss: 1.0659 - x1_acc: 0.6927 - x2_acc: 0.4416 - x3_acc: 0.6837 
193536/747322 [======>.......................] - ETA: 72s - loss: 21.4750 - x1_loss: 10.0895 - x2_loss: 10.3198 - x3_loss: 1.0657 - x1_acc: 0.6929 - x2_acc: 0.4416 - x3_acc: 0.6839 
194560/747322 [======>.......................] - ETA: 72s - loss: 21.4577 - x1_loss: 10.0755 - x2_loss: 10.3168 - x3_loss: 1.0654 - x1_acc: 0.6929 - x2_acc: 0.4416 - x3_acc: 0.6839 
195584/747322 [======>.......................] - ETA: 72s - loss: 21.4429 - x1_loss: 10.0627 - x2_loss: 10.3148 - x3_loss: 1.0655 - x1_acc: 0.6929 - x2_acc: 0.4417 - x3_acc: 0.6838 
196608/747322 [======>.......................] - ETA: 71s - loss: 21.4307 - x1_loss: 10.0558 - x2_loss: 10.3089 - x3_loss: 1.0660 - x1_acc: 0.6929 - x2_acc: 0.4418 - x3_acc: 0.6834 
197632/747322 [======>.......................] - ETA: 71s - loss: 21.4446 - x1_loss: 10.0669 - x2_loss: 10.3107 - x3_loss: 1.0670 - x1_acc: 0.6929 - x2_acc: 0.4418 - x3_acc: 0.6830 
198656/747322 [======>.......................] - ETA: 71s - loss: 21.4287 - x1_loss: 10.0552 - x2_loss: 10.3071 - x3_loss: 1.0665 - x1_acc: 0.6930 - x2_acc: 0.4418 - x3_acc: 0.6827 
199680/747322 [=======>......................] - ETA: 71s - loss: 21.4168 - x1_loss: 10.0474 - x2_loss: 10.3034 - x3_loss: 1.0660 - x1_acc: 0.6931 - x2_acc: 0.4417 - x3_acc: 0.6823 
200704/747322 [=======>......................] - ETA: 71s - loss: 21.4064 - x1_loss: 10.0385 - x2_loss: 10.3015 - x3_loss: 1.0664 - x1_acc: 0.6931 - x2_acc: 0.4417 - x3_acc: 0.6819 
201728/747322 [=======>......................] - ETA: 71s - loss: 21.3954 - x1_loss: 10.0320 - x2_loss: 10.2974 - x3_loss: 1.0659 - x1_acc: 0.6931 - x2_acc: 0.4416 - x3_acc: 0.6817 
202752/747322 [=======>......................] - ETA: 71s - loss: 21.3870 - x1_loss: 10.0243 - x2_loss: 10.2965 - x3_loss: 1.0662 - x1_acc: 0.6931 - x2_acc: 0.4415 - x3_acc: 0.6816 
203776/747322 [=======>......................] - ETA: 70s - loss: 21.3782 - x1_loss: 10.0155 - x2_loss: 10.2954 - x3_loss: 1.0673 - x1_acc: 0.6929 - 

etc... 
Please add more details. – Enn

@Enn, I've added some more information. Does that help? –

I'm seeing the same problem on Ubuntu 16.04, but not on macOS 10.13. Haven't found a solution yet. – McLawrence

Answers


I had a similar problem, but haven't had time to look into it further yet. The issue seems to be related to the Progbar class in Keras's generic_utils.py (see link), possibly in combination with Python >= 3.3.

In the update function of that class you'll find the following lines:

Line 107: sys.stdout.write('\b' * prev_total_width)
Line 108: sys.stdout.write('\r')

As a quick fix I simply removed line 107, so that instead of backspacing over the previous line and then moving to the beginning of the line, only the move to the beginning of the line is performed. There are probably better ways than changing the source code, though.
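For intuition, here is a small self-contained demo (plain Python, not Keras code) of in-place progress updates using '\r' alone: as long as the printed line fits within the terminal width, the carriage return moves the cursor back to column 0 and the next write overwrites the previous text.

import sys
import time

# Update a progress line in place with '\r' only (no backspaces).
for i in range(1, 11):
    sys.stdout.write('\r' + 'step %2d/10 [%-10s]' % (i, '=' * i))
    sys.stdout.flush()
    time.sleep(0.1)
sys.stdout.write('\n')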

@fchollet, any thoughts? –


This seems to be a recurring issue with Keras. I tried to find the lines

sys.stdout.write('\b' * prev_total_width)

sys.stdout.write('\r')

in the keras/utils/generic_utils.py file; in the current version they are at lines 258 and 259 respectively. I commented out line 258, but that did not seem to fix the problem. I did, however, manage to get the progress bar working by commenting out the following line:

Line 303: sys.stdout.write(info)

It appears that if info makes the progress line longer than the terminal is wide, the terminal wraps it onto a new line.

So I eventually solved the problem, and in the end the fix turned out to be quite simple: just make the terminal wider...
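To check whether this is what's happening, you can compare the length of one progress line against the terminal width. A small sketch using Python's shutil.get_terminal_size (available since Python 3.3); the sample line is copied from the log above:

import shutil

# One progress line as printed by Keras during training.
line = ('203776/747322 [=======>......................] - ETA: 70s - '
        'loss: 21.3782 - x1_loss: 10.0155 - x2_loss: 10.2954 - '
        'x3_loss: 1.0673 - x1_acc: 0.6929 - x2_acc: 0.4416 - x3_acc: 0.6816')

cols = shutil.get_terminal_size().columns
print('line: %d chars, terminal: %d columns -> %s'
      % (len(line), cols, 'wraps' if len(line) > cols else 'fits'))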

Note: tested on Ubuntu 16.04 | Keras version 2.0.5


This has been mentioned before, but I'll restate it more explicitly for future readers.

Your terminal is too narrow to print all of these values: either set the width parameter of the Progbar constructor to a smaller number, or remove/rename some of the reported values. A sketch of the width idea follows below.
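A rough sketch, assuming Keras 2.0.x, where Progbar(target, width=30, ...) takes a width argument controlling how many characters the bar itself uses. Note that model.fit constructs its own Progbar internally, so there you would instead shorten the output names or drop metrics to shrink the printed line:

from keras.utils.generic_utils import Progbar

# A narrower bar: width controls only the bar's character count;
# the metric values are appended after it.
bar = Progbar(target=1000, width=10)
for i in range(0, 1001, 100):
    bar.update(i, values=[('loss', 0.5), ('acc', 0.9)])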