2016-04-28

I am trying to concatenate a deconvolution layer with a fully connected layer for further processing, but after I reshape the concatenation of the FC layer and the deconvolution layer, the batch size changes. This is the prototxt I use to concat the FC layer with the deconvolution layer:

...... 

    layer {
      name: "fc4"
      type: "InnerProduct"
      bottom: "fc3"
      top: "fc4"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      inner_product_param {
        num_output: 21904
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
          value: 0
        }
      }
    }
    layer {
      name: "deconv"
      type: "Deconvolution"
      bottom: "data"
      top: "deconv"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 256
        pad: 1
        kernel_size: 8
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
          value: 0
        }
      }
    }
    layer {
      name: "dec_flatten"
      type: "Flatten"
      bottom: "deconv"
      top: "dec_flatten"
    }
    layer {
      name: "concat"
      type: "Concat"
      bottom: "fc4"
      bottom: "dec_flatten"
      top: "concat"
      concat_param {
        axis: 1
      }
    }
    layer {
      name: "reshape"
      type: "Reshape"
      bottom: "concat"
      top: "output"
      reshape_param {
        shape {
          dim: -1
          dim: 1
          dim: 148
          dim: 148
        }
      }
    }
    layer {
      name: "conv1"
      type: "Convolution"
      bottom: "output"
      top: "conv1"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 16
        kernel_size: 5
        stride: 1
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
          value: 0
        }
      }
    }
    .... 

In the terminal I get this output:

I0428 11:45:41.201318 52048 net.cpp:454] fc4 <- fc3 
I0428 11:45:41.201325 52048 net.cpp:411] fc4 -> fc4 
I0428 11:45:41.320688 52048 net.cpp:150] Setting up fc4 
I0428 11:45:41.320735 52048 net.cpp:157] Top shape: 1 21904 (21904) 
I0428 11:45:41.320740 52048 net.cpp:165] Memory required for data: 114240 
I0428 11:45:41.320752 52048 layer_factory.hpp:76] Creating layer deconv 
I0428 11:45:41.320770 52048 net.cpp:106] Creating Layer deconv 
I0428 11:45:41.320776 52048 net.cpp:454] deconv <- data_data_0_split_1 
I0428 11:45:41.320786 52048 net.cpp:411] deconv -> deconv 
I0428 11:45:41.322069 52048 net.cpp:150] Setting up deconv 
I0428 11:45:41.322095 52048 net.cpp:157] Top shape: 1 256 37 37 (350464) 
I0428 11:45:41.322100 52048 net.cpp:165] Memory required for data: 1516096 
I0428 11:45:41.322110 52048 layer_factory.hpp:76] Creating layer dec_flatten 
I0428 11:45:41.322119 52048 net.cpp:106] Creating Layer dec_flatten 
I0428 11:45:41.322124 52048 net.cpp:454] dec_flatten <- deconv 
I0428 11:45:41.322130 52048 net.cpp:411] dec_flatten -> dec_flatten 
I0428 11:45:41.322156 52048 net.cpp:150] Setting up dec_flatten 
I0428 11:45:41.322163 52048 net.cpp:157] Top shape: 1 350464 (350464) 
I0428 11:45:41.322167 52048 net.cpp:165] Memory required for data: 2917952 
I0428 11:45:41.322171 52048 layer_factory.hpp:76] Creating layer concat 
I0428 11:45:41.322180 52048 net.cpp:106] Creating Layer concat 
I0428 11:45:41.322183 52048 net.cpp:454] concat <- fc4 
I0428 11:45:41.322188 52048 net.cpp:454] concat <- dec_flatten 
I0428 11:45:41.322194 52048 net.cpp:411] concat -> concat 
I0428 11:45:41.322216 52048 net.cpp:150] Setting up concat 
I0428 11:45:41.322223 52048 net.cpp:157] Top shape: 1 372368 (372368) 
I0428 11:45:41.322227 52048 net.cpp:165] Memory required for data: 4407424 
I0428 11:45:41.322232 52048 layer_factory.hpp:76] Creating layer reshape 
I0428 11:45:41.322242 52048 net.cpp:106] Creating Layer reshape 
I0428 11:45:41.322247 52048 net.cpp:454] reshape <- concat 
I0428 11:45:41.322252 52048 net.cpp:411] reshape -> output 
I0428 11:45:41.322283 52048 net.cpp:150] Setting up reshape 
I0428 11:45:41.322295 52048 net.cpp:157] Top shape: 17 1 148 148 (372368) 
I0428 11:45:41.322311 52048 net.cpp:165] Memory required for data: 5896896 
I0428 11:45:41.322315 52048 layer_factory.hpp:76] Creating layer conv1 
I0428 11:45:41.322325 52048 net.cpp:106] Creating Layer conv1 
I0428 11:45:41.322330 52048 net.cpp:454] conv1 <- output 
I0428 11:45:41.322337 52048 net.cpp:411] conv1 -> conv1 
I0428 11:45:41.323410 52048 net.cpp:150] Setting up conv1 
I0428 11:45:41.323438 52048 net.cpp:157] Top shape: 17 16 144 144 (5640192) 

When I do the reshape, the batch size changes. Does anyone know how to reshape without changing the batch size?

Thanks

Answer


The shape of the concat blob is 1x372368.

Now you are trying to reshape this blob with the following prototxt:

    reshape_param { 
      shape { 
        dim: -1 
        dim: 1 
        dim: 148 
        dim: 148 
      } 
    } 

Since you have fixed the dimensions h = 148, w = 148, c = 1, the value of n is computed as 372368/(148*148*1) = 17.

If you do not want the batch count to change, you have to change the values of h, w, or c so that h*w*c*n = 372368.
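
For example, here is a minimal sketch of one way to satisfy that constraint while keeping the batch size at 1, assuming you can accept 17 channels instead of 1 (in a Caffe Reshape layer, dim: 0 copies that axis from the bottom blob):

    reshape_param { 
      shape { 
        dim: 0      # copy the batch dimension from the bottom blob (1 here) 
        dim: 17     # 1 * 17 * 148 * 148 = 372368, so all elements still fit 
        dim: 148 
        dim: 148 
      } 
    } 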

Edit 1: Now the output of your concat layer has a dimension of 207936 per image, and you are trying to compute a loss against labels of dimension 1024. That will not work, because the two bottom blobs of the loss layer have different dimensions. You could try attaching an InnerProduct layer with num_output: 1024 to the output of the concat layer, and then feed the output of this InnerProduct layer, together with the labels, into the loss layer.
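
A minimal sketch of that suggestion (the layer name "fc_concat", the label blob "label", and the EuclideanLoss type are illustrative assumptions, not taken from the original prototxt):

    layer { 
      name: "fc_concat"        # hypothetical name for the suggested layer 
      type: "InnerProduct" 
      bottom: "concat" 
      top: "fc_concat" 
      inner_product_param { 
        num_output: 1024       # matches the 1024-dimensional labels 
      } 
    } 
    layer { 
      name: "loss" 
      type: "EuclideanLoss"    # assumed loss type; use whichever loss you need 
      bottom: "fc_concat" 
      bottom: "label" 
      top: "loss" 
    } 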


Thanks Anoop, I will try that. –


Anoop, thinking about this solution, I suspect it would change my results. I ran a test where I set both bottoms of the loss layer to conv3, and training worked that way, but is that wrong? To clarify what I am trying to do, I am following this paper: [link](http://arxiv.org/pdf/1603.07235v2.pdf) –


If you mean that you set the same blob as both inputs of the loss layer, that is completely wrong: the loss will always be zero and the model will not learn anything. Changing the prototxt like that will not change your results when you train the model. Could you elaborate on what you mean? –
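
To illustrate the pitfall being described, a loss layer wired like the sketch below (hypothetical, with EuclideanLoss as the assumed loss type) compares a blob with itself, so the loss is identically zero and no useful gradient flows:

    layer { 
      name: "loss" 
      type: "EuclideanLoss" 
      bottom: "conv3" 
      bottom: "conv3"   # same blob on both bottoms: the difference is always 0 
      top: "loss" 
    } 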