
matlab neural network training batch size

I want to train a neural network using different batches, but I don't know how to combine the networks that result from each batch.

Below is the code I wrote to train the network with the batch size as a parameter.

%% Train the Network using batches 
batch_size = 50; 

total_size = size(inputs, 2);               % number of training samples 
batch_num  = ceil(total_size / batch_size); % number of batches 

for i = 1:batch_num 
    % index range of the current batch 
    start_index = 1 + batch_size * (i - 1); 
    end_index   = batch_size + batch_size * (i - 1); 

    if i == batch_num 
        end_index = total_size;             % last batch may be smaller 
    end 

    [net, tr] = train(net, inputs(:, start_index:end_index), ... 
                      targets(:, start_index:end_index)); 
end 

Here are the structures of net and tr:

tr = 

        trainFcn: 'traingdm' 
      trainParam: [1x1 nnetParam] 
      performFcn: 'mse' 
    performParam: [1x1 nnetParam] 
        derivFcn: 'defaultderiv' 
       divideFcn: 'dividerand' 
      divideMode: 'sample' 
     divideParam: [1x1 nnetParam] 
        trainInd: [1x538 double] 
          valInd: [1x115 double] 
     ... 

net =

Neural Network 

              name: 'Pattern Recognition Neural Network' 
        efficiency: .cacheDelayedInputs, .flattenTime, 
                    .memoryReduction 
          userdata: (your custom info) 

    dimensions: 

         numInputs: 1 
         numLayers: 4 
        numOutputs: 1 
    numInputDelays: 0 
    numLayerDelays: 0 
 numFeedbackDelays: 0 
 numWeightElements: 845 
        sampleTime: 1 

    connections: 

       biasConnect: [1; 1; 1; 1] 
      inputConnect: [1; 0; 0; 0] 
      layerConnect: [4x4 boolean] 
     outputConnect: [0 0 0 1] 

    subobjects: 

            inputs: {1x1 cell array of 1 input} 
            layers: {4x1 cell array of 4 layers} 
           outputs: {1x4 cell array of 1 output} 
            biases: {4x1 cell array of 4 biases} 
      inputWeights: {4x1 cell array of 1 weight} 
      layerWeights: {4x4 cell array of 3 weights} 
     ... 

How can I get the resulting net variable to hold the network weights produced after all the batches are done?

Answer


If I understand correctly, you are overwriting the variables net and tr on every iteration. Just use a cell array:

Declare them at the start:

net = {}; 
tr = {}; 

and change the relevant line to:

[net{end+1},tr{end+1}] = ... 
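
Applied to the loop from the question, the change might look like the sketch below. Note that train still needs a network object as its first argument, so this sketch keeps the per-batch history in separately named cell arrays nets and trs (illustrative names, an editorial assumption rather than part of the answer) while net remains the network returned by each call:

% Sketch: collect a snapshot of the network and its training record 
% after every batch, while net itself stays a network object. 
nets = {}; 
trs  = {}; 

for i = 1:batch_num 
    start_index = 1 + batch_size * (i - 1); 
    end_index   = min(i * batch_size, total_size); 

    [net, tr] = train(net, inputs(:, start_index:end_index), ... 
                      targets(:, start_index:end_index)); 

    nets{end+1} = net;   % network state after this batch 
    trs{end+1}  = tr;    % training record for this batch 
end 

After the loop, nets{end} is the network that has seen every batch, and each nets{i} is its state after batch i.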

Hmm, it seems net and tr are more complicated than that – waspinator 2012-02-26 13:45:23


It should work anyway, since you can make a cell array of objects of any type, however complex. Does it work? – 2012-02-26 13:48:54


Unfortunately not. net is a 1x1 network. This is the error I get: ??? Comma-separated list expansion syntax with an array that is not a cell. Error ==> at 122: [net{end+1}, tr{end+1}] = train(net, inputs(:,start_index:end_index), targets(:,start_index:end_index)); – waspinator 2012-02-29 17:25:48
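
Editorial note: the error is consistent with net being a 1x1 network object rather than a cell array, so the brace assignment net{end+1} fails; conversely, initializing net = {} as suggested would break the inner call to train, which expects a network as its first argument. Keeping the network and the per-batch history in separately named variables, as in the sketch above, avoids both conflicts.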