
Backpropagation algorithm (Matlab): output value saturates at 1

I have coded the backpropagation algorithm in Matlab based on these notes: http://dl.dropbox.com/u/7412214/BackPropagation.pdf

My network takes input/feature vectors of length 43, has a hidden layer of 20 nodes (I can change any of these parameter choices), and a single output node. I want to train the network to take the 43 features and output a single value between 0 and 100. The input data are normalized to zero mean and unit standard deviation (via z = (x - mean)/std), and then a "1" is appended to each input vector to represent the bias term. My targetValues are just single numbers between 0 and 100.
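To make that preprocessing concrete, here is a rough sketch of the step (rawFeatures is a placeholder name for the unnormalized numTrainingSets-by-43 data matrix; my actual script may differ in details):

mu = mean(rawFeatures, 1);   % per-feature mean (1-by-43 row vector)
sd = std(rawFeatures, 0, 1); % per-feature standard deviation
inputFeatures = (rawFeatures - repmat(mu, size(rawFeatures, 1), 1)) ./ repmat(sd, size(rawFeatures, 1), 1); % z-score each column
inputFeatures = [inputFeatures, ones(size(inputFeatures, 1), 1)]; % append the "1" bias term to every row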

Here is the relevant part of my code:

(By my convention, layer I (i) refers to the input layer, J (j) to the hidden layer, and K (k) to the output layer, which in this case is a single node.)

for train=1:numItrs 
     for iterator=1:numTrainingSets 

      %%%%%%%% FORWARD PROPAGATION %%%%%%%% 

      % Grab the inputs, which are rows of the inputFeatures matrix 
      InputLayer = inputFeatures(iterator, :)'; %don't forget to turn into column 
      % Calculate the hidden layer outputs: 
      HiddenLayer = sigmoidVector(WeightMatrixIJ' * InputLayer); 
      % Now the output layer outputs: 
      OutputLayer = sigmoidVector(WeightMatrixJK' * HiddenLayer); 

      %%%%%%% Debug stuff %%%%%%%% (for single valued output) 
      if (mod(train+iterator, 100) == 0) 
       str = strcat('Output value: ', num2str(OutputLayer), ' | Test value: ', num2str(targetValues(iterator, :)')); 
       disp(str); 
      end 

      %%%%%%%% BACKWARDS PROPAGATION %%%%%%%% 

      % Propagate backwards for the hidden-output weights 
      currentTargets = targetValues(iterator, :)'; %strip off the row, make it a column for easy subtraction 
      OutputDelta = (OutputLayer - currentTargets) .* OutputLayer .* (1 - OutputLayer); 
      EnergyWeightDwJK = HiddenLayer * OutputDelta'; %outer product 
      % Update this layer's weight matrix: 
      WeightMatrixJK = WeightMatrixJK - epsilon*EnergyWeightDwJK; %does it element by element 

      % Propagate backwards for the input-hidden weights 
      HiddenDelta = HiddenLayer .* (1 - HiddenLayer) .* WeightMatrixJK*OutputDelta; 
      EnergyWeightDwIJ = InputLayer * HiddenDelta'; 
      WeightMatrixIJ = WeightMatrixIJ - epsilon*EnergyWeightDwIJ; 

     end 

    end 

And the weight matrices are initialized as follows:

WeightMatrixIJ = rand(numInputNeurons, numHiddenNeurons) - 0.5; 
WeightMatrixJK = rand(numHiddenNeurons, numOutputNeurons) - 0.5; 
%randoms b/w (-0.5, 0.5) 

The sigmoidVector function takes each element of a vector and applies y = 1/(1 + exp(-x)).
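For completeness, the helper is essentially just this (a sketch of the element-wise logistic):

function y = sigmoidVector(x)
% Apply the logistic sigmoid 1/(1 + exp(-x)) element-wise to the input vector.
y = 1 ./ (1 + exp(-x));
end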

Here is what the debug messages look like, starting from the beginning of a run:

Output value:0.99939 | Test value:20 
Output value:0.99976 | Test value:20 
Output value:0.99985 | Test value:20 
Output value:0.99989 | Test value:55 
Output value:0.99991 | Test value:65 
Output value:0.99993 | Test value:62 
Output value:0.99994 | Test value:20 
Output value:0.99995 | Test value:20 
Output value:0.99995 | Test value:20 
Output value:0.99996 | Test value:20 
Output value:0.99996 | Test value:20 
Output value:0.99997 | Test value:92 
Output value:0.99997 | Test value:20 
Output value:0.99997 | Test value:20 
Output value:0.99997 | Test value:20 
Output value:0.99997 | Test value:20 
Output value:0.99998 | Test value:20 
Output value:0.99998 | Test value:20 
Output value:0.99999 | Test value:20 
Output value:0.99999 | Test value:20 
Output value:1 | Test value:20 
Output value:1 | Test value:62 
Output value:1 | Test value:70 
Output value:1 | Test value:77 
Output value:1 | Test value:20 
** stays saturated at 1 ** 

Clearly, I want to train the network so that its output lies between 0 and 100 and matches the target values!

Thanks for any help; if you need more information, I'll do my best to provide it.

Answer


The sigmoid function is bounded to the range (0, 1), so it can never reach your target values (they are all greater than 1). You should scale your target values so that they also lie within the sigmoid's range. Since you know your targets are bounded to the range (0, 100), just divide them all by 100.
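For example, a minimal sketch of that fix using the variable names from your question (scale the targets before training, then undo the scaling when reading predictions back out):

% Scale targets into the sigmoid's (0, 1) range before training:
targetValues = targetValues / 100;

% ... run the training loop exactly as before ...

% At prediction time, map the network output back to (0, 100):
prediction = 100 * sigmoidVector(WeightMatrixJK' * sigmoidVector(WeightMatrixIJ' * InputLayer));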