
To get a better understanding of RNNs and LSTMs, I am trying to implement a simple LSTM that estimates the frequency and phase of a sine wave. This is proving very hard to get to converge: the MSE stays quite high (in the thousands). The only thing that works somewhat is if I generate sine waves that all have the same phase (they all start at the same time) and feed each training sample to the network as a whole vector, instead of one sample per timestep through the RNN. Below is the code that does not converge; in it I have already removed the per-frequency phase variation. Any ideas on what is going wrong with this RNN/LSTM estimating sine wave frequency and phase?

I have looked at Keras : How should I prepare input data for RNN? and tried to modify my input accordingly, but with no luck.
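To make the two input layouts mentioned above concrete, here is a minimal sketch (the array names and sizes are illustrative, not from the original post) of feeding one scalar sample per timestep versus feeding the whole waveform as a single vector:

import numpy as np

N, numSampleInEach = 4, 200              # hypothetical: 4 waveforms, 200 samples each
X = np.random.rand(N, numSampleInEach)   # placeholder data

# One scalar sample per timestep: (batch, timesteps, features) = (4, 200, 1)
X_per_sample = X.reshape(N, numSampleInEach, 1)

# Whole waveform presented as a single timestep: (4, 1, 200)
X_as_vector = X.reshape(N, 1, numSampleInEach)

print(X_per_sample.shape)   # (4, 200, 1)
print(X_as_vector.shape)    # (4, 1, 200)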

from keras.models import Sequential 
from keras.layers.core import Activation, Dropout ,Dense 
from keras.layers.recurrent import GRU, LSTM 
import numpy as np 
from sklearn.cross_validation import train_test_split 

np.random.seed(0) # For reproducibility 
TrainingNums = 12000 #Number of Trials 
numSampleInEach = 200 #Length of each sinewave 
numPhaseExamples = 1 #for each freq, so many different phases 

X = np.zeros((TrainingNums,numSampleInEach)) 
Y = np.zeros((TrainingNums,2)) 

#create sinewaves here 
for iii in range(0, TrainingNums//numPhaseExamples): 
    freq = np.round(np.random.randn()*100) 
    for kkk in range(0, numPhaseExamples): 
        # set timeOffset below to 0 if you want the same phase every run 
        timeOffset = 0  # 0 for now, else np.random.randint(0,90) 
        X[iii*numPhaseExamples+kkk, :] = np.sin(2*3.142*freq*np.linspace(0+timeOffset, numSampleInEach-1+timeOffset, numSampleInEach)/10000) 
        Y[iii*numPhaseExamples+kkk, 0] = freq 
        Y[iii*numPhaseExamples+kkk, 1] = timeOffset 

X = np.reshape(X, (TrainingNums, numSampleInEach, 1)) 
#The alternative below (whole waveform as a single timestep) works when there is no phase variation 
#X = np.reshape(X, (TrainingNums, 1, numSampleInEach)) 

X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.33) 

#Now create the RNN 
model = Sequential() 
#batch_input_shape = [batch_size,timeStep,dataDimension] 
model.add(LSTM(128,input_shape= (numSampleInEach,1),return_sequences=True)) 

#For frequency-only estimation, the following change is the one that helps 
#model.add(LSTM(128,input_shape=(1,numSampleInEach),return_sequences=True)) 
model.add(Dropout(0.2)) 
model.add(Activation("relu")) 

#second layer of RNN 
model.add(LSTM(128,return_sequences=False)) 
model.add(Dropout(0.2)) 
model.add(Activation("relu")) 

model.add(Dense(2,activation="linear")) 
model.compile(loss="mean_squared_error", optimizer="Nadam") 
print model.summary() 

print "Model compiled." 
model.fit(X_train, y_train, batch_size=16, nb_epoch=150, 
          validation_split=0.1) 
result = model.evaluate(X_test, y_test, verbose=0) 
print 'mse: ', result 

So the questions are:

  1. Is it reasonable to expect an RNN to estimate the frequency and phase of a sine wave?
  2. I have tried several architectures (multiple LSTM layers, a single layer with more nodes, and so on), as well as other variations.

Made some progress, documenting it here for others: the main point is that there should not be an Activation after the LSTM. The Activation between the LSTM layers was the mistake. – krat

Answer


Removing the Activation after the LSTM layers is the correct answer.
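For completeness, here is a minimal sketch of the corrected architecture, written against the same Keras 1.x API used in the question (treat it as illustrative rather than the answerer's exact code). The explicit Activation("relu") layers after each LSTM are removed; an LSTM's output already passes through a tanh internally, so clipping it with a ReLU throws away the negative half of the signal.

from keras.models import Sequential
from keras.layers.core import Dropout, Dense
from keras.layers.recurrent import LSTM

numSampleInEach = 200  # same sequence length as in the question

model = Sequential()
# Stacked LSTMs with no Activation layers in between
model.add(LSTM(128, input_shape=(numSampleInEach, 1), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(128, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(2, activation="linear"))  # outputs: [frequency, timeOffset]
model.compile(loss="mean_squared_error", optimizer="Nadam")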