If I understand correctly, I can go from a 3-layered NN to deep learning with ReLU by adding a ReLU after the hidden layer and then repeating hidden layer + ReLU. That is how I would transform the 3-layered NN into a DL NN.
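As far as I can tell, ReLU itself is element-wise, so it should never change any dimensions; this is my assumption, not something my library guarantees:

import numpy as np

def relu(x):
    # element-wise max(0, x); output shape is identical to input shape
    return np.maximum(0, x)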
What I find hard to picture is how the dimensions will work out. Here is what I currently have, built on a small library I put together so I could learn the concepts:
import numpy as np

M = 784  # 28 x 28 pixels
N = 512  # hidden neurons
P = 10   # number of possible classes

w1 = np.random.normal(0.0, pow(10, -0.5), (M, N))  # input-to-hidden weights
w2 = np.random.normal(0.0, pow(10, -0.5), (N, P))  # hidden-to-output weights
b1 = np.random.normal(0.0, pow(10, -0.5), (N,))    # hidden biases
b2 = np.random.normal(0.0, pow(10, -0.5), (P,))    # output biases

x = Input(w1, b1)
h = Hidden(x, w2, b2)
g = Softmax(h)
cost = CrossEntropy(g)  # numpy.mean(CrossEntropy) over BATCH SIZE
train_data()
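To sanity-check the dimensions above, this is the shape flow I assume is happening inside Input and Hidden (assuming each layer computes x @ w + b, which is my guess about what the library does, not a documented fact):

batch = np.random.rand(32, M)  # a batch of 32 flattened images, shape (32, 784)
a1 = batch @ w1 + b1           # (32, 784) @ (784, 512) -> (32, 512)
a2 = a1 @ w2 + b2              # (32, 512) @ (512, 10)  -> (32, 10)
print(a1.shape, a2.shape)      # (32, 512) (32, 10)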
The above has sunk in, but I want to get to:
x = Input(w1, b1)
h = Hidden(x, w2, b2)
r = ReLU(h)
h2 = Hidden(r, ??, ??)  # 1
r2 = ReLU(h2)           # 2
.. <repeat 1 and 2>
g = Softmax(h)          # should this take the output of the last Hidden instead of h?
cost = CrossEntropy(g)  # numpy.mean(CrossEntropy) over BATCH SIZE
train_data()
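My current guess at the shapes, which I'd like someone to confirm: since ReLU leaves shapes unchanged, every repeated Hidden that stays at N neurons would need (N, N) weights and (N,) biases, and only the final Hidden feeding Softmax would need (N, P). A sketch with one extra hidden layer, where w_mid, b_mid, w_out, b_out are names I made up for illustration:

w_mid = np.random.normal(0.0, pow(10, -0.5), (N, N))  # hidden-to-hidden: (512, 512)?
b_mid = np.random.normal(0.0, pow(10, -0.5), (N,))
w_out = np.random.normal(0.0, pow(10, -0.5), (N, P))  # last hidden to output: (512, 10)?
b_out = np.random.normal(0.0, pow(10, -0.5), (P,))

x = Input(w1, b1)             # (batch, 784) -> (batch, 512)
h = Hidden(x, w_mid, b_mid)   # (batch, 512) -> (batch, 512)
r = ReLU(h)                   # shape unchanged
h2 = Hidden(r, w_out, b_out)  # (batch, 512) -> (batch, 10)
g = Softmax(h2)
cost = CrossEntropy(g)        # numpy.mean(CrossEntropy) over BATCH SIZE
train_data()

Is that the right way to think about it, i.e. a deeper net just inserts more (N, N) Hidden + ReLU pairs before w_out?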
Related article I am writing about this: