Back propagation in Chainer
Komachi Lab, M2
2016/06/14
Neural Net training loop
1. forward: compute the prediction
2. compute the Loss with a loss function
3. compute the gradients (backward)
4. update the parameters with an optimizer (SGD, AdaGrad, etc.)
5. go back to 1
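The five steps above can be sketched in plain NumPy for a two-layer network. The weight names W1, W2 and gradient names gW1, gW2 follow the slides; everything else (shapes, learning rate, data) is made up for illustration:

```python
import numpy as np

# Toy sketch of the training loop for a 2-layer net; W1/W2 and gW1/gW2
# follow the slides, the shapes and data are made up.
rng = np.random.RandomState(0)
W1 = rng.randn(3, 4) * 0.1        # input -> hidden
W2 = rng.randn(4, 2) * 0.1        # hidden -> output
x = rng.randn(5, 3)               # 5 training examples
t = rng.randn(5, 2)               # targets ("correct")

lr = 0.1                          # SGD learning rate
losses = []
for epoch in range(100):
    h = np.tanh(x.dot(W1))        # 1. forward
    y = h.dot(W2)
    loss = ((y - t) ** 2).mean()  # 2. loss (mean squared error)
    losses.append(loss)
    gy = 2 * (y - t) / y.size     # 3. backward: gradients gW2, gW1
    gW2 = h.T.dot(gy)
    gh = gy.dot(W2.T)
    gW1 = x.T.dot(gh * (1 - h ** 2))
    W1 -= lr * gW1                # 4. update (plain SGD)
    W2 -= lr * gW2                # 5. the for-loop returns to step 1
```

Note that backward runs in the reverse order of forward: gW2 is computed first, then the error is pushed back through W2 to obtain gW1.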
[Diagram, built up over several slides: input → predict with weights W1, W2 → Loss func, comparing the prediction against the correct label → gradients gW1, gW2 → SGD updates W1, W2]
Implementing it by hand
✤ Weights: W1, W2
✤ Variables: x, h, y, t
✤ Loss function (i: index into the train set)
[Diagram: input x → hidden h → prediction y, compared with the correct label t]
✤ Initialize the weights W1, W2
[Diagram: same network as on the previous slide]
✤ forward: compute the prediction for training example i
[Diagram: same network as on the previous slide]
✤ backward: compute the gradients gW2, then gW1 (output layer first)
[Diagram: a deeper computation graph with variables x, h, y, z and weights W1–W4, showing how the chain rule extends to more layers]
(this is exactly what Chainer automates)
Back propagation on Chainer
Chainer
✤ A DNN framework developed by PFN (Preferred Networks)
http://chainer.org/
✤ Define-by-Run: the computation graph is built as the forward computation runs
(vs. Define-and-Run, where the graph is fixed before execution)
✤ Related frameworks:
TensorFlow, Theano, Torch, Keras, Caffe, etc...
loss.backward()
✤ Calling backward() on the loss Variable propagates the error back through the computation graph recorded during forward
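What backward() does can be sketched with a toy Define-by-Run graph. These are not Chainer's real classes — just an illustration of the mechanism: each operation records its inputs, so calling backward() on the loss walks the graph in reverse:

```python
import numpy as np

# Minimal Define-by-Run sketch (illustrative only, not Chainer's API).
class Var:
    def __init__(self, data, creator=None):
        self.data = data
        self.grad = None
        self.creator = creator  # the function that produced this Var

    def backward(self):
        self.grad = np.ones_like(self.data)  # gy = 1 at the loss
        stack = [self]
        while stack:
            v = stack.pop()
            if v.creator is not None:
                gxs = v.creator.backward(v.grad)
                for inp, gx in zip(v.creator.inputs, gxs):
                    inp.grad = gx if inp.grad is None else inp.grad + gx
                    stack.append(inp)

class Mul:
    def __call__(self, a, b):
        self.inputs = (a, b)           # record inputs during forward
        return Var(a.data * b.data, creator=self)
    def backward(self, gy):
        a, b = self.inputs             # d(ab)/da = b, d(ab)/db = a
        return gy * b.data, gy * a.data

x = Var(np.array(3.0))
y = Var(np.array(4.0))
loss = Mul()(x, y)
loss.backward()   # x.grad == 4.0, y.grad == 3.0
```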
MeanSquaredError
✤ forward takes two inputs x0, x1 (the prediction y and the target t)
✤ backward receives gy (1 for the final loss, since backward starts there) together with forward's inputs, and returns the gradients w.r.t. x0 and x1
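A NumPy sketch of such a mean-squared-error forward/backward pair (the interface is simplified; Chainer's real implementation differs in detail):

```python
import numpy as np

def mse_forward(x0, x1):
    # forward: mean of squared differences between the two inputs
    return ((x0 - x1) ** 2).mean()

def mse_backward(x0, x1, gy):
    # gy is the gradient w.r.t. the output (1.0 when this is the final loss)
    coeff = 2.0 * gy / x0.size
    return coeff * (x0 - x1), -coeff * (x0 - x1)
```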
Linear
✤ grad_outputs is the gradient passed down from MeanSquaredError's backward
✤ backward computes the weight gradient gW as well as the gradient w.r.t. the input, which is passed further back
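A NumPy sketch of a Linear layer's forward and backward. W is stored as (n_out, n_in), the convention Chainer's Linear uses; everything else is simplified for illustration:

```python
import numpy as np

def linear_forward(x, W, b):
    # y = x W^T + b, with W of shape (n_out, n_in)
    return x.dot(W.T) + b

def linear_backward(x, W, gy):
    gx = gy.dot(W)       # gradient w.r.t. the input, passed further back
    gW = gy.T.dot(x)     # gradient w.r.t. the weight
    gb = gy.sum(axis=0)  # gradient w.r.t. the bias
    return gx, gW, gb
```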
Linear (cont.)
✤ here grad_outputs is the gradient returned by the previous Linear layer's backward
Back propagation on Chainer: summary
✤ Each Function defines a forward and a backward
✤ backward receives the output gradient and returns the input gradients
✤ Gradients w.r.t. the weights W are computed in backward as well
✤ Example: the forward and backward of tanh, in chainer.functions.activation.tanh
Chainer chains each Function's forward and backward through the recorded graph
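A sketch of what a tanh forward/backward pair looks like (simplified interface, not Chainer's actual code):

```python
import numpy as np

def tanh_forward(x):
    return np.tanh(x)

def tanh_backward(y, gy):
    # tanh's derivative written in terms of its own output y: 1 - y^2
    return gy * (1 - y ** 2)
```

Writing the derivative in terms of the output y (rather than the input x) lets backward reuse the value already computed in forward.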
Summary
✤ Back propagation
- compute the gradients layer by layer, from the loss back to the input
✤ Chainer
- the graph is built Define-by-Run during forward
- loss.backward() calls each Function's backward for you
Appendix: LSTM
✤ forward computes the four gates a, i, f, o from the output of a single Linear layer
✤ backward distributes gy to each gate
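The gate computation described above can be sketched as follows. The shapes and the split convention are assumptions for illustration, not Chainer's exact code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_forward(c_prev, x):
    # x holds the pre-activations of the four gates, stacked along axis 1
    # (as produced by a single Linear layer upstream, per the slide)
    a, i, f, o = np.split(x, 4, axis=1)
    a = np.tanh(a)              # candidate cell input
    i = sigmoid(i)              # input gate
    f = sigmoid(f)              # forget gate
    o = sigmoid(o)              # output gate
    c = f * c_prev + i * a      # new cell state
    h = o * np.tanh(c)          # new hidden state
    return c, h
```

In backward, gy (the gradient of h) is split back out through o and tanh(c) into per-gate gradients, mirroring this forward.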