1. Neural Turing Machine Mark Chang
2. Outline: Neural Networks -> Recurrent Neural Networks -> Neural Turing Machine
3. Figures: a biological neuron, and its action potential.
http://humanphisiology.wikispaces.com/file/view/neuron.png/216460814/neuron.png
http://upload.wikimedia.org/wikipedia/commons/thumb/4/4a/Action_potential.svg/1037px-Action_potential.svg.png
4. Figure: a synapse.
http://www.quia.com/files/quia/users/lmcgee/Systems/endocrine-nervous/synapse.gif
5. Figure: an artificial neuron: inputs x1, x2 with weights W1, W2, a bias input b with weight Wb, net input n_in, and activation output n_out = y.
6. Figure: in the (x1, x2) plane, the neuron's weights define a linear boundary separating the region where y = 0 from the region where y = 1.
7. AND Gate: a single neuron with weights W1 = 20, W2 = 20 and bias weight Wb = -30 computes AND.
x1 x2 | y
 0  0 | 0
 0  1 | 0
 1  0 | 0
 1  1 | 1
Figure: the linear boundary separates (1,1) from (0,0), (0,1), (1,0).
8. OR Gate: the same neuron with bias weight Wb = -10 computes OR.
x1 x2 | y
 0  0 | 0
 0  1 | 1
 1  0 | 1
 1  1 | 1
Figure: the linear boundary separates (0,0) from (0,1), (1,0), (1,1).
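These gates can be checked with a short sketch of a sigmoid neuron, using the weight values from the slides (the sigmoid activation is an assumption, but it is what makes weights like 20 and -30 saturate to clean 0/1 outputs):

```python
import math

def neuron(x1, x2, w1, w2, wb):
    """A single sigmoid neuron: y = sigmoid(W1*x1 + W2*x2 + Wb)."""
    return 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + wb)))

def and_gate(x1, x2):
    # Weights from the slides: W1 = 20, W2 = 20, bias weight Wb = -30.
    return neuron(x1, x2, 20, 20, -30)

def or_gate(x1, x2):
    # Weights from the slides: W1 = 20, W2 = 20, bias weight Wb = -10.
    return neuron(x1, x2, 20, 20, -10)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(and_gate(x1, x2)), round(or_gate(x1, x2)))
```

Rounding the sigmoid output recovers both truth tables exactly.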
9. XOR Gate: no single line separates the positive points (0,1), (1,0) from the negative points (0,0), (1,1), so no single neuron can compute it.
x1 x2 | y
 0  0 | 0
 0  1 | 1
 1  0 | 1
 1  1 | 0
10. XOR Gate: solved with two layers. Hidden neuron n1 (weights 20, 20, bias weight -30) computes AND; hidden neuron n2 (weights 20, 20, bias weight -10) computes OR; the output neuron (weights -20 on n1, 20 on n2, bias weight -10) combines them.
x1 x2 | n1 n2 | y
 0  0 |  0  0 | 0
 0  1 |  0  1 | 1
 1  0 |  0  1 | 1
 1  1 |  1  1 | 0
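The two-layer construction can be verified directly with the slide's weights (again assuming sigmoid activations):

```python
import math

def neuron(a, b, w1, w2, wb):
    """Sigmoid neuron with two inputs and a bias weight."""
    return 1.0 / (1.0 + math.exp(-(w1 * a + w2 * b + wb)))

def xor_gate(x1, x2):
    n1 = neuron(x1, x2, 20, 20, -30)   # hidden neuron n1: AND
    n2 = neuron(x1, x2, 20, 20, -10)   # hidden neuron n2: OR
    # Output neuron: weight -20 on n1, 20 on n2, bias weight -10,
    # i.e. y = (x1 OR x2) AND NOT (x1 AND x2).
    return neuron(n1, n2, -20, 20, -10)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(xor_gate(x1, x2)))
```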
11. Figure: a two-layer network: input layer (x, y, and a bias b), hidden layer n11, n12, output layer n21, n22; each connection carries a weight (W11,x, W12,y, W21,11, W22,12, the bias weights W11,b, W21,b, and so on), and z1, z2 denote the training targets for the two outputs.
12.
http://www.nature.com/neuro/journal/v8/n8/images/nn0805-975-F1.jpg
13.
14. http://www.pnas.org/content/102/49/17846/F7.large.jpg
15. Training: learn the weights w by repeating three steps: Forward Propagation, evaluating the Error Function, and Backward Propagation.
16. Forward Propagation -> Error Function -> Backward Propagation (one training iteration).
17. Figure: W denotes the full weight set of the network from slide 11 (W11,x, W12,y, W21,11, W22,12, ..., and the bias weights), with targets z1, z2.
18. Forward Propagation
19. Forward Propagation
20. Error Function: measures the distance between the network outputs n21, n22 and the targets z1, z2.
21. Gradient Descent: update the weights (e.g. w0, w1) by repeatedly taking a small step against the gradient of the error.
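As a concrete illustration of gradient descent on two parameters w0, w1 (the data points and learning rate here are hypothetical, not from the slides):

```python
# Fit y = w0 + w1 * x to toy data by gradient descent on the
# mean squared error E(w0, w1).
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # hypothetical points on y = 1 + 2x
w0, w1 = 0.0, 0.0   # initial weights
eta = 0.1           # learning rate

for _ in range(2000):
    # Partial derivatives of E with respect to w0 and w1.
    g0 = sum(2 * ((w0 + w1 * x) - y) for x, y in data) / len(data)
    g1 = sum(2 * ((w0 + w1 * x) - y) * x for x, y in data) / len(data)
    w0 -= eta * g0  # step against the gradient
    w1 -= eta * g1

print(w0, w1)  # approaches (1, 2)
```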
22. Backward Propagation (the derivation continues through slide 29)
http://cpmarkchang.logdown.com/posts/277349-neural-network-backward-propagation
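A minimal backward-propagation sketch for a two-weight chain network, checked against a numerical gradient; the network shape and the values of x, t, w1, w2 are illustrative, not the slides' exact example:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    """Chain network: hidden h = sigmoid(w1*x), output y = sigmoid(w2*h)."""
    h = sigmoid(w1 * x)
    return h, sigmoid(w2 * h)

def gradients(x, t, w1, w2):
    """Backward propagation of E = (y - t)^2 / 2 via the chain rule."""
    h, y = forward(x, w1, w2)
    delta_out = (y - t) * y * (1 - y)         # dE/d(net input of output)
    g2 = delta_out * h                        # dE/dw2
    delta_hid = delta_out * w2 * h * (1 - h)  # error propagated backward
    g1 = delta_hid * x                        # dE/dw1
    return g1, g2

# Sanity check against finite differences.
x, t, w1, w2 = 0.5, 1.0, 0.3, -0.7
g1, g2 = gradients(x, t, w1, w2)
err = lambda a, b: 0.5 * (forward(x, a, b)[1] - t) ** 2
eps = 1e-6
num1 = (err(w1 + eps, w2) - err(w1 - eps, w2)) / (2 * eps)
num2 = (err(w1, w2 + eps) - err(w1, w2 - eps)) / (2 * eps)
```

The analytic gradients agree with the numerical ones, which is the standard way to convince yourself a backprop implementation is right.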
30. Neural Network
31.
32. Composition: the output of one neuron n() can feed another, y = n(n(x1, x2)); each neuron has its own weights W1, W2 and bias weight Wb.
33. Recurrent Neural Network: the neuron's previous output is fed back as an input, y_t = n(y_{t-1}, x_t); unrolled, y_2 = n(n(n(x0), x1), x2).
34. Feedforward Neural Network -> Recurrent Neural Network -> Long Short Term Memory -> Neural Turing Machine
35. Figure: a recurrent neuron: the output n_out is fed back into the input n_in at the next time step.
36. Recurrent Neural Network unrolled: x0 gives y0, then (x1, y0) gives y1, and so on until (xt, y_{t-1}) gives yt.
37. Figure: the unrolled network over inputs x0, x1, ..., x_{t-1}, xt and outputs y0, y1, ..., y_{t-1}, yt.
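The unrolled computation is just a loop that feeds each output back in; scalar weights and a tanh activation are assumptions for illustration:

```python
import math

def rnn_step(x, y_prev, wx, wy, wb):
    """One recurrent step: the previous output is an extra input."""
    return math.tanh(wx * x + wy * y_prev + wb)

def rnn_forward(xs, wx=1.0, wy=0.5, wb=0.0):
    y = 0.0        # initial state before any input
    ys = []
    for x in xs:   # y_t = n(y_{t-1}, x_t), unrolled over the sequence
        y = rnn_step(x, y, wx, wy, wb)
        ys.append(y)
    return ys

ys = rnn_forward([1.0, 0.0, 0.0])
print(ys)
```

The later outputs still depend on the first input, but the dependence shrinks at every step, which is the vanishing-gradient problem in miniature.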
38. Backward Propagation Through Time: unroll the recurrence over time steps (t = 0, t = 1, ...) and backpropagate through the unrolled network.
39. Backward Propagation Through Time
http://cpmarkchang.logdown.com/posts/278457-neural-network-recurrent-neural-network
40. Recurrent Neural Network
41. Vanishing Gradient Problem: gradients shrink as they are propagated back through many time steps, so a plain RNN struggles to learn long-range dependencies.
42. Figure: a Long-Short Term Memory cell: input x_t, stored memory value m (m_{out,t-1} carried forward to m_{out,t}), output y_t, and gates C_in, C_write, C_forget, C_read, C_out around the memory cell.
43. Long-Short Term Memory components: C_in (cell input), C_write (write gate), C_forget (forget gate), C_read (read gate), C_out (cell output).
44. Long-Short Term Memory: C_write controls how much of the new input is written into the memory cell.
45. Long-Short Term Memory: C_forget controls how much of the stored value is kept from one step to the next.
46. Long-Short Term Memory: C_read controls how much of the stored value is released as the cell's output.
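A sketch of one LSTM step using the slides' gate names; scalar weights, sigmoid gates, tanh squashing, and the parameter values are assumptions (real cells are vector-valued):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, m_prev, p):
    """One step of a single LSTM cell: C_write gates what enters the
    cell, C_forget gates what is kept, C_read gates what leaves it."""
    c_write  = sigmoid(p["ww"] * x + p["bw"])
    c_forget = sigmoid(p["wf"] * x + p["bf"])
    c_read   = sigmoid(p["wr"] * x + p["br"])
    candidate = math.tanh(p["wc"] * x + p["bc"])
    m = c_forget * m_prev + c_write * candidate  # memory cell update
    y = c_read * math.tanh(m)                    # gated output
    return m, y

# Hypothetical parameters: forget gate saturated open, write gate shut,
# so the cell simply remembers its previous value regardless of input.
p = {"ww": 0.0, "bw": -100.0, "wf": 0.0, "bf": 100.0,
     "wr": 0.0, "br": 0.0, "wc": 1.0, "bc": 0.0}
m, y = lstm_step(5.0, 2.0, p)
print(m)  # stays at 2.0
```

Because the stored value passes through the forget gate multiplicatively rather than through a squashing weight at every step, the gradient along the memory path does not vanish the way it does in a plain RNN.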
47. Training: Backward Propagation
http://www.felixgers.de/papers/phd.pdf
48. Long-Short Term Memory
https://class.coursera.org/neuralnets-2012-001/lecture/95
49. Neural Turing Machine: a controller mediates between Input/Output and an external Memory through Read/Write Heads.
50. Memory: a matrix of n memory addresses (rows i = 0..n), each holding a memory block of length m (entries j = 0..m).
51. Read Operation: the head location is a weighting over addresses (e.g. 0, 0, 0, 0.9, 0.1); the read vector is the weighted sum of the memory blocks (in the slide's example, 1.1, 1.0, 2.2).
52. Erase Operation: each block is scaled elementwise by the erase vector e under the head weighting w: M'(i)(j) = M(i)(j) * (1 - w(i) * e(j)).
53. Add Operation: the add vector a is added under the head weighting: M'(i)(j) = M(i)(j) + w(i) * a(j).
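The three memory operations can be sketched over plain lists; the 3x3 memory contents below are illustrative, while the 0.9/0.1 head weighting mirrors the slides:

```python
def ntm_read(memory, w):
    """Read: the read vector is the w-weighted sum of the memory blocks."""
    return [sum(w[i] * row[j] for i, row in enumerate(memory))
            for j in range(len(memory[0]))]

def ntm_erase(memory, w, e):
    """Erase: M[i][j] <- M[i][j] * (1 - w[i] * e[j])."""
    return [[memory[i][j] * (1 - w[i] * e[j]) for j in range(len(e))]
            for i in range(len(memory))]

def ntm_add(memory, w, a):
    """Add: M[i][j] <- M[i][j] + w[i] * a[j]."""
    return [[memory[i][j] + w[i] * a[j] for j in range(len(a))]
            for i in range(len(memory))]

M = [[1, 1, 2], [2, 1, 4], [4, 2, 1]]   # illustrative 3x3 memory
w = [0.9, 0.1, 0.0]                     # head location: mostly block 0
r = ntm_read(M, w)                      # close to [1.1, 1.0, 2.2]
M = ntm_erase(M, w, e=[1.0, 0.0, 0.0])  # fade out the first column under w
M = ntm_add(M, w, a=[0.0, 0.0, 1.0])    # blend 1.0 into the last column under w
```

Because the head is a soft weighting rather than a hard address, every operation is differentiable and the whole machine can be trained by gradient descent.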
54. Controller: takes the machine input together with the previous read vector and head location; outputs the add vector, erase vector, memory key, and the addressing parameters: the content addressing parameter (key strength), interpolation parameter, convolutional shift parameter, and sharpening parameter.
55. Figure: the addressing pipeline: the controller outputs (memory key plus the four parameters) pass through Content Addressing -> Interpolation -> Convolutional Shift -> Sharpening, combining the previous head location and the memory state into the new head location.
56. Content Addressing: compare the memory key against every memory block and normalize the similarities into a weighting over addresses (an uninformative key gives a near-uniform weighting such as .16 everywhere).
57. Interpolation: blend the content-addressed weighting with the previous head location using a gate g: w = g * w_content + (1 - g) * w_prev.
58. Convolutional Shift: circularly convolve the weighting with a shift distribution s over offsets (-1, 0, 1): s = (0, 1, 0) leaves it unchanged, s = (1, 0, 0) or (0, 0, 1) rotates it by one address, and s = (.5, 0, .5) spreads it across both neighbours.
59. Sharpening: raise each weight to the power gamma and renormalize, w(i) = w(i)^gamma / sum_j w(j)^gamma; gamma > 1 concentrates the weighting, while a uniform weighting (all .16) stays uniform.
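The four addressing stages, sketched end to end; cosine similarity with a key strength beta follows the NTM paper, while the memory contents, key, and parameter values (beta, g, the shift kernel, gamma) are illustrative:

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv + 1e-9)

def content_addressing(memory, key, beta):
    """Softmax of the key-strength-scaled similarity to each block."""
    scores = [math.exp(beta * cosine(row, key)) for row in memory]
    total = sum(scores)
    return [s / total for s in scores]

def interpolate(w_content, w_prev, g):
    """Blend the content weighting with the previous head location."""
    return [g * a + (1 - g) * b for a, b in zip(w_content, w_prev)]

def conv_shift(w, s):
    """Circular convolution with a shift distribution over offsets -1, 0, +1."""
    n = len(w)
    return [sum(w[(i - k) % n] * sk for k, sk in zip((-1, 0, 1), s))
            for i in range(n)]

def sharpen(w, gamma):
    """Raise each weight to the power gamma and renormalize."""
    p = [x ** gamma for x in w]
    total = sum(p)
    return [x / total for x in p]

memory = [[1, 1, 2], [2, 1, 4], [4, 2, 1], [0, 0, 1]]
w_prev = [0.0, 0.0, 0.9, 0.1]
wc = content_addressing(memory, key=[4, 2, 1], beta=5.0)        # focuses block 2
w = sharpen(conv_shift(interpolate(wc, w_prev, g=0.5), (0, 0, 1)), gamma=2.0)
```

Here the key matches block 2, interpolation keeps it there, the (0, 0, 1) kernel rotates the focus one address along to block 3, and sharpening makes the result nearly one-hot.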
60. Neural Turing Machine Implementation
http://awawfumin.blogspot.tw/2015/03/neural-turing-machines-implementation.html
61. Experiment: Repeat Copy https://github.com/fumin/ntm
62. Evolution of Recurrent Neural Networks: Recurrent Neural Network -> Long Short Term Memory -> Neural Turing Machine
63. Neural Turing Machine
64. References:
Logistic Regression: http://cpmarkchang.logdown.com/posts/189069-logisti-regression-model
Overfitting and Regularization: http://cpmarkchang.logdown.com/posts/193261-machine-learning-overfitting-and-regularization
Model Selection: http://cpmarkchang.logdown.com/posts/193914-machine-learning-model-selection
Neural Network Backward Propagation: http://cpmarkchang.logdown.com/posts/277349-neural-network-backward-propagation
Recurrent Neural Network: http://cpmarkchang.logdown.com/posts/278457-neural-network-recurrent-neural-network
Long Short Term Memory: http://deeplearning.cs.cmu.edu/pdfs/Hochreiter97_lstm.pdf and http://www.felixgers.de/papers/phd.pdf
Neural Turing Machine: http://arxiv.org/pdf/1410.5401.pdf and http://awawfumin.blogspot.tw/2015/03/neural-turing-machines-implementation.html
65. Courses:
https://www.coursera.org/course/ntumlone
https://www.coursera.org/course/ntumltwo
https://www.youtube.com/playlist?list=PL6Xpj9I5qXYEcOhn7TqghAJ6NAPrNmUBH
https://www.coursera.org/course/neuralnets
66. https://github.com/fumin/ntm
67. Mark Chang: facebook https://www.facebook.com/ckmarkoh.chang, Github http://github.com/ckmarkoh, Blog http://cpmarkchang.logdown.com, email ckmarkoh at gmail.com. Fumin: Github https://github.com/fumin, email awawfumin at gmail.com