
Speedy Alternatives to Back Propagation


John Moody and Chris Darken. Departments of Computer Science and Physics, Yale University, P.O. Box 2158 Yale Station, New Haven, CT 06520 USA. Arpanet: [email protected]; Bitnet: [email protected].

We propose three new neurally-inspired learning algorithms which offer much greater speed and greater biological plausibility than Back Propagation. These algorithms include "Learning With Receptive Fields", a new Self-Organizing Associative Memory, and a new variant of the Cerebellar Model Articulation Controller (CMAC). These new algorithms share one critical feature in common: they utilize only one layer of internal units. Furthermore, the Self-Organizing Associative Memory and the CMAC models require supervised learning of only the output weights. These features result in increased speed.
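The abstract does not give the model equations, so the following is only a minimal Python sketch of the kind of architecture it describes: a single layer of Gaussian receptive-field units whose output weights (and only those) are trained by a supervised LMS rule. All names, the grid placement of centers, and the parameter values are illustrative assumptions, not the authors' implementation.

    import numpy as np

    class ReceptiveFieldNetwork:
        """One hidden layer of Gaussian receptive fields; only the
        output weights are trained (supervised LMS). Illustrative
        sketch only -- not the authors' implementation."""

        def __init__(self, centers, width, lr=0.1):
            self.centers = np.asarray(centers, dtype=float)  # receptive-field centers
            self.width = width                               # shared Gaussian width (assumed)
            self.w = np.zeros(len(self.centers))             # trainable output weights
            self.lr = lr

        def _activations(self, x):
            # Gaussian response of each internal unit to input x
            d2 = np.sum((self.centers - x) ** 2, axis=1)
            return np.exp(-d2 / (2.0 * self.width ** 2))

        def predict(self, x):
            return self.w @ self._activations(x)

        def train_step(self, x, target):
            a = self._activations(x)
            err = target - self.w @ a
            self.w += self.lr * err * a  # only the output layer is adapted
            return err

    # Hypothetical usage: learn y = sin(2*pi*x) on [0, 1]
    rng = np.random.default_rng(0)
    net = ReceptiveFieldNetwork(np.linspace(0, 1, 10).reshape(-1, 1), width=0.1)
    for _ in range(2000):
        x = rng.random(1)
        net.train_step(x, np.sin(2 * np.pi * x[0]))

Because the error never has to be propagated back through the internal layer, each update costs a single pass over the unit activations, which is consistent with the speed advantage claimed above. In practice the centers would be placed by an unsupervised procedure (e.g. clustering the inputs); the fixed grid here is only for brevity.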

In detailed comparisons to Back Propagation for the problem of predicting Chaotic Time Series, these new algorithms learn as much as one thousand times faster while achieving comparable prediction capability on test data. Back Propagation, however, achieves its performance with a smaller set of training data. These algorithms are likely to provide similar speed increases in other problem domains.

The receptive field and self-organizing models are in principle implementable as analog dynamical systems; we discuss the dynamics of such systems. The CMAC can be implemented purely digitally or as a hybrid digital/analog system.
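The abstract does not spell out the new CMAC variant itself, so as a point of reference here is a minimal Python sketch of a conventional Albus-style CMAC for a one-dimensional input, again with illustrative names and constants: several offset tilings quantize the input, each active tile holds one weight, and only those output weights are adapted.

    import numpy as np

    class CMAC:
        """Minimal conventional CMAC sketch (not the paper's variant):
        overlapping tilings quantize the input; each active tile owns
        one trainable output weight, adapted by supervised LMS."""

        def __init__(self, n_tilings=8, n_bins=32, lo=0.0, hi=1.0, lr=0.1):
            self.n_tilings, self.n_bins = n_tilings, n_bins
            self.lo, self.hi = lo, hi
            self.w = np.zeros((n_tilings, n_bins))  # one weight per tile
            self.lr = lr / n_tilings                # spread the update over active tiles

        def _active_tiles(self, x):
            # each tiling is shifted by a fraction of one bin width
            bin_width = (self.hi - self.lo) / self.n_bins
            tiles = []
            for t in range(self.n_tilings):
                offset = (t / self.n_tilings) * bin_width
                b = int((x - self.lo + offset) / bin_width)
                tiles.append(min(max(b, 0), self.n_bins - 1))
            return tiles

        def predict(self, x):
            return sum(self.w[t, b] for t, b in enumerate(self._active_tiles(x)))

        def train_step(self, x, target):
            err = target - self.predict(x)
            for t, b in enumerate(self._active_tiles(x)):
                self.w[t, b] += self.lr * err  # only output weights change
            return err

Since an input activates exactly n_tilings table entries, both prediction and learning reduce to table lookups plus a handful of additions, which is why a purely digital (or hybrid digital/analog) realization is straightforward.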
