Intro to Neural Networks
Supervised Learning: Perceptrons and Backpropagation
Neural Network == Connectionism == Parallel Distributed Processing (PDP)
Neural Networks assume intelligence is emergent
1943 - McCulloch-Pitts Artificial Neuron
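The McCulloch-Pitts unit can be sketched in a few lines: a binary threshold neuron that fires when the weighted sum of its inputs reaches a threshold. This is a minimal illustrative sketch; the function name, weights, and threshold values are not from the slides.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Binary threshold unit: outputs 1 iff the weighted input sum
    meets the threshold, otherwise 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With weights (1, 1) and threshold 2 the unit computes logical AND:
print(mcculloch_pitts((1, 1), (1, 1), 2))  # 1
print(mcculloch_pitts((1, 0), (1, 1), 2))  # 0
```

With fixed weights and thresholds such units compute Boolean functions; learning the weights is what the perceptron adds next.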
Perceptron Learning 1958
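The 1958 perceptron rule (restated later in the deck as Δw = c (desired − sign(actual)) x) can be sketched as a short training loop. The learning rate, epoch count, and the OR example are illustrative assumptions, not from the slides.

```python
def sign(x):
    return 1 if x >= 0 else -1

def train_perceptron(data, c=0.1, epochs=20):
    """data: pairs (inputs, desired) with desired in {-1, +1}.
    Applies the rule dw = c * (desired - sign(actual)) * x,
    treating the bias as a weight on a constant input of 1."""
    w = [0.0] * (len(data[0][0]) + 1)  # last entry is the bias weight
    for _ in range(epochs):
        for x, desired in data:
            xb = list(x) + [1.0]
            actual = sum(wi * xi for wi, xi in zip(w, xb))
            w = [wi + c * (desired - sign(actual)) * xi for wi, xi in zip(w, xb)]
    return w

# Logical OR is linearly separable, so the rule converges:
or_data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = train_perceptron(or_data)
predictions = [
    sign(sum(wi * xi for wi, xi in zip(w, list(x) + [1.0]))) for x, _ in or_data
]
print(predictions)  # [-1, 1, 1, 1]
```

The update only fires when the prediction is wrong, and for linearly separable data the loop provably converges.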
Linear Separability Problem 1965
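The separability problem is usually made concrete with XOR: no single line (weight vector plus threshold) classifies all four XOR points correctly, so a lone perceptron cannot learn it. A minimal sketch, exhaustively checking a grid of candidate lines (the grid bounds and step are illustrative choices):

```python
def sign(x):
    return 1 if x >= 0 else -1

# XOR targets in {-1, +1}: no line can separate (0,1), (1,0)
# from (0,0), (1,1).
xor_data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]

def errors(w, data):
    """Count misclassifications of a linear threshold unit; w[-1] is the bias."""
    return sum(
        1
        for x, desired in data
        if sign(sum(wi * xi for wi, xi in zip(w, list(x) + [1.0]))) != desired
    )

# Every candidate line on the grid misclassifies at least one XOR example:
grid = [i / 2 for i in range(-8, 9)]  # -4.0 .. 4.0 in steps of 0.5
best = min(errors((w1, w2, b), xor_data) for w1 in grid for w2 in grid for b in grid)
print(best >= 1)  # True
```

A grid search is of course no proof, but the geometric argument is: any separating line would need (0,0) and (1,1) on one side and (0,1) and (1,0) on the other, which is impossible for a straight line.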
Backpropagation
Used to train multilayer feedforward networks
Assumes a continuous activation function
Delta rule
Backpropagation - Activation
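The requirement of a continuous activation function is what makes the delta rule possible: backprop needs the activation's derivative. The slides do not name the activation, but the (1 − output²) factor in the error formulas that follow is the derivative of tanh, so a tanh sketch fits:

```python
import math

def activation(x):
    """Continuous, differentiable activation (tanh assumed here)."""
    return math.tanh(x)

def activation_derivative(output):
    """Derivative expressed in terms of the unit's output:
    d/dx tanh(x) = 1 - tanh(x)^2 = 1 - output^2,
    i.e. the (1 - output_j^2) factor in the delta-rule error formulas."""
    return 1.0 - output ** 2

out = activation(0.5)
print(round(activation_derivative(out), 4))  # 0.7864
```

A step function like sign() has no useful derivative, which is exactly why the perceptron rule does not extend to hidden layers.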
Backpropagation Delta rule
Perceptron update rule was:
Δw = c (desired − sign(actual)) x
Backprop update rule is:
Δw = c (error) x
Backpropagation Delta rule
Error of an output node:
error_j = (1 − output_j²)(desired_j − actual_j)
Error of a hidden node:
error_i = (1 − output_i²) Σ_j (error_j w_ij)
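Taken together, the delta-rule formulas amount to one training step for a multilayer network. A minimal sketch for a tiny 2-2-1 network with tanh units, assuming a single output and illustrative starting weights (none of these specifics are from the slides):

```python
import math

def forward(x, w_hidden, w_out):
    """x includes a trailing bias input 1.0; returns hidden outputs and output."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(w, x))) for w in w_hidden]
    hb = hidden + [1.0]  # bias input to the output unit
    out = math.tanh(sum(wi * hi for wi, hi in zip(w_out, hb)))
    return hidden, out

def backprop_step(x, desired, w_hidden, w_out, c=0.1):
    """One delta-rule update using the slide formulas:
    error_j = (1 - output_j^2)(desired_j - actual_j)   # output node
    error_i = (1 - output_i^2) * sum_j error_j * w_ij  # hidden node
    dw      = c * error * input                        # weight change"""
    hidden, out = forward(x, w_hidden, w_out)
    hb = hidden + [1.0]
    err_out = (1 - out ** 2) * (desired - out)
    # Hidden errors: output error weighted back through w_out:
    err_hidden = [(1 - h ** 2) * err_out * w_out[i] for i, h in enumerate(hidden)]
    w_out = [wi + c * err_out * hi for wi, hi in zip(w_out, hb)]
    w_hidden = [
        [wi + c * err_hidden[i] * xi for wi, xi in zip(w, x)]
        for i, w in enumerate(w_hidden)
    ]
    return w_hidden, w_out

# One step on a single example shrinks that example's error:
x, desired = [1.0, 0.0, 1.0], 1.0  # two inputs plus bias input
w_hidden = [[0.2, -0.3, 0.1], [-0.1, 0.4, 0.2]]
w_out = [0.3, -0.2, 0.1]
_, before = forward(x, w_hidden, w_out)
w_hidden, w_out = backprop_step(x, desired, w_hidden, w_out)
_, after = forward(x, w_hidden, w_out)
print(abs(desired - after) < abs(desired - before))  # True
```

Repeating this step over a training set (e.g. the four XOR examples) is what lets multilayer networks solve problems a single perceptron cannot.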
Backpropagation demo
Inductive Bias
Encoding / Feature Extraction
# neurons used
# layers used
Output mapping
Domains
Classification
Pattern Recognition
Content Addressable Memory
Prediction
Optimization
Filtering
The good
Degrade gracefully
Solve ill-defined problems
Flexible
Generalization
The bad
Time & Memory
Black box
Trial and Error
When not to use a feedforward net
If you can draw a flow chart or formula
If a piece of hardware or software already exists that does what you want
If you want the functionality to evolve
Precise answers are required
The problem could be described in a lookup table
When to use a feedforward net
You can define a correct answer
You have a lot of training data with examples of right and wrong answers
You have lots of data but can't figure out how to map it to output
The problem is complex but solvable
The solution is fuzzy or might change slightly
Examples
Jonathan McCabe's Nervous States (2006): each pixel is the output state of a neural network given different inputs
Phillip Stearns, AANN: Artificial Analog Neural Network (2007)
Ted?