Neural Networks
Chris Sharkey
@shark2900
Types of Neurons
• Linear: simple neurons, computationally limited
• Binary Threshold: fixed output upon passing a threshold
• Rectifier: variable output upon passing a threshold
• Sigmoid: outputs a smooth, bounded function
• Stochastic Binary: outputs a smooth, bounded probability function
Linear Neuron
• simple and consequently computationally limited

y = b + Σᵢ xᵢwᵢ

where y is the output, b is the bias, xᵢ is the i-th input, and wᵢ is the weight on the i-th input. The sum runs over all incoming connections, each connection contributing the activity on the input neuron multiplied by the weight on the line.
Linear Neuron

y = b + Σᵢ xᵢwᵢ

• plotting the output y against the bias plus the weighted activity on the input lines produces a straight line through the origin
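The linear neuron above can be sketched in a few lines of Python (the function name and the example numbers are my own, not from the slides):

```python
def linear_neuron(inputs, weights, bias):
    """Linear neuron: output is the bias plus the weighted sum of the inputs."""
    return bias + sum(x * w for x, w in zip(inputs, weights))

# Two inputs with activities 1.0 and 2.0, weights 0.5 and -0.25, bias 0.1:
y = linear_neuron([1.0, 2.0], [0.5, -0.25], 0.1)
print(y)  # 0.1 + 1.0*0.5 + 2.0*(-0.25) = 0.1
```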
Binary Threshold Neurons
• compute a weighted sum of the inputs
• send out a fixed spike of activity if the weighted sum exceeds a threshold

z = Σᵢ xᵢwᵢ
y = 1 if z ≥ 𝜃, 0 otherwise

Equivalently, the threshold can be folded into a bias term:

z = b + Σᵢ xᵢwᵢ
y = 1 if z ≥ 0, 0 otherwise

where 𝜃 = −b.
Binary Threshold Neurons
• binary output: either a spike in activity or no activity
• the spike is like a truth value

[Plot: output (0 or 1) against weighted input; the step from 0 to 1 occurs at the threshold.]
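The biased form of the binary threshold neuron can be sketched as follows (a minimal example; the function name and numbers are mine):

```python
def binary_threshold_neuron(inputs, weights, bias):
    """Spike (1) if the biased weighted sum reaches zero, else stay silent (0)."""
    z = bias + sum(x * w for x, w in zip(inputs, weights))
    return 1 if z >= 0 else 0

# A threshold of theta = 1.0 becomes a bias of b = -theta = -1.0:
print(binary_threshold_neuron([1.0, 1.0], [0.4, 0.4], -1.0))  # 0 (0.8 < 1.0)
print(binary_threshold_neuron([1.0, 1.0], [0.6, 0.6], -1.0))  # 1 (1.2 >= 1.0)
```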
Rectifier Linear Neurons
• output zero until a threshold is passed
• once the threshold is passed, the output y equals the total input z

z = b + Σᵢ xᵢwᵢ
y = z if z > 0, 0 otherwise
Rectifier Linear Neurons
• keep the nice properties of linear systems above zero while allowing hard decision making at zero

[Plot: y against z; flat at 0 for z ≤ 0, a straight line of slope 1 above.]
Sigmoid Neurons
• give a more real-valued output
• the output is a smooth, bounded function of the total input

z = b + Σᵢ xᵢwᵢ
y = 1 / (1 + e⁻ᶻ)
Sigmoid Neurons
• the curve has nice derivatives
• nice derivatives make learning algorithms easier
• (more details in the next talk)

[Plot: y against z; an S-shaped curve crossing y = 0.5 at z = 0.]
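The sigmoid neuron and its "nice derivative" can be sketched as below. The closed form dy/dz = y(1 − y) is a standard property of the logistic function, not something stated on the slides; the function names are mine:

```python
import math

def sigmoid_neuron(inputs, weights, bias):
    """Smooth, bounded output y = 1 / (1 + e^-z) of the total input z."""
    z = bias + sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid_neuron([0.0], [1.0], 0.0))  # 0.5 at z = 0

# The "nice derivative": dy/dz = y * (1 - y), computable from y alone.
y = sigmoid_neuron([1.0, 2.0], [0.5, -0.25], 0.1)
dy_dz = y * (1.0 - y)
```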
Stochastic Binary Neurons
• use the same equation as sigmoid (logistic) neurons
• treat the output of the logistic function as the probability of producing a spike in a short window of time

z = b + Σᵢ xᵢwᵢ
p(s = 1) = 1 / (1 + e⁻ᶻ)
Stochastic Binary Neurons
• used the same way as logistic units, but the output is interpreted as a probability

[Plot: p against z; the same S-shaped curve, crossing p = 0.5 at z = 0.]
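The stochastic binary neuron can be sketched by sampling a spike from the logistic probability; over many short windows the observed spike rate should approach p. A minimal sketch (names and numbers are mine):

```python
import math
import random

def stochastic_binary_neuron(inputs, weights, bias, rng=random):
    """Spike with probability p(s=1) = 1 / (1 + e^-z); same logistic as the sigmoid."""
    z = bias + sum(x * w for x, w in zip(inputs, weights))
    p = 1.0 / (1.0 + math.exp(-z))
    return 1 if rng.random() < p else 0

# With z = 0, p = 0.5; the spike rate over many trials should be close to 0.5.
random.seed(0)
spikes = sum(stochastic_binary_neuron([1.0], [2.0], -2.0) for _ in range(10000))
```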
Questions?
Perceptrons
• first generation of neural networks
• a good first example of a neural network
• built from binary threshold neurons
• trained binary threshold neurons work as classifiers
• example abilities include pattern recognition
• popularized by Frank Rosenblatt in the 1960s

[Diagram: a perceptron with inputs x1 and x2, weights w1 and w2, and a bias b attached to a constant input of 1.]
Perceptrons
• learning procedure:
• add an extra component with value 1 to each input vector; this accounts for the bias value
• pick training cases using any policy that ensures every training case keeps getting picked
• begin by randomly assigning weights, then adjust the weights iteratively:
‣ if the output unit is correct, do not change the weights
‣ if the output unit is incorrect and the output is 0, add the input vector to the weight vector
‣ if the output unit is incorrect and the output is 1, subtract the input vector from the weight vector
• stop when a set of weights that correctly classifies all training cases is found
• assuming such a set of weights exists
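The learning procedure above can be sketched as follows. This is a minimal version with my own names and an epoch cap for the case where no separating weights exist; the AND example is an assumption, chosen because it is linearly separable:

```python
import random

def train_perceptron(cases, n_epochs=100):
    """Perceptron learning as on the slide: cases are (input_vector, target)
    pairs with target 0 or 1; a constant 1 is appended for the bias."""
    augmented = [(x + [1.0], t) for x, t in cases]  # extra component for the bias
    w = [random.uniform(-1.0, 1.0) for _ in range(len(augmented[0][0]))]
    for _ in range(n_epochs):
        errors = 0
        for x, target in augmented:
            z = sum(xi * wi for xi, wi in zip(x, w))
            output = 1 if z >= 0 else 0
            if output == target:
                continue          # correct: leave the weights alone
            errors += 1
            if output == 0:       # should have spiked: add the input vector
                w = [wi + xi for wi, xi in zip(w, x)]
            else:                 # should have stayed silent: subtract it
                w = [wi - xi for wi, xi in zip(w, x)]
        if errors == 0:           # every training case classified correctly
            return w
    return w

# Learn logical AND, which is linearly separable:
random.seed(1)
cases = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w = train_perceptron(cases)
```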
Perceptrons
• Weight space

[Diagrams: weight space, showing input vectors with correct answer (=1) alongside examples of good and bad weight vectors.]
Limitations of Perceptrons: the flaws and advantages of perceptrons
What is next? Types of network architectures.
Thank you